Reassembly Guide

A Guide to Shader Modding of Reassembly

Overview

This is a guide for those who know nothing about 3D graphics, OpenGL, or shaders but want to tinker with the shaders in Reassembly. It assumes you at least took a high school geometry class and know what x,y,z coordinates are, can find file locations on your drive, can edit text files, and that’s about it. Knowing how to create other Reassembly mods will be helpful, and knowing any programming language will be a plus too. But it generally assumes very little knowledge and assumes no knowledge of OpenGL. It will only focus on the simplest possible modifications like changing numbers. If you want to do anything more complicated then you’ll need to learn more about how shaders work, but this will get you started.

What are shaders?

To put it simply, shaders are tiny programs that run on your GPU instead of your CPU. (Technically they can run on your CPU but that would be incredibly slow, so that pretty much doesn’t happen in modern times.)

You don’t need to fully understand this diagram to get the idea.

Shaders are the code that runs on the GPU, and they do everything the GPU does; everything else runs on the CPU. The CPU can offload a lot of graphics processing to the GPU by placing stuff like models, textures, and shader programs in the GPU's memory and telling it to go.

Your CPU probably has 4 processor cores, maybe 2 if you’re using a lower-end computer or maybe even 6 or 8 if you spent lots of money on a really top of the line server CPU. But if you have a gaming system your GPU most likely has over 1,000 cores! Even if you have an older nVidia GeForce GTX 770, your GPU has 1,536 of them. This means it can run 1,536 copies of a shader program at the same time. Not surprisingly, this equals speed, especially compared to only being able to run 4 at one time.

If you’re using something like a laptop or an “office” type desktop that only has an Intel HD Graphics (Integrated) GPU, then you probably only have between 32 and 128 of these cores, maybe 192 or 384 if you have a rather new system. That’s why laptops and office PCs can’t handle large 3D games so well unless they also have a “real” GPU. But still, being able to run 128 copies of a shader program at the same time is much faster than what your CPU could do by itself, only running 4 or so copies at the same time.

What do shaders do?

Shaders actually do more than just shading things. The term is kind of historical because originally shaders only shaded objects to simulate lighting, etc., and pretty much only converted 3D objects into 2D pixels for display on a monitor. These days a shader is any program that runs on the GPU itself.

Shaders get input values (like x,y,z coordinates, colors, and even other stuff like the current frame number, time, or fixed parameters) and produce output values (like different x,y,z coordinates or colors).

Shaders now actually do all kinds of stuff like rotate 3D objects and scenes, or even turn one triangle into lots of smaller triangles to round off a surface and make it look more realistic. But fortunately we don’t have to worry about the latest and greatest fancy stuff that modern GPUs can do because this is a 2D game that doesn’t need anything too fancy.

Here’s a diagram that shows the rendering pipeline of core OpenGL 3.3 to give you a better idea of what shaders do.

Vertex data is just a list of x,y,z coordinates for the vertices and possibly a color for each one. The blue boxes are where shader code can be inserted, and the gray boxes are “fixed functionality” (at least in core OpenGL 3.3). Notice that before the Rasterization stage all of the data is stored as mathematical points and lines and we only get this converted into pixels after Rasterization.

More details below.

There are only two types of shaders used by Reassembly:

  • Vertex Shaders – A vertex is a point at the corner of a triangle or other polygon, just like geometry class says, or even just a lone point. Unlike geometry class though, for computer graphics vertices have two main attributes: position (x,y,z) and color. Vertex shaders can change either of these, but because this is a 2D game we probably don’t want to change their positions or players will get really confused. If we change anything here, it should be something about the color of the vertex. You might be wondering what sense it makes for a single point to have a color. Well, in OpenGL at least, if you want a whole triangle, for example, to be red, then you set all three of its vertices to red. If you set its vertices to different colors then you’ll get something like a gradient fill across the triangle. This made more sense in the olden days when you didn’t have programmable shaders and a sphere looked like a many-sided polyhedron of gradient-filled triangles. In short, vertex shaders modify only vertices, but this can affect things like the fill color of a polygon. They can also rotate, scale, or move polygons around by moving their vertices around.
  • Fragment Shaders – A fragment is all of the information needed to render a single 2D pixel. While a pixel only has an x,y coordinate and a color, a fragment has a z coordinate (it can be in front of or behind other fragments) and can have other attributes, but if you just think “pixel” when you see the word fragment then things will make sense. Really a fragment is a pixel-sized chunk of a 3D scene that will eventually get turned into a 2D pixel (or possibly discarded if it turns out that it’s covered up by another fragment). While a triangle has only 3 vertices (one for each corner), it may have hundreds or thousands of fragments because it will get turned into hundreds or thousands of pixels (depending on how big it is) for display on our 2D monitor screen. Good thing you probably have over 1,000 shader cores on your GPU, because the fragment shader will be executed once for every single one of these pixel-sized fragments! On a 1920×1080 monitor you have 2,073,600 pixels, and a shader may need to run for each one. If you weren’t running 1,000 or more shader copies at a time then your FPS would be intolerable. Anyway, for our purposes we mostly care about setting the color and transparency of fragments to do things like make special effects brighter, darker, or tinted, or even create weird patterns. So in short, fragment shaders modify fragments, and because there are many more fragments, fragment shaders can produce much more detailed effects.

Even though this is a 2D game that doesn’t do 3D-style shading, just to explain further I’ll walk through an example of a shaded 3D sphere (one with a glowing red grid and a shiny specular highlight) and explain how it would be created by a vertex shader and fragment shader:

First, a vertex shader was run on each vertex of the sphere to move it around and scale it to match the camera angle and stuff like that. (All of the vertices in the sphere have the same color and were not changed.) Then, for every pixel (fragment) the fragment shader was executed to figure out what color each pixel should be based on a number of factors. Some fragments that fall on the glowing red grid were just told that their color should be a bright red and that was it. Other fragments had their normal (direction that the surface is facing) compared to the angle of the light to determine how brightly lit they should look. And they also had their normal compared to the angle of the light and the angle of the camera to determine if they should be affected by that shiny spot (specular reflection or “glare”) and how that should affect their color. Fragments for the other side of the sphere were discarded or never even generated because it was determined that they weren’t visible (because they’re covered by other opaque fragments). No texture is mapped onto the sphere, so this shading is all just determined by math in the fragment shader.

Fortunately we don’t need to do anything this elaborate for Reassembly because 2D games, even when they use 3D graphics systems (which most now do, because it’s faster than doing things on the CPU), simply use quads (rectangles or squares) or tris (triangles) and place them on the same flat 2D plane with the “camera” facing down on it. So we have nothing but simple polygons like quads all facing directly “up” at us like a bunch of cut up pieces of construction paper lying on a table in a kindergarten class. The closest we get to 3D is the need to keep track of which pieces of paper are lying on top of which other pieces of paper.

Games that use bitmap images for sprites (such as game characters etc) just create quads, then map a 2D texture (with transparent background) onto the quad. Mostly x,y coordinates are used but the vertices of polygons are still given a z coordinate only to indicate what should be drawn on top of what. (Background scenery objects would have negative z coordinates, and things in the foreground like characters and mobs would have zero or positive z coordinates since they should be drawn on top of the background at least.)

Shaders in Reassembly

Reassembly uses a graphics system called OpenGL. (This is like DirectX, but it’s an open standard that isn’t vendor-locked to Microsoft.) This is important because it tells us that shaders will be written in GLSL (OpenGL Shading Language).

Luckily we don’t need to know the whole GLSL language because we’re just going to copy a shader and make some really simple modifications like multiplying things to make them bigger, brighter, or change colors around.

Reassembly mostly uses vector graphics rather than bitmaps. So most things in Reassembly are filled polygons where the vertices have x,y coordinates and use z coordinates just to indicate what should be drawn on top of what.

Some things like the shields are quads with a transparent sphere texture mapped to them and a positive z coordinate to indicate that they should be drawn on top of other things (like the ship that’s generating them). Other things like ship blocks are just simple polygons, then shaders are used to create special effects like the shimmering block fill effect by varying the vertex colors somewhat.

Luckily we can do some interesting things without understanding too much of this.

The shaders are mostly located in a file called shaders.lua. This is the file we need to copy and modify.

Setup for Shader Mods

The first thing you need to do is create a directory for the new mod. You’ll know how to do this already if you’ve created other mods; this is the same, just with a different file in the mod directory.

More details on the directory setup for mods can be found here[reassembly.wikia.com].

  1. Make sure the game isn’t running.
  2. Depending on your OS go to:
    • In Windows: C:\Users\[your username]\Saved Games\Reassembly
    • In MacOS: /Users/<You>/Library/Application Support/Reassembly/
    • In Linux: /home/<You>/.local/share/Reassembly/

    and create a directory called mods.

  3. Inside that directory, create another directory using the name of your mod. The example I’ll be using makes stars larger and turns them all red. So name the mod “Bloody Stars” or something similar.
  4. Now, find the Reassembly game directory. It may differ depending on which Steam library you installed Reassembly in, but it should be something like C:\Program Files (x86)\Steam\SteamApps\common\Reassembly\data. (Again this differs for MacOS and Linux but I’ll find the info and add it later.)
  5. Copy the shaders.lua file to the “Bloody Stars” directory you created above. Make sure to copy it and not move it!
  6. Use a text editor like Notepad++ to open your mod’s copy of shaders.lua. Now we get to the editing.

cvars

Note that in addition to modifying shaders there are also some things you can do simply by changing some cvars. Look through C:\Users\[your username]\Saved Games\Reassembly\data\cvars.txt to find them. Here’s an incomplete list of the ones related to drawing operations:

# Shader/Glow/Halo-related cvars
# kBeamGlow = 2
# kBeamHalo = 8
### kBloomBlocks: bool: Enable bloom for blocks
# kBloomBlocks = 1
# kBloomBlurRadius = 3
# kBloomBrightness = 1.5
### kBloomIntensity: float: Bloom blurry fraction
# kBloomIntensity = 0
### kBloomRadius: int: Radius of bloom blur
# kBloomRadius = 32
### kBloomResFactor: float: Effects render target size as fraction of screen pixels (low blurLevel only)
# kBloomResFactor = 0.75
### kBloomScale: float: Bloom render target size as fraction of screen pixels
# kBloomScale = 2
# kBloomTonemap = 1
# kBlurFactor = 0.5
# kBlurMenuRadius = 15
# kBlurMinDepth = 150
# kCleanBackground = 0
# kCommandHaloDeadliness = 4
kCommandHaloSize = {10,1}
# kGPUDebugEnable = 0
# kOpenGLDebug = 0
# kParticleCountMax = 524288
# kParticleExplosionAlpha = 0.2
# kParticleExplosionColor0 = 0xff5500
# kParticleExplosionColor1 = 0xee2200
# kParticleExplosionTime = {0.3,0.7}
# kParticleFireAlpha = 0
# kParticleFireColor0 = 0xff6500
# kParticleFireColor1 = 0xee1200
# kParticleFireRate = 45
# kParticleFireSize = 1.75
# kParticleFireTime = 0.5
# kParticleSmokeAlpha = 1
# kParticleSmokeColor = 0x80202020
# kParticleSmokeFireArcRange = {100,1000}
# kParticleSmokeFireTime = {0.7,1.2}
# kParticleSmokeFireVelocity_MeanStdev = {400,200}
# kParticleSmokeRate = 30
# kParticleSmokeSize = 3
# kParticleSmokeTime = 3
# kProjectileGlow = {7,3}
# kProjectileHalo = {15,13}
# kProjectileParticleAlpha = 0.75
# kProjectileParticleSizeCoef = 2
# kProjectileParticleSparkleEvery = 0.1
# kProjectileParticleThreshold = 2
# kProjectileParticleTimeCoef = 0.25
# kProjectileParticleVelocityCoef = 0.75
# kResourcePickupParticleCount = 3
# kResourcePickupParticleSize = 10
# kResourcePickupParticleTime = 1
# kWorleyColorRadius = 1500
# kWorleyMaxIterations = 1
# kWorleyShaderCount = 4
# kWorleyShaderMaxPoints = 32

shaders.lua File Format

Before doing anything else, let’s talk about the format of the file so you know what you’re looking at.

The lines starting with -- are comments that are not parsed when the file is read.

The file consists of a list of named items starting with { and ending with a } at the end of the file. It basically parses like { ShaderName = { shader stuff }, AnotherShaderName = { stuff }, … }.

Each named shader entry (like ShaderStars) consists of a list of three strings like this:

ShaderName = { "string1", "string2", "string3" }

The first string (string1) is “COMMON_HEADER” like it says in the comment at the top. You don’t need to worry too much about this, but just to explain, this gets appended to the top of both string2 and string3. You shouldn’t need to ever edit this.

The second string (string2) is the Vertex Shader part of that particular shader program (a vertex shader + a fragment shader = a shader program).

The third string (string3) is the Fragment Shader part of the shader program.

Each of these strings begins and ends with a double quote (") and contains GLSL code. The strings are separated by commas.

As you can see there are a lot of different shaders used for different things. Yes, all of these strings are separate little computer programs contained in quotation marks. These all get read by the game and sent to the GLSL compiler (which you don’t need to install because it’s actually part of your graphics drivers). But there’s no standard place to put GLSL code, so different games and programs store it in different locations. Some games store every program in a separate file, but Reassembly stores them all as different strings in this one file. They could have been stored in the C++ source code itself and compiled into the executable, but that would have made them nearly impossible for you to modify effectively.

Creating “Bloody Stars”

Note: We could actually do this whole thing just by changing cvars since it’s so simple and the cvars give us the needed options, but I need a simple example I can use to demonstrate shader modifications so we will modify the shader code instead.

We only want to modify one shader, so delete everything except for the first { in the file, the last } in the file, and everything in ShaderStars. After you do that, your whole shaders.lua file in your mod directory should look like this:
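In outline, the trimmed-down file has this shape. (The three GLSL strings must be copied verbatim from the game's shaders.lua; the "..." placeholders below just stand in for them.)

```
{
    ShaderStars = {
        "...string1: the COMMON_HEADER, copied unchanged...",
        "...string2: the ShaderStars vertex shader, copied unchanged...",
        "...string3: the ShaderStars fragment shader, copied unchanged..."
    },
}
```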

Now you have a shaders.lua file that is edited to only replace this one shader when the mod is loaded. That way your shader mod will be more compatible with other shader mods that only modify other shaders but not this one. (Shader mods that modify the same shader program, such as two mods that both modify ShaderStars, will be incompatible since one mod will overwrite the other when they’re loaded.)

Modifying the Shader

In this case we have a somewhat unusual shader that, rather than operating on a polygon consisting of multiple vertices, operates on a single vertex that is given a “size”. This has the effect of creating a square centered on the vertex which is nice and simple for things like stars and particle effects. This invisible square will act like a mini drawing canvas for our star, with the drawing part being done by the fragment shader.

According to the GLSL documentation, the length of the sides of this square is defined by the gl_PointSize variable, and I think it’s probably too small, so let’s make it bigger.

The second string here is the vertex shader, so let’s take a look at that. This line looks interesting since it sets gl_PointSize:

gl_PointSize = ToPixels * Size / (EyeZ - Position.z);

How does this calculation work? I don’t know and I don’t really need to care. All I know is that I want it to be bigger! So let’s just multiply the whole thing by 3. (A * means multiply in most computer languages, and this includes GLSL.) So change it to this:

gl_PointSize = 3 * ToPixels * Size / (EyeZ - Position.z);

Ok, so good there. Whatever size the game wanted the star to be, now it will be three times bigger!

Understanding the Math

I said I didn’t really care how this worked, and we don’t really need to know just to make everything three times bigger, but let’s break this little formula apart, since doing other more complex things will require some analysis and this will be educational.

ToPixels appears to be a scaling factor to convert between vector coordinate units and pixels. We don’t know exactly what this value is, but let’s just say that there are 5 pixels between vector coordinates 0.0 and 1.0. This value would be 5.0 in that case.

Size appears to be a star size in vector coordinates, and we need it converted to pixels, which is why Size is multiplied by ToPixels. If the size were 1 and ToPixels were 5 then that would convert the vector coordinate size of 1 to a pixel-defined size of 5. If Size were 3 then it would get converted to 15, etc.

EyeZ would appear to be the z coordinate of the camera. Away from the screen and toward the viewer is the positive z direction or +Z, and it would seem that the game is implementing the zoom level as higher +Z values for the “camera” (eye) when zoomed out (further away from the plane at z=0.0 where the ships are) and lower +Z values when zoomed in. So EyeZ should be some positive number indicating how many vector coordinate units the camera is above the plane with the ships at Z=0.

Position.z indicates the z coordinate of our vertex. Recall we just have a single vertex for stars and that Position is a vector. When you have a vec3, which is like a tuple of numbers, the first element can be referred to as Vectorname.x, the second element as Vectorname.y, and the third element as Vectorname.z. (Because vectors are also used for colors, the first element can also be referred to as .r, the second as .g, and the third as .b. More generic languages would have you refer to these as 0, 1, and 2, but since GLSL is graphics-specific and not a general programming language, it maps typical graphics programming variables like x, y, z, w and r, g, b, a to elements 0, 1, 2, and 3 in vectors.) Anyway, our stars can actually be different distances away in the background, away from the z=0 plane of the ships. This makes it easy to do the nice parallax scrolling effect for stars and changes their size to match the distance. Since they are behind the z=0 plane, they will have negative values aka be off in the -Z direction.

So EyeZ will be positive, and Position.z will be negative. So when we do EyeZ - Position.z we get something like 10 - (-10) = 10 + 10 = 20 (because subtracting a negative value is the same as adding the positive value), which would be the total distance between the camera and the particular star. So think of EyeZ - Position.z as distance from the camera.

So ToPixels * Size is the total pixel size of the star, then we divide that by EyeZ - Position.z, which is the distance that the star is away from us. Therefore we get proportionally smaller stars depending on how far away they are from the camera!

If we were to remove this math then stars wouldn’t get scaled to smaller sizes when they were further away, and zooming in wouldn’t make them bigger, which would look bad.
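To see the formula in action, here's a toy Python model of it. (The real values of ToPixels, Size, and EyeZ come from the game at runtime; these numbers are made up purely for demonstration.)

```python
# Illustrative model of the gl_PointSize formula from the vertex shader.
def point_size(to_pixels, size, eye_z, pos_z, scale=1.0):
    # scale is our "make it bigger" multiplier (1.0 = unmodified shader)
    return scale * to_pixels * size / (eye_z - pos_z)

# A star of vector size 1, camera 10 units above the z=0 plane, and the
# star 10 units behind it: distance = 10 - (-10) = 20.
base = point_size(5.0, 1.0, 10.0, -10.0)        # 5 * 1 / 20 = 0.25
tripled = point_size(5.0, 1.0, 10.0, -10.0, 3)  # three times bigger
print(base, tripled)  # 0.25 0.75
```

Note how doubling the distance (or halving EyeZ by zooming in) changes the result proportionally, which is exactly the perspective-scaling behavior described above.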

Back to Editing

The third string is the Fragment Shader. This line is interesting because (according to the GLSL docs) gl_FragColor determines the final color of the fragment/pixel:

gl_FragColor = DestinationColor * vec4(1.0, 1.0, 1.0, alpha);

Colors are expressed as R,G,B (Red, Green, Blue) or R,G,B,A (Red, Green, Blue, Alpha).

The vec4 is a GLSL type that indicates a vector with four elements. If you’ve taken at least pre-calculus classes then you’ll know what a vector is, but if not, just think of it as a list of numbers with a specific length.

DestinationColor appears to be the color of the star that is set by the game, but I don’t want this used. I just want all stars to be blood red. So I’m going to change this to remove the multiplication by the DestinationColor that the game wants and just force my own color. So change it to this:

gl_FragColor = vec4(1.0, 0.0, 0.0, alpha);

1.0 as the first value means “100% red”. If we wanted 50% red then we’d use 0.5.

The next two numbers are green and blue, and we set both of those to 0.0. It’s good practice to always write 0.0 here rather than just 0, because like C and C++, GLSL is a language that’s very picky about the type of a value. 0 and 1 are considered to be integer types, and 0.0 and 1.0 are considered to be floating point (float) types, even though the average human would say 1 and 1.0 are the same thing. Constructors like vec4() will usually convert integers for you, but depending on the GLSL version, putting an integer where a float is expected can get you an error called a type error. GLSL wants vectors to have floating point values because that’s what those shader cores are designed to use, so the safe habit is to write every number with a decimal point.

The last (4th) number is the alpha (transparency) value which is being set from a variable named “alpha”. If you look at the previous lines of code you may notice that this is being calculated based on the distance that the fragment is from the center of the square, with the alpha being lower (more transparent) the further the fragment is from the center. Remember, this fragment shader will actually run many times for each star, once for each pixel that makes up the star. So we’re telling it all pixels will be red, but letting it change the transparency of the pixels depending on how far the pixel is from the center. Fragments so far away that they have an alpha of 0.0 (fully transparent) are discarded. The result is a fuzzy circle, and this demonstrates how the fragment shader is used to “draw” something simple (like a fuzzy circle) into an otherwise invisible square.

Anyway, if you want, you can later change the color to blue or yellow (use 1.0 for both red and green for yellow), but for now just modify as shown above. You might even try to come up with your own math for alpha to create something more diamond-shaped (like separately calculating the distance from the x and y coordinates and adding each distance/2 together). Paying attention in geometry class will be useful here for giving you ideas.
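To make the falloff idea concrete, here's a toy Python model of per-fragment alpha inside the star's square. (The (x, y) offsets and the exact falloff curves are made up for illustration; the real shader's math may differ.)

```python
import math

# (x, y) is the fragment's offset from the square's center, roughly in
# a -1..1 range across the square.
def round_alpha(x, y):
    # Circular falloff: fully opaque at the center, fading to
    # transparent with distance -> a fuzzy circle.
    return max(0.0, 1.0 - math.sqrt(x * x + y * y))

def diamond_alpha(x, y):
    # Manhattan-distance falloff gives a diamond shape instead.
    return max(0.0, 1.0 - (abs(x) + abs(y)))

print(round_alpha(0.0, 0.0))    # 1.0 at the center
print(diamond_alpha(0.5, 0.5))  # 0.0 on the diamond's edge
```

Fragments whose alpha works out to 0.0 would be discarded, which is why the star ends up as a shape drawn inside an otherwise invisible square.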

Here’s what the file should look like after these edits:
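The screenshot isn't reproduced here, but the two lines you changed should now read like this inside the ShaderStars strings (everything else stays exactly as copied from the game):

```
// in the vertex shader (string2):
gl_PointSize = 3 * ToPixels * Size / (EyeZ - Position.z);

// in the fragment shader (string3):
gl_FragColor = vec4(1.0, 0.0, 0.0, alpha);
```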

Test Your Mod

Ok, so make sure you saved the shaders.lua file and run the game. You should see something like this:

Woohoo! A bunch of big evil red looking stars everywhere! That’s exactly what we were going for.

Now the only problem is we have no preview image for our mod, but we can create one rather easily. Take a screenshot, then use something like GIMP to crop out a square area and scale it to 200px by 200px. Then export this as a PNG file called “preview.png” and save it in your mods/Bloody Stars directory.

Quit Reassembly and load it again, then click on the Mods button. Your mod should now appear like this:

(In this case I just cropped out some red stars from that previous screenshot.)

Now you can easily publish it to the Workshop or whatever you want.

More About GLSL

Notice that Reassembly uses GLSL v1.2 which is an older version of GLSL, so things you read about GLSL v3.3 or later may not work in GLSL v1.2. Here is some more detailed information about GLSL v1.2:

If you want to learn more about GLSL shaders using the more modern GLSL v3.3, there’s an excellent tutorial here[learnopengl.com], but you don’t really need to know too much for simple changes. Here are some tips on what to try and what to avoid:

  • Most of the variable names are determined by the C++ code of the game, or they’re defined in the GLSL itself, and are not built-in special names in GLSL. As for what they do, your guess is as good as mine! If they don’t start with “gl_” then you pretty much have to guess what they’re for based on the name.
  • At the top of each shader are lines like “attribute vec4 SourceColor;” and these are names of variables supplied by the game. You can’t change these since they need to match what’s compiled into the game code! So don’t try or you’ll just break stuff. (“attribute” means that this is an input to the shader.)
  • You will also see some lines like “varying vec4 DestinationColor;” and these are the names of variables that are outputs from the Vertex shader that are interpolated inputs to the Fragment shader. I would recommend against trying to change these unless you fully understand GLSL and know what you’re doing.
  • There are some other lines that start with “uniform”. Don’t change these lines either; uniform means that they’re parameters from the game that will be the same for every vertex or fragment for that primitive or drawing operation, but they’re set by the game code so you can’t rename them etc or you’ll just break stuff.
  • Everything you can (possibly) change will be between the “void main(void) {” line and the closing “}”. This is the main shader function, which is why it’s called main(). Before it reaches the end, it always needs to set its outputs, like gl_Position, gl_FragColor, or other varyings.
  • You can define your own variables inside main().
  • Words like float, vec2, vec3, and vec4 indicate the type of a variable. float is floating point, and vec2, vec3, and vec4 indicate vectors of 2, 3, or 4 floats. So if you see “attribute float Size;” it means Size is an input from the game, and it’s a single floating point value. “attribute vec4 SourceColor;” means that SourceColor is an input to the shader from the game and it’s a vector of four floating point numbers. (Pretty much all numbers will be floating point.) Learn more about this from the GLSL tutorials above.
  • Variables can’t be set to anything that isn’t the same type so if DestinationColor is a vec4 then trying to do “DestinationColor = 1.0” will cause an error because 1.0 is just a single float, not a vector. But you could do “DestinationColor = vec4(1.0, 1.0, 1.0, 1.0)” because then the types match.
  • Lines starting with // are comments and are ignored. Often these are alternatives that manylegged decided not to use, or are old versions of calculating things that he replaced with newer ones.
  • Unless you really figure out all the math, mostly what you can do is multiply (*), divide (/), add (+), or subtract (-) various things.
  • Try changing numbers around to see what happens. You can always change things back.
  • For colors, vectors are always R,G,B or R,G,B,A. For vertices/points they’re always X,Y or X,Y,Z or X,Y,Z,W (don’t ask what W is, it’s complicated and involves perspective division which isn’t used in this game.)
  • Since R,G,B,A and X,Y,Z,W are standard names with standard ordering in computer graphics, you can refer to these specific values in a vector using the dot notation, that is if you have something like vec4 myvec = vec4(1.0, 2.0, 3.0, 4.0) then myvec.x or myvec.r will both refer to the first element which in this case is 1.0. myvec.y or myvec.g refer to the second element (2.0 here). Basically first = .x or .r, second = .y or .g, third = .z or .b, fourth = .w or .a. Whether you should refer to the first element, etc, using .x or .r depends entirely on whether the vector is meant to contain a coordinate or a color. (Doing something like mycolor.x = 1.0 will work to set the first value, but looks dumb since “x” is not a color component. If you do that then people will wonder what you’ve been smoking even if the GLSL compiler doesn’t care.)
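GLSL handles this component aliasing natively, but if it helps, here's a tiny Python model of the idea. (The Vec4 class is made up purely for illustration; it is not part of GLSL or the game.)

```python
# A Python stand-in for GLSL's vec4 component aliasing: .x/.r, .y/.g,
# .z/.b, .w/.a all map to elements 0-3 of the same vector.
class Vec4:
    _names = {"x": 0, "r": 0, "y": 1, "g": 1, "z": 2, "b": 2, "w": 3, "a": 3}

    def __init__(self, *vals):
        self.vals = list(vals)

    def __getattr__(self, name):
        # Called for attributes not found normally, i.e. the aliases.
        return self.vals[Vec4._names[name]]

myvec = Vec4(1.0, 2.0, 3.0, 4.0)
print(myvec.x, myvec.r)  # both 1.0 (same element, two names)
print(myvec.a)           # 4.0
```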

Hopefully this provides a good start. I will try to document more game specifics below.

Reference: Available Shaders in Reassembly

Here’s my attempt to create some reference information for each of the shaders used in the game. It’s not up to date with what is currently the “beta version”.

Implicit Parameters

There are some parameters that each shader gets even though they aren’t defined in the shader code.

  • attribute vec4 Position – The original position of the vertex before transformation. It could be used as a way to get different results for different vertices in the same polygon (see the example at the end of ShaderIridescent).
  • uniform mat4 Transform – This is a transformation matrix that will rotate, translate, and scale objects when it’s multiplied by a vertex vector. This would be used to scale everything to match the zoom level, for example. You will notice almost all vertex shaders contain the line
    gl_Position = Transform * Position;

    and this is where the positions of vertices are modified according to the transform matrix supplied by the game. Most vertex shaders need to contain this line, and if you modify it you will change the sizes/positions of things. You probably can’t change this without screwing things up so that collisions don’t work properly and such, so you probably want to leave that gl_Position assignment alone.

  • uniform float Time – This is really useful and can be used to cause things like colors and such to change over time. It’s the number of seconds since some time. Whether it’s time since the game started or the time since the century started doesn’t matter too much because you can always limit it to a specific range using something like
    mod(Time, 5)

    which will restrict it to a range of 0 <= x < 5, for example. You could also limit it to a range of -1 <= x <= 1 in a sinusoidal fashion simply by using

    cos(Time)

    Time might not be available in all shaders, I’m not sure. ShaderIridescent shows how it can be used, but it can probably be used elsewhere.
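To get a feel for those two range-limiting tricks, here they are modeled in plain Python. (GLSL's mod() behaves like Python's math.fmod for positive inputs; the Time value here is made up.)

```python
import math

# Two ways to tame an ever-growing Time value.
t = 1234567.89  # pretend "seconds since some epoch"

wrapped = math.fmod(t, 5.0)  # like GLSL mod(Time, 5): always in [0, 5)
wave = math.cos(t)           # always in [-1, 1], oscillating smoothly

print(0.0 <= wrapped < 5.0)  # True
print(-1.0 <= wave <= 1.0)   # True
```

The mod() version wraps around abruptly (a sawtooth), while cos() or sin() gives a smooth back-and-forth, which is usually what you want for pulsing color effects.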

ShaderIridescent

This is used to shade the blocks used for spaceships. Most of the work happens in the vertex shader; by default the fragment shader simply sets gl_FragColor (the pixel color) to DestinationColor. (Since DestinationColor is a varying it will get interpolated between the vertices before it is passed to the fragment shader, meaning that if the fragment shader just assigns it to gl_FragColor unmodified then it will result in a gradient-fill type effect if the vertices happen to have different colors.)

SourceColor0 and SourceColor1 seem to be two colors to be used for the shading that are based on the original block color. TimeA seems to be some sort of time offset, probably to make sure each block gets a slightly different shading value (otherwise they’d all “shimmer” in sync with each other). This part:

float val = 0.5 + 0.5 * sin(0.5 * (Time + TimeA));

sets val to a value between 0.0 and 1.0 (the output of sin() will be from -1 to +1, which is multiplied by 0.5 to give -0.5 to +0.5, and then 0.5 is added so you get 0.0 to 1.0). Then this

DestinationColor = mix(SourceColor0, SourceColor1, val);

uses the mix() function, which blends two colors according to the ratio in val. So if val is 0.0 only SourceColor0 is used, if it's 1.0 then only SourceColor1 is used, and if it's 0.5 then the result is 50% SourceColor0 and 50% SourceColor1. Values in between blend proportionally.
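If you're unsure what mix() does, here's the same calculation sketched in Python (the color and time values are made up for illustration; GLSL's mix(a, b, t) is just linear interpolation, a*(1-t) + b*t, applied per component):

```python
import math

def mix(a, b, t):
    """Linear interpolation between two colors, like GLSL's mix()."""
    return [x * (1.0 - t) + y * t for x, y in zip(a, b)]

source_color0 = [1.0, 0.0, 0.0, 1.0]  # red (RGBA)
source_color1 = [0.0, 0.0, 1.0, 1.0]  # blue (RGBA)

time, time_a = 2.0, 0.5  # made-up values for Time and TimeA
val = 0.5 + 0.5 * math.sin(0.5 * (time + time_a))
assert 0.0 <= val <= 1.0  # always lands between 0 and 1

# A 50/50 blend of red and blue gives purple:
print(mix(source_color0, source_color1, 0.5))  # → [0.5, 0.0, 0.5, 1.0]
```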

All the vertices get set to the same color, meaning the whole block or polygon will have the same fill color. If you wanted something like a gradient fill, you might try something like this to generate different values for different vertices:

float val = 0.5 + 0.5 * sin(0.5 * (Time + TimeA + Position.x + Position.y));

ShaderColorLuma

This is used for drawing projectiles, lasers, shields, and some other effects like halos. By default it doesn’t do much except multiply SourceColor by Luma in the vertex shader. You can use the z coordinate (Position.z) to differentiate between different types of effects like this:

void main(void) {
    float light;
    // z == 1.0 will primarily affect the command module.
    // Thanks to Uiharu_Kazari on Steam
    if (Position.z == 1.0 && SourceColor.a == 0.0)
        light = 0.2; // try to match shields
    else if (Position.z >= 1.0)
        light = 1.0;
    else
        light = 10.0; // massive glow on lasers and weapons
    DestinationColor = light * Luma * SourceColor;
    gl_Position = Transform * Position;
}

But you can probably come up with other creative ideas for this one too. I'm not sure exactly which draw modes are used for these effects; it probably varies between them.

ShaderStars

This is, not surprisingly, used to draw stars. Each star has only one vertex and is drawn using GL_POINTS mode. In this mode, points are drawn as squares with sides of length gl_PointSize, centered on the vertex located at Position (rather than having four vertices like a normal quad). The attributes Size and SourceColor are the size and color the game wants for the star, but you can change, scale, or disregard those as desired. ToPixels seems to be a scaling value to convert coordinates to pixels, and EyeZ seems to be the zoom level expressed as a z coordinate (stars should get smaller when zooming out).

Note that there are also some cvars that affect the drawing of stars:

# kStarColor = 0x90ffffff
# kStarColorVariation = 0.3
### kStarInvDensity: float[50000 – 36000000]
# kStarInvDensity = 2000000
### kStarLayers: vector<float3>: defines tiled starfield layers {size, max_depth, min_depth}
# kStarLayers = {{10007,3000,1000},{20149,7000,2000},{28697,20000,4000}}
# kStarNoiseScale = 0.001
### kStarSizeRange: float2[{4, 4} – {200, 200}]
# kStarSizeRange = {30,30}

ShaderWormhole

This is for drawing the wormholes. I haven't dug into this one much. Notice that it includes a file, noise3D.glsl, for some of the code. This is one of the more complex shaders, as it uses noise effects and rotation to get that whirlpool effect.

ShaderResource

Draws the little blobs of “R” that things drop. This one also contains some complex special effects and includes noise2D.glsl to help with that. I haven’t dug into the details of it yet.

ShaderColorDither

I'm really not sure how this one is being used, so your guess is as good as mine.

ShaderTexture

This is supposed to be a passthrough shader for drawing textures. I’m unsure exactly where it is used. Since it’s supposed to be passthrough it should probably just be left alone.

ShaderTextureWarp

Similar to the previous one but seems to use noise2D.glsl to apply some special effect to textures. I’m not sure where this is used.

ShaderTonemap

HDR-type postprocessing. There's a whole blog post about this here[www.anisopteragames.com]. The main point is to make overlapping or intensely colored particles turn white in their hottest regions so they look brighter and more intense.

ShaderHsv

This seems to exist for the sole purpose of converting HSV (Hue, Saturation, Value) colors to RGB colors. It should probably not be modified.
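For reference, here's what HSV-to-RGB conversion looks like using Python's standard colorsys module. This is just the general color-space math, not the shader's actual code:

```python
import colorsys

# colorsys.hsv_to_rgb takes hue, saturation, and value, each in 0.0-1.0,
# and returns an (r, g, b) tuple, also in 0.0-1.0.

# Hue 0.0 at full saturation and value is pure red:
print(colorsys.hsv_to_rgb(0.0, 1.0, 1.0))  # → (1.0, 0.0, 0.0)

# Hue 0.5 (halfway around the color wheel) is cyan:
print(colorsys.hsv_to_rgb(0.5, 1.0, 1.0))  # → (0.0, 1.0, 1.0)
```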

ShaderTextureHSV

Similar to the previous one but for HSV textures. Again, probably shouldn’t modify this one.

ShaderWorley

Draws part of the halo around spaceships. I haven’t dug into the details of this one yet.

ShaderParticlePoints

For drawing particles in GL_POINTS mode. So like ShaderStars it also has gl_PointSize. Will add more later after I experiment with it more.

ShaderBlur

This implements something like a kernel filter blur for full screen blurring (such as when you start the game). I’d say there’s not much point to modifying it but someone will probably prove me wrong.
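If you're curious what a kernel-filter blur means, here's the general idea sketched as a 1-D box blur in Python. This is only an illustration of the concept, not the shader's actual algorithm:

```python
# Each output value is the average of a sample and its neighbors within
# `radius`, which smears sharp edges out into smooth ramps.

def box_blur(values, radius=1):
    """Average each element with its neighbors within `radius`."""
    blurred = []
    for i in range(len(values)):
        lo = max(0, i - radius)
        hi = min(len(values), i + radius + 1)
        window = values[lo:hi]
        blurred.append(sum(window) / len(window))
    return blurred

# A hard edge (0 -> 1) gets softened into a gradient:
print(box_blur([0.0, 0.0, 0.0, 1.0, 1.0, 1.0]))
```

A real 2-D screen blur does the same thing with a square (or separable) window of neighboring pixels in the fragment shader.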

Credits

The shader pipeline diagram and the triangle screenshot are from Joey de Vries at LearnOpenGL[learnopengl.com], used under the terms of the CC BY-NC 4.0 license.

Questions?

There’s a lot to know about shaders and I really only provided enough information to allow some basic experiments.

If you have any questions, just leave a comment below.

Also try the Reassembly Discord modding channel at [link]
