
Coding the GSB2 Radiation effect (directx9 C++)

I was never 100% happy with the radiation effect in Gratuitous Space Battles 2, so I’ve coded a better version for the next patch. Here it is in very short silent video form:

I thought maybe some people might be interested in how it’s done, at least in theory. Now I’m sure that if this was in 3D in Unity there would already be a 99c plug-in that does it without you even having to understand coding, but that’s not how it’s done here, so here we go…

The first thing to remember is that the spaceship is not a 3D mesh at all. It’s a series of sprites rendered to different render targets and then composited. That means I’m not wrapping a texture around a mesh, but drawing one sprite on top of another and cropping the final output to the alpha channel of the base (ship outline) sprite.

Before I learned much about shaders, I had a system that did this using mostly fixed function. I still use that here, but build on it with better shaders to do more stuff, making it a hybrid approach and fairly complex :D. To simply draw one sprite on top of another, but use the alpha channel from the bottom sprite, I use a custom vertex format that has two sets of texture co-ordinates. One set is for the splatted ‘radiation’ image and the other is for the alpha channel of the ship outline.
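For anyone curious what that looks like in DirectX 9 terms, here is a rough sketch of such a vertex (my own naming and layout, not the actual GSB2 code):

```
#include <d3d9.h>

// Sketch of a pre-transformed sprite vertex carrying two UV channels:
// set 1 samples the radiation splat texture, set 2 samples the ship
// outline whose alpha channel does the cropping.
struct SPLAT_VERTEX
{
    float    x, y, z, rhw;   // screen-space position (already transformed)
    D3DCOLOR diffuse;        // per-sprite colour, used later for fading
    float    tu1, tv1;       // UV set 1: the radiation splat image
    float    tu2, tv2;       // UV set 2: position within the ship outline
};

#define SPLAT_FVF (D3DFVF_XYZRHW | D3DFVF_DIFFUSE | D3DFVF_TEX2)
```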

It gets a bit fiddly because the splat sprite could be drawn anywhere on the ship, so I need to work out the offset and dimensions of the splat in relation to the ship image, store that in the second set of texture UVs, and pass it into a shader.
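As a minimal sketch of that mapping, assuming both rectangles are in the same screen-space coordinates (the names are hypothetical):

```
// Hypothetical sketch: convert a corner of the splat quad into UVs within
// the ship outline texture, so the shader can look up the ship's alpha there.
struct RectF { float left, top, right, bottom; };

void ShipUVForSplatCorner(const RectF& shipRect,        // ship sprite on screen
                          float cornerX, float cornerY, // splat corner on screen
                          float& outU, float& outV)
{
    outU = (cornerX - shipRect.left) / (shipRect.right  - shipRect.left);
    outV = (cornerY - shipRect.top)  / (shipRect.bottom - shipRect.top);
}
```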

The shader then simply reads the color data and alpha data from the radiation splat, and multiplies the alpha by the alpha at that point on the base ship texture, and voila, the image is cropped to the ship.
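In shader terms that boils down to two texture reads and a multiply. A stripped-down pixel shader along these lines (a sketch, not the actual GSB2 shader) would do it:

```
// Sketch only: sample the radiation splat with UV set 1, then crop its alpha
// by the ship outline's alpha sampled with UV set 2.
const char* g_cropShaderSrc = R"(
sampler2D splatTex : register(s0);   // radiation splat image
sampler2D shipTex  : register(s1);   // ship outline, alpha used as a mask

float4 main(float2 uvSplat : TEXCOORD0,
            float2 uvShip  : TEXCOORD1) : COLOR0
{
    float4 splat = tex2D(splatTex, uvSplat);
    splat.a *= tex2D(shipTex, uvShip).a;   // crop to the ship silhouette
    return splat;
}
)";
```

A string like that can be compiled at load time with D3DXCompileShader (or offline with fxc) and bound with SetPixelShader().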

But that’s a very simplistic explanation, because there is more going on. To make things fast, all of my radiation gets drawn together (to minimize texture / render target / state change hassle), so it’s done towards the end of processing. In other words, there is no painter’s algorithm going on, and I need to check my depth buffer against this ship’s Z value to see if the whole splat is obscured by some nearby asteroid or debris. That requires me to pass the screen dimensions into the shader, and also to set the depth buffer as a texture that the shader can read.
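One way that test might look in the pixel shader, assuming the engine’s own depth values live in a readable texture and the ship’s Z arrives as a shader constant (again a sketch, not the real code):

```
// Sketch of a manual depth test. screenDims and shipZ would be set from the
// C++ side as shader constants; the comparison direction depends on how the
// engine stores its depth values.
const char* g_depthTestSnippet = R"(
sampler2D depthTex : register(s2);   // custom depth buffer, readable as a texture
float2 screenDims;                   // back buffer width and height in pixels
float  shipZ;                        // this ship's depth value

float DepthVisibility(float2 pixelPos)   // pixel position, e.g. from VPOS in ps_3_0
{
    float2 depthUV = pixelPos / screenDims;        // screen position -> 0..1 UV
    float storedZ  = tex2D(depthTex, depthUV).r;   // depth already written here
    return (shipZ <= storedZ) ? 1.0f : 0.0f;       // 1 = visible, 0 = obscured
}
)";
```

The returned value can then just be multiplied into the splat’s alpha rather than used as a hard reject, which suits sprites with soft alpha edges.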

So that gets us a nicely Z-cropped image that wraps around the ship at any position. It also fades in and out, so we need to pass in a diffuse color for the splat sprite and calculate that too. We also have to do all this twice: once for the splat, and once for the slightly brighter and more intense lightmap version, which allows the radiation effect to slightly light up areas of the ship hull during final composition.
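The fade itself can be as simple as a ramp over the effect’s lifetime, written into the alpha of that diffuse colour (the timings below are made up for illustration):

```
// Hypothetical fade curve: ramp in, hold, ramp out. The result goes into the
// vertex diffuse alpha used by both the normal splat pass and the brighter
// lightmap pass.
float FadeAlpha(float life)    // life: 0 = just spawned, 1 = about to expire
{
    const float fadeIn  = 0.15f;
    const float fadeOut = 0.25f;
    if (life < fadeIn)          return life / fadeIn;
    if (life > 1.0f - fadeOut)  return (1.0f - life) / fadeOut;
    return 1.0f;
}
```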

At this point the shader is a bit complex…and there is more…

I wanted the radiation to ‘spread’, and to do that I need to splat the whole thing at a fixed size, but gently reveal it expanding outwards. To do this I create yet another texture (the spread mask), which also gets passed to the shader. There were various ways to achieve this next bit, but what I did was to ‘lie’ to the shader about the current UV positions when sampling this spread mask. Basically I calculate and pass in a ‘progress’ value for the spread, and I use that, inverted, to scale up the UV range the mask is sampled over, which shrinks its apparent size (the texture is CLAMPed). So effectively my first UVs are -4,-4,4,4, and the spread mask shows as a small circle in the center of the splat, expanding outwards to full size.
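As a rough sketch (my own numbers and names), that amounts to scaling the spread-mask UVs about their centre by the inverse of the progress value:

```
// Sketch of the UV 'lie': early on the mask is sampled over a huge UV range,
// so with CLAMP addressing only a small central circle of it is visible; as
// progress approaches 1 the range shrinks back to 0..1 and the full mask is
// revealed.
void SpreadMaskUVs(float progress, float& uvMin, float& uvMax)
{
    if (progress < 0.05f) progress = 0.05f;   // guard against divide-by-zero
    const float scale = 1.0f / progress;      // the 'inverted' progress
    uvMin = 0.5f - 0.5f * scale;              // progress 0.125 -> uvMin -3.5
    uvMax = 0.5f + 0.5f * scale;              // progress 0.125 -> uvMax  4.5
}
```

With the sampler’s address mode set to D3DTADDRESS_CLAMP, everything outside the 0..1 range just repeats the mask’s border texels, so nothing leaks in from outside.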

Because I’m only doing this when I sample the alpha channel from the spread mask, the color and alpha data from the base ‘splat’ texture remains where it is, so it looks like a static image being revealed by an expanding circular (but gaussian-blurred) mask.
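Inside the shader that means only the spread-mask lookup uses those exaggerated UVs; a sketch of how the two samples might combine:

```
// Sketch: the splat keeps its normal UVs, only the spread mask uses the
// exaggerated UVs, so the radiation image stays put while the reveal expands.
const char* g_spreadSnippet = R"(
sampler2D splatTex  : register(s0);   // radiation splat (static)
sampler2D spreadTex : register(s4);   // gaussian-blurred circular mask, CLAMPed

float4 SampleSpreadingSplat(float2 uvSplat, float2 uvSpread)
{
    float4 splat  = tex2D(splatTex, uvSplat);      // colour and alpha stay put
    float  reveal = tex2D(spreadTex, uvSpread).a;  // expanding circular mask
    splat.a *= reveal;
    return splat;
}
)";
```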

I’m pretty pleased with it. The tough bit I’ll leave for you to work out is how that works when the source radiation image is actually a texture atlas of different radiation splats :D

Fun fun fun.

4 thoughts on “Coding the GSB2 Radiation effect (directx9 C++)”

  1. I’m confused… why do you need to do manual depth buffer checks in the shader? On OpenGL you’d just write glEnable(GL_DEPTH_TEST), doesn’t DirectX have something equivalent?

    1. Yeah… it’s not as simple as it sounds, but basically I have my own depth buffer. I’m not doing conventional z-buffer stuff, because I have soft-alphaed edges on my sprites and don’t use a traditional accept/reject z-sorting algorithm.
      When you have meshes and a pixel is either on or off, z-buffers are great, but when you are layering sprites, especially with alpha, you get horrid lines and jaggies.

  2. You could do a (soft) threshold test against a radial gradient mask to do the reveal. This would give you 1) non-circular reveals (if desired), and 2) consistent soft-edge width (the scaled mask is sharper at the start than the end).

    1. You mean replace the final texture lookup with code inside the shader? I did consider that, but I liked the flexibility of visually composing the gradient in Photoshop.
