When I was looking for something to do as my specialization, I found a presentation by Ubisoft showcasing their implementation of volumetric fog in Assassin's Creed IV: Black Flag. They used compute shaders and 3D textures to simulate atmospheric scattering. This grabbed my attention: it was an interesting way to learn about compute shaders, which I had never used, and the technique looked good while staying performant.
My reference (Assassin's Creed IV: Black Flag)
My plan was to get it working with just a directional light, no point lights or spotlights, due to the short time I had. Despite the time limitations, I chose to work in Aurora, the engine I have helped develop from scratch, instead of an engine like Unity or Unreal.
I chose Aurora because I thought the fog would look good in our final project at TGA, and because I wanted to learn more about DirectX 11.
The first thing I needed before I got started was for the engine to support compute shaders and 3D textures. David Nilsson, a friend of mine, also planned on using compute shaders, so we worked together to implement them in Aurora.
Getting compute shaders, 3D textures, and structured buffers (a new kind of buffer David needed for his project) to a state where they could be created in the engine didn't take much time, but we later realized that a texture needs a UAV (Unordered Access View) before a compute shader can write to it. Since we had never used this kind of view before, we took a step back and read up on the different resource types and views in DirectX.
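To make that concrete, here is a minimal D3D11 sketch of creating a 3D texture that a compute shader can write to. The dimensions, format, and names are placeholders rather than Aurora's actual code; the key parts are the UNORDERED_ACCESS bind flag and the UAV created from the texture.

```cpp
#include <d3d11.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Placeholder fog volume resources; not Aurora's actual members.
ComPtr<ID3D11Texture3D> fogTexture;
ComPtr<ID3D11UnorderedAccessView> fogUav; // lets a compute shader write to it
ComPtr<ID3D11ShaderResourceView> fogSrv;  // lets later passes read from it

HRESULT CreateFogVolume(ID3D11Device* device)
{
    D3D11_TEXTURE3D_DESC desc = {};
    desc.Width = 160;   // placeholder resolution
    desc.Height = 90;
    desc.Depth = 64;
    desc.MipLevels = 1;
    desc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT;
    desc.Usage = D3D11_USAGE_DEFAULT;
    // UNORDERED_ACCESS is what allows compute shader writes;
    // SHADER_RESOURCE lets the texture be sampled afterwards.
    desc.BindFlags = D3D11_BIND_UNORDERED_ACCESS | D3D11_BIND_SHADER_RESOURCE;

    HRESULT hr = device->CreateTexture3D(&desc, nullptr, &fogTexture);
    if (FAILED(hr))
        return hr;

    hr = device->CreateUnorderedAccessView(fogTexture.Get(), nullptr, &fogUav);
    if (FAILED(hr))
        return hr;

    return device->CreateShaderResourceView(fogTexture.Get(), nullptr, &fogSrv);
}
```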
Finally, we could create structured buffers as well, and it was time to see whether anything we had built over the last two weeks actually worked.
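As an aside: a structured buffer in D3D11 is an ordinary buffer created with a structure stride and a misc flag. A minimal sketch, with a placeholder element type and SRV-only usage rather than David's actual setup:

```cpp
#include <d3d11.h>

// Placeholder element type; the real contents depend on the use case.
struct Element
{
    float values[4];
};

HRESULT CreateStructuredBuffer(ID3D11Device* device,
                               UINT elementCount,
                               ID3D11Buffer** outBuffer,
                               ID3D11ShaderResourceView** outSrv)
{
    D3D11_BUFFER_DESC desc = {};
    desc.ByteWidth = sizeof(Element) * elementCount;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    // These two fields are what make the buffer "structured".
    desc.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
    desc.StructureByteStride = sizeof(Element);

    HRESULT hr = device->CreateBuffer(&desc, nullptr, outBuffer);
    if (FAILED(hr))
        return hr;

    // Structured buffers are viewed with DXGI_FORMAT_UNKNOWN.
    D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format = DXGI_FORMAT_UNKNOWN;
    srvDesc.ViewDimension = D3D11_SRV_DIMENSION_BUFFER;
    srvDesc.Buffer.FirstElement = 0;
    srvDesc.Buffer.NumElements = elementCount;

    return device->CreateShaderResourceView(*outBuffer, &srvDesc, outSrv);
}
```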
To test that everything worked as expected, we stored a shifting color value in a 3D texture via a compute shader and then tinted the entire game with that color. It worked.
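The compute shader itself just wrote a time-based color into every texel; the C++ side of the test looked roughly like this sketch (the 8x8x8 thread-group size is an assumption and has to match the shader's [numthreads] declaration):

```cpp
#include <d3d11.h>

void RunTestPass(ID3D11DeviceContext* context,
                 ID3D11ComputeShader* shader,
                 ID3D11UnorderedAccessView* uav)
{
    context->CSSetShader(shader, nullptr, 0);
    context->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);

    // One thread per texel of the 160x90x64 volume, rounded up to whole groups.
    const UINT groupsX = (160 + 7) / 8;
    const UINT groupsY = (90 + 7) / 8;
    const UINT groupsZ = (64 + 7) / 8;
    context->Dispatch(groupsX, groupsY, groupsZ);

    // Unbind the UAV so the texture can be bound as an SRV in later passes.
    ID3D11UnorderedAccessView* nullUav = nullptr;
    context->CSSetUnorderedAccessViews(0, 1, &nullUav, nullptr);
}
```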
(I will not show the video we took, as the rapidly flashing colors could trigger photosensitive seizures.)
Since we had recently implemented render pipeline states while working on Atlantean Ascent, we wanted them to support compute shaders as well. We later separated them into a render pipeline state and a new compute pipeline state, as the two didn't really have anything to do with one another.
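Conceptually, the split ended up looking something like this simplified sketch (Aurora's real pipeline states track more than this):

```cpp
#include <d3d11.h>
#include <vector>

// State the rasterization path needs.
struct RenderPipelineState
{
    ID3D11VertexShader* vertexShader = nullptr;
    ID3D11PixelShader* pixelShader = nullptr;
    ID3D11InputLayout* inputLayout = nullptr;
    ID3D11BlendState* blendState = nullptr;
    ID3D11DepthStencilState* depthStencilState = nullptr;
    ID3D11RasterizerState* rasterizerState = nullptr;
};

// State a compute dispatch needs: just the shader and its resources.
struct ComputePipelineState
{
    ID3D11ComputeShader* computeShader = nullptr;
    std::vector<ID3D11ShaderResourceView*> inputs;
    std::vector<ID3D11UnorderedAccessView*> outputs;
};
```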
It was now time for me to start working on the volumetric fog. I had looked through the presentation earlier, but when I read it more thoroughly I realized I would need some code examples. As mentioned, I had never used compute shaders or done any volumetrics before.
I found a Unity implementation on GitHub that uses the same presentation as a reference, and as I read through it I realized how much work it would take to get everything working.
Three shaders were needed: two compute shaders and one pixel shader. The first compute shader calculates the density and color of the fog and stores them in a 3D texture. The second ray marches through this texture, accumulating the data. The pixel shader then applies the fog over the already rendered scene based on the accumulated data.
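Put together, a frame runs the three passes roughly in this order. This is only a sketch of the structure, with placeholder resource names and assumed group counts (8x8x8 groups over a 160x90x64 volume, and 8x8 groups for the ray march); the interesting math lives inside the shaders themselves.

```cpp
#include <d3d11.h>

// Placeholder bundle of the fog passes' shaders and views.
struct FogPasses
{
    ID3D11ComputeShader* densityShader = nullptr;  // pass 1
    ID3D11ComputeShader* raymarchShader = nullptr; // pass 2
    ID3D11PixelShader* applyShader = nullptr;      // pass 3

    ID3D11UnorderedAccessView* densityUav = nullptr; // written by pass 1
    ID3D11ShaderResourceView* densitySrv = nullptr;  // read by pass 2
    ID3D11UnorderedAccessView* scatterUav = nullptr; // written by pass 2
    ID3D11ShaderResourceView* scatterSrv = nullptr;  // read by pass 3
};

void RenderVolumetricFog(ID3D11DeviceContext* ctx, const FogPasses& fog)
{
    ID3D11UnorderedAccessView* nullUav = nullptr;
    ID3D11ShaderResourceView* nullSrv = nullptr;

    // Pass 1: compute per-cell density and lit fog color into the 3D texture.
    ctx->CSSetShader(fog.densityShader, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &fog.densityUav, nullptr);
    ctx->Dispatch(20, 12, 8);
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUav, nullptr);

    // Pass 2: ray march along the depth axis, accumulating scattering and
    // transmittance into a second 3D texture (one thread per XY cell).
    ctx->CSSetShader(fog.raymarchShader, nullptr, 0);
    ctx->CSSetShaderResources(0, 1, &fog.densitySrv);
    ctx->CSSetUnorderedAccessViews(0, 1, &fog.scatterUav, nullptr);
    ctx->Dispatch(20, 12, 1);
    ctx->CSSetShaderResources(0, 1, &nullSrv);
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUav, nullptr);

    // Pass 3: fullscreen pixel shader samples the accumulated texture at the
    // scene depth and blends the fog over the already rendered image.
    ctx->PSSetShader(fog.applyShader, nullptr, 0);
    ctx->PSSetShaderResources(0, 1, &fog.scatterSrv);
    // ... draw a fullscreen triangle here (engine-specific).
}
```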
This was my result...
My results so far
Since then, I haven't had the time to work on the project as my personal life got in the way, and when I got back I had to focus on this website and the school projects.
I plan to keep working on this until it works. Sadly we may not get to utilize it in our last project at TGA, but I have at least learned a lot.