Spite: The Summoning

"A powerful demonic presence has been summoned in the catacombs of an old town. You, the cleric, must descend to the source of the evil and use your divine magic to vanquish the demons that have emerged."

Project: 6

Time: 12 weeks, half time, 240 hours

Reference Game: Diablo III

Group Name: Arcturus Game Studios

Game Engine: Aurora

Level Editor: Unreal Engine & Borealis

Role Overview: Engine, graphics, and editor

Contribution Overview: Debugging and profiling tools, editor, and runtime

Aurora Borealis, our own game engine and editor, was used to create and play the game, while Unreal Engine served as the level designers' main level editor.

Spite: The Summoning is a game based on Diablo III and was our sixth project at TGA. As our last project, Deliverance, was meant to be more of a tech demo and an opportunity to develop the engine, this project would truly put our engine to the test. 

The player plays as a cleric, battling demons while descending deeper into the catacombs to defeat a source of evil, a boss based on Belial, the lord of lies.

We chose to have two enemy types besides the boss. The first is small and has low health. The second is bigger, stronger, and can damage the smaller enemies to regain its own health.


My Role

My role in this project was engine developer, primarily as a graphics programmer. Because time was short during the last project, Deliverance, we never got to build a rendering system that could support the bigger scenes this project needed.


We also felt the need to create a simple editor in which we programmers could more easily debug and test new features, and where the other disciplines could test their assets, scenes, or animations without our help. Since I had already made an editor for my own engine, Epoch, I began working on our editor, Borealis, and have helped maintain, optimize, and add new features to it as I kept working on the engine.


Although we are working on an editor, it is not yet in a state where one could easily make an entire game in it. Unreal Engine therefore remains our primary level editor, but we can create, build, and save scenes through Borealis, which is useful when testing and debugging.



My Contributions

We all wanted a way to visualize and debug things in the editor, which is why I implemented line rendering. I wrote functions that build primitive shapes out of lines, which are later rendered in a single draw call, and made it possible to pick a color per shape. This is used, for example, to visualize light sources, colliders, and camera frustums.
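
The idea can be sketched roughly like this: shapes decompose into lines that accumulate in one buffer, which is then flushed as a single batch. All names here (DebugRenderer, DrawAABB, and so on) are assumptions, not the engine's actual API, and the real Flush would upload a vertex buffer and issue the draw call.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };
struct Color { float r, g, b, a; };
struct Line { Vec3 from; Vec3 to; Color color; };

class DebugRenderer {
public:
    void DrawLine(const Vec3& from, const Vec3& to, const Color& color) {
        myLines.push_back({from, to, color});
    }

    // Axis-aligned box built from 12 lines, e.g. for visualizing a collider.
    void DrawAABB(const Vec3& mn, const Vec3& mx, const Color& color) {
        const Vec3 v[8] = {
            {mn.x, mn.y, mn.z}, {mx.x, mn.y, mn.z},
            {mx.x, mx.y, mn.z}, {mn.x, mx.y, mn.z},
            {mn.x, mn.y, mx.z}, {mx.x, mn.y, mx.z},
            {mx.x, mx.y, mx.z}, {mn.x, mx.y, mx.z}};
        const int e[12][2] = {{0,1},{1,2},{2,3},{3,0},{4,5},{5,6},{6,7},{7,4},
                              {0,4},{1,5},{2,6},{3,7}};
        for (const auto& edge : e)
            DrawLine(v[edge[0]], v[edge[1]], color);
    }

    // In the engine this would upload all queued lines and issue ONE draw
    // call; here it just reports the batch size and clears it.
    std::size_t Flush() {
        const std::size_t count = myLines.size();
        myLines.clear();
        return count;
    }

private:
    std::vector<Line> myLines;
};
```

Batching everything into one buffer is what keeps the cost at a single draw call no matter how many shapes are queued per frame.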

During the development of Deliverance, I found it hard to pinpoint where our frame time was spent, and Visual Studio's built-in performance profiler did not always give me the control I needed. That's why I wrote my own profiler tool. It lets the programmer mark which functions and/or scopes to profile with a macro, and the results can later be inspected in perfetto.dev.
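
A minimal sketch of such a macro-based profiler, assuming an RAII timer and a Chrome trace-event JSON dump (a format perfetto.dev can open); the class and macro names are my own placeholders, not the actual tool:

```cpp
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <string>
#include <vector>

struct ProfileEvent { std::string name; long long startUs; long long durUs; };

class Profiler {
public:
    static Profiler& Get() { static Profiler instance; return instance; }
    void Record(ProfileEvent event) { myEvents.push_back(std::move(event)); }
    const std::vector<ProfileEvent>& Events() const { return myEvents; }

    // Dump as Chrome trace-event JSON ("X" = complete event, times in us),
    // which perfetto.dev can load directly.
    void Dump(std::FILE* out) const {
        std::fprintf(out, "[");
        for (std::size_t i = 0; i < myEvents.size(); ++i)
            std::fprintf(out,
                "%s{\"name\":\"%s\",\"ph\":\"X\",\"pid\":1,\"tid\":1,"
                "\"ts\":%lld,\"dur\":%lld}",
                i ? "," : "", myEvents[i].name.c_str(),
                myEvents[i].startUs, myEvents[i].durUs);
        std::fprintf(out, "]\n");
    }

private:
    std::vector<ProfileEvent> myEvents;
};

// RAII timer: records the elapsed time of its scope on destruction.
class ScopedTimer {
public:
    explicit ScopedTimer(const char* name)
        : myName(name), myStart(std::chrono::steady_clock::now()) {}
    ~ScopedTimer() {
        using us = std::chrono::microseconds;
        const auto end = std::chrono::steady_clock::now();
        Profiler::Get().Record({myName,
            std::chrono::duration_cast<us>(myStart.time_since_epoch()).count(),
            std::chrono::duration_cast<us>(end - myStart).count()});
    }

private:
    const char* myName;
    std::chrono::steady_clock::time_point myStart;
};

// Two-level concat so __LINE__ expands before pasting, giving each
// PROFILE_SCOPE in a function a uniquely named timer variable.
#define PROFILE_CONCAT2(a, b) a##b
#define PROFILE_CONCAT(a, b) PROFILE_CONCAT2(a, b)
#define PROFILE_SCOPE(name) ScopedTimer PROFILE_CONCAT(profileScope, __LINE__)(name)
```

Dropping `PROFILE_SCOPE("Update");` at the top of a function is then enough to get that function's frame cost into the trace.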

Due to the short time we had during the development of Deliverance, our game object and component system wasn't the greatest. I helped rewrite it, and with it the scene system. It was now easier to create a scene, add game objects to it, and attach components to them. Because the rewrite had to be done quickly, since the others depended on it, we made a mistake we wouldn't notice until much later: every time we tried to get a specific component, a component of that type was created.
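
This is a classic `operator[]` pitfall: indexing a map default-inserts a missing entry. The sketch below is a guess at the shape of the bug, not our actual code, and shows the corrected lookup using `find()` so a missing component returns `nullptr` instead of being created:

```cpp
#include <cassert>
#include <memory>
#include <typeindex>
#include <unordered_map>

struct Component { virtual ~Component() = default; };

// Example component, just for illustration.
struct Transform : Component { float x = 0, y = 0, z = 0; };

class GameObject {
public:
    template <typename T>
    T* AddComponent() {
        auto component = std::make_unique<T>();
        T* raw = component.get();
        myComponents[std::type_index(typeid(T))] = std::move(component);
        return raw;
    }

    // The bug: indexing with myComponents[std::type_index(typeid(T))]
    // default-constructs an entry whenever the component is missing.
    // The fix: use find() and return nullptr for a missing component.
    template <typename T>
    T* GetComponent() {
        auto it = myComponents.find(std::type_index(typeid(T)));
        if (it == myComponents.end())
            return nullptr;
        return static_cast<T*>(it->second.get());
    }

private:
    std::unordered_map<std::type_index, std::unique_ptr<Component>> myComponents;
};
```

With `operator[]`, every speculative "does this object have a collider?" query would silently add an empty component, which is exactly the kind of mistake that stays invisible until much later.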

Now that the game was going to be bigger, it would need more assets. We wanted to be able to move these assets within our asset folder if needed, which wasn't supported because every reference to an asset was by file path. Since the whole group depended on a working asset system, Dino Zvonar and I fixed this as fast as possible by referencing assets with UUIDs instead. A UUID is a unique number that we store in a registry together with the corresponding asset path. Now, if an asset is moved outside the editor, we can just update the path in the registry and everything keeps working.
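
A minimal sketch of the idea, assuming a 64-bit handle and an in-memory map (the real registry is serialized to disk, and the names here are placeholders):

```cpp
#include <cassert>
#include <cstdint>
#include <random>
#include <string>
#include <unordered_map>

using AssetHandle = std::uint64_t;

// Placeholder UUID generator; a production version would guarantee
// uniqueness more carefully than a random 64-bit draw.
AssetHandle GenerateUUID() {
    static std::mt19937_64 engine{std::random_device{}()};
    return engine();
}

class AssetRegistry {
public:
    AssetHandle Register(const std::string& path) {
        const AssetHandle id = GenerateUUID();
        myPaths[id] = path;
        return id;
    }

    // Moving an asset on disk only means updating the registry entry;
    // every UUID reference elsewhere stays valid.
    void SetPath(AssetHandle id, const std::string& newPath) {
        myPaths[id] = newPath;
    }

    const std::string& GetPath(AssetHandle id) const {
        return myPaths.at(id);
    }

private:
    std::unordered_map<AssetHandle, std::string> myPaths;
};
```

Because scenes and components store only the handle, the path lookup happens in one place, which is what makes moving assets a one-line registry change.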

As the scenes grew in size and complexity, we noticed a significant increase in frame time. We suspected the rendering, especially the rendering of shadows: to render shadows, meshes have to be rendered again for each light source, which can drastically increase the number of draw calls when there are many lights. To fix this, we needed to render only the necessary geometry.

We started with frustum culling, discarding the meshes outside the view. Then, for each light source, we iterated through the remaining meshes and rendered only those within the light's reach. This caused a problem: large meshes just outside the view no longer cast shadows, making shadows "pop" in and out. By culling against the unsorted list of all meshes instead, we fixed the popping, but frame time went up since each light now had to iterate over more meshes.
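
The per-light test can be sketched as a sphere-vs-range check, assuming each mesh has a bounding sphere and each point light a range (names and types here are illustrative, not the engine's):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };
struct BoundingSphere { Vec3 center; float radius; };
struct PointLight { Vec3 position; float range; };

float Distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Collect indices of meshes whose bounding sphere intersects the light's
// range. Note that we cull against ALL meshes in the scene, not just the
// ones that survived camera frustum culling; otherwise off-screen meshes
// stop casting shadows into view, which is the "popping" described above.
std::vector<std::size_t> CollectShadowCasters(
        const std::vector<BoundingSphere>& allMeshes, const PointLight& light) {
    std::vector<std::size_t> casters;
    for (std::size_t i = 0; i < allMeshes.size(); ++i) {
        const BoundingSphere& mesh = allMeshes[i];
        if (Distance(mesh.center, light.position) <= light.range + mesh.radius)
            casters.push_back(i);
    }
    return casters;
}
```

This trades a cheap per-mesh distance check against the far more expensive cost of actually rendering every mesh into every light's shadow map.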

I created a new project that became our editor. It is through the editor that we build and test our game. For that, the scene has to be reset to the state it was in before entering play mode. This is done by copying the original scene when entering play mode and playing the game in the copy, never altering the original.
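
The flow can be sketched like this, with a deliberately simplified Scene standing in for the real one (the class and member names are assumptions):

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

struct Scene {
    std::vector<std::string> objects;  // simplified stand-in for game objects
};

class Editor {
public:
    explicit Editor(Scene scene) : myEditScene(std::move(scene)) {}

    // Entering play mode deep-copies the scene; gameplay mutates the copy.
    void EnterPlayMode() {
        myPlayScene = std::make_unique<Scene>(myEditScene);
    }

    // Exiting play mode discards the copy; the original was never touched.
    void ExitPlayMode() {
        myPlayScene.reset();
    }

    Scene& ActiveScene() {
        return myPlayScene ? *myPlayScene : myEditScene;
    }

private:
    Scene myEditScene;
    std::unique_ptr<Scene> myPlayScene;
};
```

Anything the game spawns, destroys, or moves during play only ever exists in the copy, so stopping play is just dropping it.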

I also made it possible to create and open projects in our editor. Projects have separate project settings and a separate asset folder. They have been used in this project, in Atlantic Ascent, and by Ozzy Joelsson during his specialization.

We also had to be able to play the actual game without going through the editor, so I created a runtime project, which acts like the editor in play mode. I made it a separate project from the editor because it is easier to develop when it isn't cluttered with editor-only code. The trade-off is that anything needed in both runtime and editor has to be added in two places, which can take some time.

Together with Dino Zvonar, I added object selection by clicking in the viewport. When the meshes are rendered, each one is also rendered to a separate texture in a color derived from a unique ID. When you click in the viewport, a color is sampled from this ID texture and used to identify which game object to select.
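
The ID-to-color mapping can be as simple as packing the entity ID into the color channels; the RGBA8 encoding below is an assumption about the format, not necessarily what our render target uses:

```cpp
#include <cassert>
#include <cstdint>

// A pixel in the ID render target: four 8-bit channels.
struct IdColor { std::uint8_t r, g, b, a; };

// Pack a 32-bit entity ID into the color the mesh is drawn with.
IdColor EncodeId(std::uint32_t id) {
    return { static_cast<std::uint8_t>(id & 0xFF),
             static_cast<std::uint8_t>((id >> 8) & 0xFF),
             static_cast<std::uint8_t>((id >> 16) & 0xFF),
             static_cast<std::uint8_t>((id >> 24) & 0xFF) };
}

// Recover the entity ID from the pixel sampled under the mouse click.
std::uint32_t DecodeId(IdColor c) {
    return static_cast<std::uint32_t>(c.r)
         | (static_cast<std::uint32_t>(c.g) << 8)
         | (static_cast<std::uint32_t>(c.b) << 16)
         | (static_cast<std::uint32_t>(c.a) << 24);
}
```

Because the encoding is lossless, sampling one pixel from the ID texture under the cursor is enough to recover exactly which game object was clicked.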

Because this game has bigger environments than our previous one, we could not use the same technique of rendering shadows over the entire scene; that would be far too performance-intensive. Instead, we had to render shadows only where the player would see them, in front of the camera. This took some time to get right but was manageable because the camera angle in the game never lets the player see far into the distance. Since we didn't need to support distant shadows, I didn't implement the shadow cascading I had initially planned; instead, I settled for what we had, essentially just the first cascade.
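
The core of that single "cascade" is fitting the directional light's orthographic projection around the player instead of around the whole scene. A rough sketch, where the extent value and the XZ ground-plane assumption are mine, not the shipped numbers:

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };

// The footprint of the shadow map's orthographic projection on the
// ground plane (XZ), in world units.
struct OrthoBounds { float left, right, bottom, top; };

// Center the shadow box on the player. 'extent' is the radius the player
// can plausibly see; with the game's fixed top-down camera angle this can
// stay small, which keeps shadow-map resolution dense where it matters.
OrthoBounds FitShadowBox(const Vec3& playerPos, float extent) {
    return { playerPos.x - extent, playerPos.x + extent,
             playerPos.z - extent, playerPos.z + extent };
}
```

Cascaded shadow maps would add further, larger boxes for the distance; since the camera never shows the distance, the first box alone is enough here.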