Atlantean Ascent

"You're a Fisherman trapped in an underwater metropolis. 

Harpoon through the fish monsters and get to the surface."

TRAILER COMING SOON

Project: 7

Time: 14 weeks, half time, 280 hours

Reference Game: Remnant II

Group Name: Arcturus Game Studios

Game Engine: Aurora

Level Editor: Unreal Engine & Borealis

Role Overview: Engine, graphics, and editor

Contribution Overview: Physics, visual scripting, and updated renderer

Aurora Borealis, our own game engine and editor, was used to create and play the game, while Unreal Engine was used as the main level editor by the level designers.

Atlantean Ascent is the seventh game project at TGA and is still a work in progress. It is a third-person shooter with Remnant II as its reference game, and it has proven even more demanding of our engine than our previous project, Spite: The Summoning.

The player takes the role of a fisherman lost in an Atlantean city filled with hostile humanoid fish-people. With only his harpoon gun to protect himself, the fisherman must find his way out.

As we still had work left on Spite, we chose to include only one enemy type at first, a melee enemy, and planned to add a second, ranged enemy if time allowed. This unfortunately made things difficult for our level designers, who had to account for the possibility of a new enemy type while building their levels, without being able to test against it.


My Role

I, together with David Nilsson, chose to implement node scripting and a physics system, both of which were requirements for this project.

As the player in this game can move the camera freely, the frustum culling implemented during the last project would not suffice. It was up to me to find a solution where large parts of the scene could be rendered at once at a decent frame time.


I also worked on our editor, Borealis. As a group, we wanted it to reach a state where we wouldn't need Unreal Engine anymore. My focus was on improving usability and adding quality-of-life features.

For this project, we settled on using Unreal as the level editor and then using Borealis to add details and decoration, fix lighting, and add gameplay features. We did this because it was difficult to get all the information our engine needed out of Unreal Engine. This workflow led to difficulties when iterating on the levels: if a level had to be modified in Unreal, all changes made in Borealis were lost. We hope to use Borealis exclusively during our next project so that this won't be a problem.


My Contributions

A render pipeline state is an easily modifiable object that specifies how a render stage is executed; for example, the geometry state sets up and prepares everything needed before all the meshes in the scene are rendered. Before render pipeline states were introduced, we had to manually prepare everything before each stage of rendering a frame. This made the scene renderer very difficult to read and further develop, as there were many states to keep track of.
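As an illustration of the idea, a pipeline state can be a plain object that bundles everything a stage needs and binds it with a single call. This is a minimal sketch with hypothetical type names, not our engine's exact API:

```cpp
#include <vector>

// Illustrative stand-ins for engine types; the real ones differ in detail.
enum class BlendState { Opaque, AlphaBlend, Additive };
enum class DepthState { ReadWrite, ReadOnly, Disabled };
enum class RasterState { BackfaceCull, NoCull, Wireframe };
struct VertexShader {};
struct PixelShader {};
struct RenderTarget {};

struct GraphicsContext
{
    void SetShaders(VertexShader*, PixelShader*) {}
    void SetBlendState(BlendState) {}
    void SetDepthState(DepthState) {}
    void SetRasterState(RasterState) {}
    void SetRenderTargets(const std::vector<RenderTarget*>&) {}
};

// A pipeline state bundles everything one render stage needs, so the
// scene renderer binds it with a single call instead of setting every
// piece of state by hand before each stage.
struct PipelineState
{
    VertexShader* vertexShader = nullptr;
    PixelShader*  pixelShader  = nullptr;
    BlendState    blendState   = BlendState::Opaque;
    DepthState    depthState   = DepthState::ReadWrite;
    RasterState   rasterState  = RasterState::BackfaceCull;
    std::vector<RenderTarget*> renderTargets;

    void Bind(GraphicsContext& context) const
    {
        context.SetShaders(vertexShader, pixelShader);
        context.SetBlendState(blendState);
        context.SetDepthState(depthState);
        context.SetRasterState(rasterState);
        context.SetRenderTargets(renderTargets);
    }
};
```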

We already had both mesh and light culling, but it wasn't enough. As our games grew bigger and we had to be able to look at more of the scene at once, we needed to reduce the number of draw calls even further. Mesh instancing is when you render the same geometry multiple times in a single draw call. I made this happen automatically for every object submitted for rendering by first culling the mesh if it's not in view, and then adding the instance data (position, rotation, and scale) to a list per mesh type.
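Conceptually, the submission path buckets visible instances per mesh and then issues one instanced draw per bucket. A simplified sketch, with hypothetical engine types:

```cpp
#include <unordered_map>
#include <vector>

// Illustrative engine types; names are hypothetical.
struct Matrix4x4 { float m[16]; }; // position, rotation and scale combined
struct Mesh {};
struct Frustum
{
    bool Intersects(const Mesh&, const Matrix4x4&) const { return true; }
};

struct InstanceData { Matrix4x4 transform; };

// One bucket per mesh: every visible instance of the same geometry
// lands in the same list and is later drawn in a single call.
std::unordered_map<const Mesh*, std::vector<InstanceData>> gInstanceBuckets;

void Submit(const Mesh* mesh, const Matrix4x4& transform, const Frustum& view)
{
    // Cull first: a mesh outside the view contributes no instance.
    if (!view.Intersects(*mesh, transform))
        return;

    gInstanceBuckets[mesh].push_back({transform});
}

// Hypothetical helper that uploads the instance list to a GPU buffer
// and issues one instanced draw call for the mesh.
void DrawInstanced(const Mesh*, const std::vector<InstanceData>&) {}

void RenderInstanced()
{
    for (auto& [mesh, instances] : gInstanceBuckets)
    {
        if (!instances.empty())
            DrawInstanced(mesh, instances);
        instances.clear(); // buckets are rebuilt every frame
    }
}
```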

One of the requirements for this project was to implement node scripting so that our level designers could prototype and script the game's events. The school provided us with a working node scripting solution called Munin Graph, based on ImGui Node Editor, as part of our course on node scripting. As David Nilsson and I chose to use our own engine in that course, Munin Graph was already working when the project started. Together with our technical artists, we have implemented many useful nodes.
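Munin Graph's actual API is the school's, so the following is purely illustrative, but a gameplay node in this kind of system generally comes down to reading input pins, doing the work, and writing output pins:

```cpp
#include <string>
#include <unordered_map>

// Purely illustrative; Munin Graph's actual API differs.
struct NodeContext
{
    // Pin values keyed by pin name, simplified to floats here.
    std::unordered_map<std::string, float> inputs;
    std::unordered_map<std::string, float> outputs;
};

// Base class a gameplay node derives from: the graph calls Execute
// when execution flow reaches the node.
class ScriptNode
{
public:
    virtual ~ScriptNode() = default;
    virtual void Execute(NodeContext& context) = 0;
};

// Example node exposed to level designers: deal damage to a target.
class DealDamageNode : public ScriptNode
{
public:
    void Execute(NodeContext& context) override
    {
        const float targetId = context.inputs["TargetId"];
        const float amount   = context.inputs["Amount"];
        ApplyDamage(static_cast<int>(targetId), amount);
        context.outputs["Applied"] = amount;
    }

private:
    // Stand-in for a call into gameplay code.
    static void ApplyDamage(int /*entityId*/, float /*amount*/) {}
};
```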

The project was required to utilize a third-party physics engine such as PhysX, so that's what we chose. As we integrated PhysX, we ran into some problems, and getting the engine runnable again took a few days. Sadly, I fell sick during this time and had to work from home with David via Discord. Fortunately, it wasn't too hard to get everything working: within two days we had implemented static and dynamic colliders. I later implemented character controllers, ray and sphere casts, and more.
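A rough sketch of the PhysX setup behind static and dynamic colliders, using PhysX's actual API (the engine integration around it is simplified away):

```cpp
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator gAllocator;
static PxDefaultErrorCallback gErrorCallback;

PxScene* CreatePhysicsScene(PxPhysics*& outPhysics)
{
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    outPhysics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation,
                                 PxTolerancesScale());

    PxSceneDesc sceneDesc(outPhysics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;
    return outPhysics->createScene(sceneDesc);
}

void AddTestActors(PxPhysics* physics, PxScene* scene)
{
    PxMaterial* material = physics->createMaterial(0.5f, 0.5f, 0.1f);

    // Static collider: an infinite ground plane.
    PxRigidStatic* ground =
        PxCreatePlane(*physics, PxPlane(0.0f, 1.0f, 0.0f, 0.0f), *material);
    scene->addActor(*ground);

    // Dynamic collider: a 1x1x1 m box dropped from above.
    PxRigidDynamic* box =
        PxCreateDynamic(*physics, PxTransform(PxVec3(0.0f, 10.0f, 0.0f)),
                        PxBoxGeometry(0.5f, 0.5f, 0.5f), *material, 10.0f);
    scene->addActor(*box);
}

void StepPhysics(PxScene* scene, float deltaSeconds)
{
    scene->simulate(deltaSeconds); // kick off the simulation step
    scene->fetchResults(true);     // block until results are ready
}
```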

When I implemented object picking (selection) in the viewport, I unfortunately overlooked objects without meshes, which the user could not select without finding them in the hierarchy. I fixed this by rendering an icon (gizmo) at each such object's position, similar to Unity and probably most 3D engines, making the object clickable and selectable.
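One common way to make both meshes and icons pickable, and the rough idea here, is to render every selectable object's ID into an offscreen buffer and read back the pixel under the cursor. A simplified sketch with hypothetical names:

```cpp
#include <cstdint>
#include <vector>

// Illustrative engine types; names are hypothetical.
struct Vec3 { float x, y, z; };
struct Object { std::uint32_t id; Vec3 position; bool hasMesh; };

struct IdRenderTarget
{
    void Clear() {}
    void DrawMeshId(const Object&) {}     // rasterize mesh with id as color
    void DrawIconQuadId(const Object&) {} // camera-facing icon quad with id
    std::uint32_t ReadPixel(int, int) const { return 0; } // CPU read-back
};

// Every selectable object writes its id into the buffer, so objects
// without meshes become clickable through their icon quads (gizmos).
std::uint32_t PickObject(IdRenderTarget& target,
                         const std::vector<Object>& objects,
                         int mouseX, int mouseY)
{
    target.Clear();
    for (const Object& object : objects)
    {
        if (object.hasMesh)
            target.DrawMeshId(object);
        else
            target.DrawIconQuadId(object);
    }
    return target.ReadPixel(mouseX, mouseY); // 0 means nothing was hit
}
```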

To make our UI code easier to write and better looking with less code, I wrote a wrapper for ImGui. As we already had a lot of UI in our editor, it has taken some time to migrate everything to the new wrapper.
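As an example of what such a wrapper can buy you, an RAII scope around ImGui::Begin and ImGui::End guarantees the end call even on early returns. This is a minimal sketch in that spirit, not our wrapper's exact API:

```cpp
#include <imgui.h>

// RAII scope: ImGui::End is always called, even on early returns,
// which hand-written Begin/End pairs easily get wrong.
class ScopedWindow
{
public:
    explicit ScopedWindow(const char* title, bool* open = nullptr,
                          ImGuiWindowFlags flags = 0)
        : myIsVisible(ImGui::Begin(title, open, flags))
    {
    }

    ~ScopedWindow() { ImGui::End(); }

    // False while the window is collapsed; skip building its contents.
    explicit operator bool() const { return myIsVisible; }

private:
    bool myIsVisible;
};

void DrawInspector()
{
    if (ScopedWindow window{"Inspector"})
    {
        ImGui::Text("Selected object settings go here.");
    }
}
```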

(Pictures comparing old and new)

Before Dino Zvonar and I made this material editor, material settings were exposed in the inspector as part of the mesh and skinned mesh renderer components. This made it difficult to modify materials, as the inspector became quite cluttered and you had to scroll to see some settings. I helped structure the editor and get the preview rendering working, and the editor is built with the ImGui wrapper I wrote.

(Pictures comparing old and new)

As the engine and editor keep developing with new and improved features, crashes can become very common, since there is not enough time to properly test every feature. When the other disciplines started using our editor more and more, they sometimes lost hours of work to crashes because their changes hadn't been saved. To fix this, I implemented autosaving. Autosaves are stored in a separate file alongside the original scene file and are loaded automatically when a scene is opened. These files can easily be deleted if you don't want to keep the changes.

The autosave system itself later became one of those untested, crash-causing features. This was because I didn't check whether the current scene was in our engine's internal scene format rather than the format we get from our level designers when they export from Unreal.

I fixed the problem, and autosaves have since saved many users' work that would otherwise have been lost.
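The flow, including the format check that fixed the crash, boils down to something like this sketch (names are hypothetical):

```cpp
#include <filesystem>

// Illustrative sketch; names are hypothetical.
struct Scene
{
    std::filesystem::path path;

    // False for scenes still in the Unreal-exported format.
    bool IsInternalFormat() const { return true; }
    void SaveTo(const std::filesystem::path&) const {}
};

class Autosaver
{
public:
    void Update(const Scene& scene, float deltaSeconds)
    {
        myTimer += deltaSeconds;
        if (myTimer < mySaveInterval)
            return;
        myTimer = 0.0f;

        // The fix: only autosave scenes in our internal scene format.
        // Unreal-exported scenes crashed the first version of this code.
        if (!scene.IsInternalFormat())
            return;

        // Autosaves live in a separate file next to the scene file,
        // so they can simply be deleted to discard the changes.
        std::filesystem::path autosavePath = scene.path;
        autosavePath += ".autosave";
        scene.SaveTo(autosavePath);
    }

private:
    float myTimer = 0.0f;
    float mySaveInterval = 300.0f; // assumed interval in seconds
};
```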

When it was time for us to add UI to our game, I wanted to make that process easier. For the last two projects we had written game-specific UI that had to be rewritten each time, and since we had two more projects ahead of us, including this one, I wanted to write a UI system that could be used in both.

Users can create UI elements in our editor and save them as part of the scene. Images were quite easy to implement, as they are just textures displayed on the screen, but buttons took some time because I wanted to be able to define what happens when they are pressed. I added a drop-down of predefined functions that can be executed with custom parameters. The user can also specify a scale factor and a color for the button while it is hovered. As Amanda Nilsson wanted to create the menus for the games, I showed her how to add more button functions, and she is currently working on adding sliders, drop-downs, and more.
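The button binding can be as simple as a name-to-function registry that the editor's drop-down lists; adding a new behavior is then a single registration. A sketch with hypothetical names:

```cpp
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

// Illustrative sketch; names are hypothetical.
struct Color { float r = 1, g = 1, b = 1, a = 1; };

// Registry of predefined functions the editor drop-down lists; each
// takes the custom parameters authored on the button in the editor.
using ButtonFunction = std::function<void(const std::vector<std::string>&)>;
std::unordered_map<std::string, ButtonFunction> gButtonFunctions;

struct Button
{
    std::string functionName;            // picked from the drop-down
    std::vector<std::string> parameters; // custom parameters
    float hoverScale = 1.1f;             // scale factor while hovered
    Color hoverColor{0.8f, 0.8f, 1.0f};  // tint while hovered

    void OnPressed() const
    {
        auto it = gButtonFunctions.find(functionName);
        if (it != gButtonFunctions.end())
            it->second(parameters);
    }
};

// Adding a new menu behavior is a single registration.
void RegisterBuiltInFunctions()
{
    gButtonFunctions["LoadScene"] = [](const std::vector<std::string>& params)
    {
        // e.g. params[0] would be the scene to load (hypothetical).
    };
}
```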