In the first part of this series I talked about the Renderer and what you need to have for a simple Game Engine for the kind of games a solo hobby dev will be making.
In this article I’ll go through my own engine’s parts and explain all the systems and what is involved in them. Hopefully, this will provide you with some inspiration or at least some knowledge, so that you can start on your own.
This is what my engine looks like.
Now I’ll go through each of these components and its sub-components and explain what is happening in each one.
This is the underlying hardware on the computer running the game.
3rd Party SDKs, APIs, and Libraries
The SDKs and APIs are things that I can’t do without. The Windows API is not something that you can code yourself. The OpenGL API and the OpenAL Soft library are others where I don’t want to handle such low-level code (and likely couldn’t). The C++ Standard Template Library, FreeType2, and Ogg/Vorbis are all libraries that would be an extremely massive undertaking to try and code myself, so I don’t.
One day I would be interested in having a crack at some of these items, but they’re all very low-level stuff that would take me years of development to come close to.
The more third-party SDKs and libraries you include, the less code you need to write yourself. So you need to decide here how much of this lower-level stuff you want to do. Personally, I found a lot of joy doing these parts, but it also prevented me from making any games a lot earlier than I could have otherwise.
Loaders and Wrappers
I haven’t used an open-source OpenGL loading library; instead I’ve loaded and set all my own defines according to the OpenGL Specification. I did this because I was interested in how it worked, and I also wanted to keep reliance on 3rd party code to a minimum.
I’ve also wrapped a lot of common Windows API-based tasks into a small series of wrapper functions. The Windows API usually involves a bunch of C-like structs that you need to populate, but most of those values will be static for my purposes of opening a window or getting .png data from a file; so I wrapped it all up.
This has the added benefit of allowing me, in the distant future, to support other Operating Systems. Just use the pre-processor to switch out for different OS’s within these wrapper functions.
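As a hypothetical sketch of that idea (the names here are mine, not the engine's), a wrapper can hide the platform branch behind one ordinary function, so the rest of the engine never sees an `#ifdef`:

```cpp
// Sketch: the pre-processor picks the OS-specific code *inside* the
// wrapper, so callers stay platform-agnostic. Hypothetical names.
#include <cassert>
#include <string>

// Returns the native path separator for the platform we compiled on.
char NativePathSeparator()
{
#ifdef _WIN32
    return '\\';
#else
    return '/';
#endif
}

// Engine code calls this and never branches on the OS itself.
std::string JoinPath(const std::string& dir, const std::string& file)
{
    return dir + NativePathSeparator() + file;
}
```

Supporting another OS then means touching only the bodies of these wrapper functions, not every call site.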
You see the Messaging system spans across several of the other systems. That’s because the Messaging system is implemented as a static class. Any system can access it.
Access to the Messaging system allows a component to:
- Register a new `MessageType`, which is an identifier that other systems can use to listen to these “types” of messages. For example, a “ball bounce” might be a message, or “player dead” might be a message. Other systems can only make use of the “ball bounce” message if they’re aware it exists.
- Create a new `Message` of any type
- Post this `Message` to the `Postmaster` (which is the static class)
- Subscribe to `Message`s of any type

For its part, the `Postmaster`, when it is updated in the game loop, will post each message to all subscribers. This allows the systems to communicate with each other without worrying about whether the other system exists or not.
So the higher-level physics code can post a “ball bounce” message and it simply doesn’t care if anyone else is doing anything with that message. The audio system listens for “ball bounce” messages and plays a sound each time, but it doesn’t care where that message came from. Everything is decoupled that way.
The messaging system also takes responsibility for translating Windows messages into messages that the game and engine are interested in, such as key-down and mouse messages.
My User Interface library relies heavily on this system.
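A minimal sketch of how such a static Postmaster could look, with hypothetical names and a simplified Message type (the real engine's interface will differ):

```cpp
// Sketch of a static publish/subscribe Postmaster. Hypothetical names;
// the real engine registers MessageTypes rather than using raw strings.
#include <cassert>
#include <functional>
#include <map>
#include <string>
#include <vector>

struct Message {
    std::string type;     // e.g. "ball bounce"
    int         payload;  // whatever data the message carries
};

class Postmaster {
public:
    using Handler = std::function<void(const Message&)>;

    // Any system can subscribe to a type without knowing who posts it.
    static void Subscribe(const std::string& type, Handler h) {
        subscribers[type].push_back(std::move(h));
    }

    // Posting only queues the message; the poster never sees listeners.
    static void Post(Message m) { queue.push_back(std::move(m)); }

    // Called once per game-loop tick: deliver everything queued so far.
    static void Update() {
        std::vector<Message> pending;
        pending.swap(queue);  // handlers may safely post new messages
        for (const Message& m : pending)
            for (const Handler& h : subscribers[m.type])
                h(m);
    }

private:
    static std::map<std::string, std::vector<Handler>> subscribers;
    static std::vector<Message> queue;
};

std::map<std::string, std::vector<Postmaster::Handler>> Postmaster::subscribers;
std::vector<Message> Postmaster::queue;
```

Here the physics code would call `Postmaster::Post(...)` when the ball bounces, and the audio system would have subscribed earlier; neither knows the other exists.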
This is everything needed to draw things on the screen. Pretty basic. But it also contains Special Effects that can be applied to individual draw calls, drawing to a texture, animations, and textures.
These are the various overloads of functions to draw to the current render target. The target could be the back-buffer, or it could be another texture.
Drawing textures, shapes, and text is all handled here. This is where the calls to OpenGL functions are mostly found.
Depending on the draw call, some take a list of Special FX. These change the way the draw call is made. Drawing everything to a texture, and then drawing that texture to the screen, lets me render special FX across the entire view.
Each of these is handled by case statements in the Vertex or Fragment shaders. I know that branching statements slow down shaders, but I’m not doing anything exciting, so this performance impact is negligible for me.
Even though the Windows API wrappers exist as part of the Utilities library, the window is really needed to add a render context for OpenGL. So the window management exists in the rendering component of my engine.
Texture is either the data from a loaded image, or the OpenGL identifier for a texture in graphics memory.
Usually the game doesn’t handle Texture objects directly, but instead uses Resource Handles from the Resource Manager.
Animations are exactly what you think they are. A series of Texture Resources with a frame-time in between them.
Like textures, these are usually handled as Resource Handles.
This is a fairly simple library that acts mostly as a wrapper around the relevant OpenAL and Ogg/Vorbis calls.
It does however expose audio data/files, typically as Resource Handles, in the same way the textures and animations are.
Some engines have some pretty in-depth and involved asset/resource management. I don’t have anything nearly as complex as that.
Instead, I have a fairly simple system which keeps track of what resources are being used. When a resource is copied, it keeps track of the number of copies that are in “circulation”. The data is kept in the resource manager, with everything else just referring to an identifier.
When the resource manager detects that it’s the only system keeping hold of a resource, it deletes it from memory.
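A hypothetical sketch of that hand-rolled reference counting (my own assumed types, with copy assignment omitted for brevity; the engine's real interface will differ):

```cpp
// Sketch: handles copy like values; the manager counts the copies in
// "circulation" and frees the data when the last one is released.
#include <cassert>
#include <map>

using ResourceId = int;

struct Entry { int refCount = 0; };

class ResourceManager {
public:
    static ResourceId Acquire(ResourceId id) {
        ++table[id].refCount;     // first acquire would load from disk (stubbed)
        return id;
    }
    static void Release(ResourceId id) {
        if (--table[id].refCount == 0)
            table.erase(id);      // last copy gone: free the data
    }
    static bool IsLoaded(ResourceId id) { return table.count(id) != 0; }
private:
    static std::map<ResourceId, Entry> table;
};
std::map<ResourceId, Entry> ResourceManager::table;

// The rest of the engine passes these around instead of raw data.
class ResourceHandle {
public:
    explicit ResourceHandle(ResourceId id) : id(ResourceManager::Acquire(id)) {}
    ResourceHandle(const ResourceHandle& o) : id(ResourceManager::Acquire(o.id)) {}
    ~ResourceHandle() { ResourceManager::Release(id); }
    ResourceId Id() const { return id; }
private:
    ResourceId id;
};
```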
This also allows the Resource Manager to hook into input (via the Messaging System) and reload resources whenever I want. This means I can be at a specific part of the game, leave it idle while I change textures, then hit a key and see the changes instantly at that very specific part of the game.
This resource management can be extended to anything. Even maps or levels. It can be extended by the game-specific code, not just engine components.
Everything small that didn’t fit anywhere else goes in here. Vectors, Rectangles, and the math associated with them are in here. Other things like converting a wide-character string to a C-style string, or rotating a point about another point, are in here too.
Then there’s slightly bulkier things that weren’t big enough for their own component (I’ll just cover the interesting things) [Part 4 goes into more detail]:
Having a timer object that can call an arbitrary function after a set number of milliseconds is super useful in a thousand different ways. That’s what this does.
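A minimal sketch of that idea, under my own assumed interface: the game loop feeds delta time into `Update()`, and the stored callback fires once the total passes the threshold (rather than the timer sleeping on a thread).

```cpp
// Sketch of a game-loop-driven timer. Hypothetical interface.
#include <cassert>
#include <functional>

class Timer {
public:
    Timer(double milliseconds, std::function<void()> callback)
        : remaining(milliseconds), onExpire(std::move(callback)) {}

    // Call once per frame with the frame's delta time in milliseconds.
    void Update(double dtMs) {
        if (fired) return;
        remaining -= dtMs;
        if (remaining <= 0.0) {
            fired = true;
            onExpire();   // arbitrary function supplied by the caller
        }
    }

    bool HasFired() const { return fired; }

private:
    double remaining;
    bool fired = false;
    std::function<void()> onExpire;
};
```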
Really this only covers collision between rectangles, circles, lines, and points. But I haven’t come across any other needs considering I’ve only been making 2D games.
There’s no physics in this. I leave that up to the game-specific code.
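One representative check from such a library, circle vs rectangle, written as a sketch with my own assumed types rather than the engine's:

```cpp
// Sketch: clamp the circle's centre to the rectangle to find the
// closest point, then compare squared distances. Hypothetical types.
#include <algorithm>
#include <cassert>

struct Rect   { float x, y, w, h; };   // top-left corner + size
struct Circle { float cx, cy, r; };

bool Intersects(const Rect& rect, const Circle& c)
{
    // Closest point on the rectangle to the circle's centre.
    float nearestX = std::max(rect.x, std::min(c.cx, rect.x + rect.w));
    float nearestY = std::max(rect.y, std::min(c.cy, rect.y + rect.h));

    // They overlap iff that point is within one radius of the centre.
    float dx = c.cx - nearestX;
    float dy = c.cy - nearestY;
    return dx * dx + dy * dy <= c.r * c.r;
}
```

Working with squared distances avoids a square root per check, which matters when every moving object tests against many others per frame.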
This is where some of the most interesting code lives.
I have my implementation of the A* Pathfinding algorithm as well as an implementation of automatic dithering and colour quantization algorithms. Anything that’s not implemented in the C++ STL that I need to be able to do for games is in here.
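As one illustration, here is a compact grid-based A* sketch with a Manhattan heuristic. This is my own formulation for 4-connected grids, not necessarily how the engine implements it:

```cpp
// Sketch of A* over a boolean walkability grid, returning the number of
// steps in a shortest path (or -1 if unreachable). Hypothetical interface.
#include <array>
#include <cassert>
#include <climits>
#include <cstdlib>
#include <queue>
#include <vector>

int AStarSteps(const std::vector<std::vector<bool>>& grid,
               int sx, int sy, int gx, int gy)
{
    const int H = (int)grid.size(), W = (int)grid[0].size();
    auto heuristic = [&](int x, int y) { return abs(x - gx) + abs(y - gy); };

    // Node: {f-cost, g-cost, x, y}; the queue pops the lowest f first.
    using Node = std::array<int, 4>;
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
    std::vector<std::vector<int>> best(H, std::vector<int>(W, INT_MAX));

    open.push({heuristic(sx, sy), 0, sx, sy});
    best[sy][sx] = 0;

    const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
    while (!open.empty()) {
        auto [f, g, x, y] = open.top(); open.pop();
        if (x == gx && y == gy) return g;   // goal reached: g is the cost
        if (g > best[y][x]) continue;       // stale queue entry, skip
        for (int i = 0; i < 4; ++i) {
            int nx = x + dx[i], ny = y + dy[i];
            if (nx < 0 || ny < 0 || nx >= W || ny >= H || !grid[ny][nx])
                continue;
            if (g + 1 < best[ny][nx]) {     // found a cheaper route
                best[ny][nx] = g + 1;
                open.push({g + 1 + heuristic(nx, ny), g + 1, nx, ny});
            }
        }
    }
    return -1;                              // goal unreachable
}
```

A full engine version would also record parent links to reconstruct the actual path, not just its length.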
This system uses the renderer fairly heavily as well as the Messaging system. Basically it’s a full widget-based UI library. Everything I could ever need is in here. This one library has probably taken more time to code than all of the others combined.
It’s all pretty straight-forward controls. There’s nothing in it you haven’t seen before, but I coded it and it couples with the rest of my engine.
This is a large library that generates and manipulates various types of noise data. Smooth gradient noise is in here, as well as other types like static, Voronoi, etc.
It’s very useful for procedurally generating content.
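A small sketch of the idea behind smooth value noise in one dimension: hashed lattice values interpolated with a smoothstep fade. The names and constants here are my own, not the engine's; the same pattern extends to 2D for textures and terrain.

```cpp
// Sketch of 1-D value noise: deterministic, repeatable, continuous.
#include <cassert>
#include <cmath>

// Deterministic pseudo-random value in [0, 1) for an integer lattice point.
double Hash(int x)
{
    unsigned int h = (unsigned int)x * 2654435761u;  // Knuth-style hash
    h ^= h >> 16;
    return (h % 10000) / 10000.0;
}

// Smoothly interpolate between the two nearest lattice values.
double ValueNoise(double x)
{
    int    x0 = (int)std::floor(x);
    double t  = x - x0;
    double s  = t * t * (3.0 - 2.0 * t);              // smoothstep fade
    return Hash(x0) + (Hash(x0 + 1) - Hash(x0)) * s;  // lerp with faded t
}
```

Because the output depends only on the input coordinate, the same world seed always generates the same content, which is the property that makes noise so useful for procedural generation.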
Networking library using sockets. It’s implemented over UDP as a Server/Client architecture. There’s no reliability or ordering of packets yet, but that’s something I plan to add in the future.
Game Specific Code
Everything else to make the game is in here.
But this is also where new engine code starts: if a game I’m making produces a bit of generally useful code, it eventually migrates into the game engine and becomes either a new component, or gets shovelled into an existing one somewhere.
At that time, it also gets documented properly so I have my own reference material.
I hope this was helpful to someone trying to make their own Game Engine, either as a hobby developer or not. In the next part we will go over the Audio System and what I’ve learnt from implementing that.