Testing video games
Posted: Sat Mar 14, 2026 10:04 am
Everyone tests their video games.
The simplest, most obvious form of testing is just turning the game on and playing a round.
This already ensures a couple things:
- the window system and graphics are initialized correctly (/on my machine/)
- controls work for my setup
- the fundamental gameplay loop works as I expect
- the game does not crash instantly because of something stupid
- ...
A big advantage here is that you notice issues this way that you could not have predicted.
You can get greater coverage of the state space just by playing more and adding more players (play testers).
But maybe there are some specific things you want to focus on.
A "gym" or a debug console for setting up a specific environment helps here.
As does quicksave/quickload.
And a list of requirements. A checklist if you will.
Fundamentally a test is nothing but a scenario with a desired outcome.
For really gnarly cases I'd recommend actually having a pre-release checklist that you go through, like a pilot before take-off.
I started doing that at work after a botched release :(
Doing all of this manually is a lot of work though, especially once a game grows.
It'd be nice if the computer could check things like "if I hit this small enemy at this specific angle, it does not fly through a wall".
Hell, I'd like to cover all angles while I am at it. And all enemies.
Replace the above with whatever problem is common in your game; there is always something.
For things like this some automatic testing is in order.
Automatic testing also follows the usual recipe for a test: set up a scenario, then check the expected outcomes.
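To make the recipe concrete, here is a toy sketch of the wall example from above. `World`, `spawn_enemy`, and `hit` are hypothetical stand-ins, and the "physics" is a one-line clamp rather than a real engine; the point is only the shape: set up, act, assert. It also sweeps all (whole-degree) angles, as wished for earlier.

```python
import math

class World:
    """Toy world: a wall at x = 10; nothing should end up past it."""
    WALL_X = 10.0

    def __init__(self):
        self.enemies = []

    def spawn_enemy(self, x, y):
        enemy = {"x": x, "y": y}
        self.enemies.append(enemy)
        return enemy

    def hit(self, enemy, angle, force=5.0):
        # Knock the enemy back along `angle`, clamping at the wall.
        enemy["x"] = min(enemy["x"] + math.cos(angle) * force, self.WALL_X)
        enemy["y"] += math.sin(angle) * force

def test_enemy_never_flies_through_wall():
    # Sweep every whole-degree angle with one enemy near the wall.
    for degrees in range(360):
        # Set up the scenario...
        world = World()
        enemy = world.spawn_enemy(x=9.0, y=0.0)
        # ...perform the action under test...
        world.hit(enemy, angle=math.radians(degrees))
        # ...and check the expected outcome.
        assert enemy["x"] <= World.WALL_X, degrees

test_enemy_never_flies_through_wall()
```

A real version would of course drive the actual collision code instead of this clamp, but the set-up/act/assert shape stays the same.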
In a way it feels like nailing down the behavior of the game.
Ossification of APIs is a real concern.
Therefore I like to start writing tests at the outermost layer I can inject state into and read results out of.
I literally spawn the whole game state, set it up for a scenario and then let it progress a bit before doing asserts.
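An outer-layer test of that shape might look like the sketch below. `Game`, `queue_input`, and `tick` are hypothetical names; the only thing this is meant to show is the structure: construct the whole game state (deterministically seeded), script a scenario, let it run for a few ticks, then assert.

```python
class Game:
    """Stand-in for the whole game state."""

    def __init__(self, seed=0):
        self.seed = seed          # deterministic runs make asserts reliable
        self.tick_count = 0
        self.player = {"x": 0.0, "hp": 100}
        self.inputs = []

    def queue_input(self, event):
        self.inputs.append(event)

    def tick(self, dt=1 / 60):
        # Consume at most one queued input per tick.
        if self.inputs:
            event = self.inputs.pop(0)
            if event == "move_right":
                self.player["x"] += 1.0
        self.tick_count += 1

def test_player_moves_right():
    game = Game(seed=42)          # spawn the whole game state
    game.queue_input("move_right")
    for _ in range(10):           # let the simulation progress a bit
        game.tick()
    assert game.player["x"] > 0.0

test_player_moves_right()
```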
In the legacy simulation I maintain, the only changes I had to make to support this were putting a proxy in front of the system clock and adding one function for injecting input data.
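Those two seams might be sketched like this (all names hypothetical): a clock proxy that tests can set explicitly, and a single `inject_input` function. Everything else in the simulation stays untouched.

```python
import time

class Clock:
    """Proxy in front of the system clock. Game code asks this for the
    time instead of calling time.monotonic() directly; tests can pin it."""

    def __init__(self):
        self._fake_time = None

    def now(self):
        if self._fake_time is not None:
            return self._fake_time
        return time.monotonic()

    def set_time(self, t):
        # Tests drive time explicitly; production code never calls this.
        self._fake_time = t

class Simulation:
    def __init__(self, clock):
        self.clock = clock
        self.pending_inputs = []
        self.last_update = clock.now()

    def inject_input(self, event):
        # The one extra function added for tests.
        self.pending_inputs.append(event)

    def update(self):
        dt = self.clock.now() - self.last_update
        self.last_update = self.clock.now()
        # ...consume pending_inputs and advance the simulation by dt...
        return dt

clock = Clock()
clock.set_time(0.0)
sim = Simulation(clock)
clock.set_time(1.5)
assert sim.update() == 1.5   # the test fully controls elapsed time
```

With the clock pinned, runs become reproducible, which is what makes asserting on simulation state practical at all.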
It is always a judgment call on what method of testing leads to the best results with the least amount of effort.
I want to mention two areas where automatic testing especially shines: regressions and large-scale refactors.
Regressions sneak in time and time again. If I notice a pattern there I build a test.
Refactors change the internals of the game, but are supposed to keep the behavior the same.
Pinning the behavior via tests is great for this.
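One way to pin behavior for a refactor is a golden test: run a fixed scripted session, fingerprint the resulting state, and assert the refactored code produces the same fingerprint. The sketch below uses made-up `simulate_v1`/`simulate_v2` step functions; in practice you would drive the real simulation.

```python
import hashlib
import json

def run_scripted_session(simulate_step):
    """Run a deterministic scripted session and fingerprint the result."""
    state = {"x": 0, "score": 0}
    for step in range(100):
        simulate_step(state, step)
    blob = json.dumps(state, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def simulate_v1(state, step):
    state["x"] += 1
    if step % 10 == 0:
        state["score"] += 5

# Pin the pre-refactor behavior once...
golden = run_scripted_session(simulate_v1)

def simulate_v2(state, step):
    # Refactored internals (reordered), same observable behavior.
    if step % 10 == 0:
        state["score"] += 5
    state["x"] += 1

# ...and check the refactor against it.
assert run_scripted_session(simulate_v2) == golden
```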
How do you test your games? Have you tried automated testing (for games or simulations) before? Did it go well?