
The Game Kitchen on 3 technical challenges making The Stone of Madness

Guest blog by The Game Kitchen | Mar 6, 2025 | 11 min read
Key art from The Stone of Madness by The Game Kitchen | Made With Unity

Earlier this year, The Game Kitchen launched The Stone of Madness, a tactical RPG where players help five inmates escape from an inquisitorial prison. In this guest post, three devs from the studio share how they tackled rendering, UI, and testing challenges during development.

We’re The Game Kitchen, and we recently released The Stone of Madness on PC and consoles. We want to share some of the most pressing challenges we faced during the development of our latest project, approaching them from a technical perspective with practical examples. In this collaborative article, our programming team breaks down key solutions we implemented in Unity to optimize both performance and development efficiency.

First, Adrián de la Torre (graphics programmer) will explain how we designed and rendered the game's art pipeline to achieve its distinctive visual style.

Next, Alberto Martín (UI programmer) will detail how we leveraged Noesis to streamline UI development, enhancing the workflow with UX improvements based on user feedback.

Finally, Raúl Martón (gameplay programmer) will showcase how we externalized and automated tests for complex in-game actions on a server, ensuring that multiple corner cases were handled without disrupting integration.

Making madness look good: A look at the custom render pipeline

Adrián de la Torre, Graphics Programmer, The Game Kitchen
The Stone of Madness combines 2D visuals with 3D gameplay mechanics, which presents a unique technical challenge. While players see a 2D world, the game's underlying systems operate in three-dimensional space, creating a distinctive duality in its design.

To address this challenge, our development team created a custom rendering pipeline that effectively bridges the gap between 3D gameplay information and 2D visual representation. This solution implements multiple rendering passes and specialized techniques to maintain visual consistency while preserving the intended gameplay depth, allowing for seamless translation of 3D elements into the game's distinctive 2D art style.

In The Stone of Madness, there are two main scenarios that contribute to the rendering of a frame.

The first scenario, which we call the Proxy Scenario, comprises geometric primitives used to calculate the lighting of the final frame.

View of the underlying 3D scene (Proxy scenario)

The second scenario is the Canvas Scenario, which consists of sprites that match the Proxy geometry’s shape and position. The Canvas is arranged in layers to simulate 3D space and achieve proper Z-sorting with moving game elements.

View of the 2D scene (Canvas scenario)


The following section details each step in our graphics pipeline for frame rendering.

Overview of the game's rendering process

1. Cone of vision

Whenever a cone of vision or game ability is enabled, it initiates the first step in the pipeline. We position a camera at the NPC’s point of view (PoV) to render the depth of proxies within its field of view (FoV).

Depth texture generated by the camera positioned at the NPC's point of view

Then, in another render texture, the camera outputs a gradient of the distance from the player’s origin in the B channel, which is used for skill area effects.

Gradient texture generated by a camera positioned at the player's point of view

Using the NPC’s PoV render texture, the cone of vision camera renders a cone over the previous texture in the R and G channels with information about obstacles and distance.

Cone of view overlaid on the gradient texture

The final pass renders sound waves in the Alpha channel.

Sound waves overlaid in the alpha channel of the gradient texture

This is the final texture created in this step, which will be used in the Canvas Camera step to render the scene’s sprites.

Final texture used to render skills' range and vision cones
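The cone pass described above is essentially a shadow-map test against the depth texture rendered from the NPC's PoV: a pixel is "seen" if it lies inside the angular cone and nothing nearer was recorded along its direction. Here's a minimal CPU-side sketch of that test in Python; the function, the `depth_lookup` sampler, and all parameters are illustrative stand-ins, not the shipping shader code.

```python
import math

def point_in_cone(npc_pos, npc_dir, fov_deg, max_range, point, depth_lookup):
    """Return True if `point` is inside the NPC's vision cone and unoccluded.

    `npc_dir` is a normalized facing vector; `depth_lookup(point)` stands in
    for sampling the NPC-PoV depth texture rendered in the first pass.
    """
    dx, dy = point[0] - npc_pos[0], point[1] - npc_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0 or dist > max_range:
        return False
    # Angle between the NPC's facing direction and the point (clamped for safety).
    cos_a = max(-1.0, min(1.0, (dx * npc_dir[0] + dy * npc_dir[1]) / dist))
    if math.degrees(math.acos(cos_a)) > fov_deg / 2:
        return False
    # Shadow-map style occlusion test: if the depth texture recorded
    # something nearer along this direction, the point is behind an obstacle.
    return dist <= depth_lookup(point) + 1e-4
```

In the actual pipeline this comparison happens per pixel in a shader, with the result written into the R and G channels of the packed render texture.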

2. Canvas Render ID Camera

Each proxy in our project has an associated Render ID (a float value). The proxy and its related sprite share the same Render ID. In this step, we render the Render ID float value into a render texture.

Render ID texture. Each color represents a unique render ID shared between a proxy and its corresponding sprite.

In the subsequent step, we use this texture to match the lighting information calculated in the proxy scenario with the sprites in the Canvas Scenario.
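Conceptually, the Render ID acts as a lookup key: lighting is computed once per proxy in the 3D scene, and each sprite pixel fetches its proxy's result by ID. This tiny Python sketch illustrates the idea; the dictionary, values, and function names are hypothetical, since the real matching happens in a shader via the Render ID texture.

```python
# Hypothetical lighting results, keyed by Render ID, as if computed
# per proxy in the 3D Proxy Scenario.
proxy_lighting = {
    1.0: 0.2,  # render ID -> lighting intensity for that proxy
    2.0: 0.9,
    3.0: 0.5,
}

def shade_sprite_pixel(render_id, sprite_albedo):
    """Tint a Canvas sprite pixel using the lighting of its matching proxy."""
    light = proxy_lighting.get(render_id, 1.0)  # fully lit fallback
    return sprite_albedo * light
```

Because the proxy and its sprite share the same ID, the 2D sprite inherits lighting that was physically computed in 3D space.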

3. Lighting

The lighting in our game consists of:

  • Baked lighting: Natural lights that remain permanently active, such as exterior lighting
  • Mixed lighting: Static lights in the scene that can be toggled on and off, such as candles
  • Real-time lighting: Light that moves throughout the scene and can be toggled on and off (we implemented this in only one instance, Alfredo’s oil lamp)

Overview of the different types of lighting in The Stone of Madness

Using the RenderID texture, we create a render texture containing the lighting information from the proxy scene.

Shadow texture generated from the Render ID texture and lighting calculations

4. Canvas Camera

After creating all render textures, a camera renders the sprites with information about lighting, skill areas of effect, cones of vision, and sound waves.

Overview of the sprite rendering process

5. Post-processing

Color grading, vignetting, and other effects are applied in a post-processing pass.

Overview of the post-processing effects being applied

6. UI

Finally, the UI is overlaid.

Overview of the UI overlay process

Madness in the HUD: Speeding up UI processes

Alberto Martín, UI Programmer, The Game Kitchen

The final release version of The Stone of Madness features over 50 user interfaces, a number driven by just how much data the game needs to surface to the player. UI work was very time consuming, especially with how small the team was at the start, so we continuously optimized our processes to achieve good results in as little time as possible.

Our UI work spanned the whole project, so it was important that our UI/UX designers clearly understood all the features we needed to implement. To ensure that our game provided a good user experience and was fun to play, we were careful to keep an open line of communication between the programming and design teams.

To create the best versions of all of our UI components, we needed to remove the silos between our technical teams and our creative/research teams so everyone was actively involved in the game’s development. Here’s how we approached this two-part workflow.

Research and creative’s role in UI design

Our UI/UX designers are responsible for defining how UI elements will look in the final game, and ensuring we deliver a satisfying user experience. With this in mind, they began by creating each element with minimal technical load and validating it with potential users. That process looked like this:

  1. Requirements: Understanding the player’s needs and creating a list of the game’s needs and user goals
  2. Investigation: Looking at other games to see how they handled similar problems
  3. Wireframes: Working on the schematics and the structure (no final art at this point)
  4. Mock-up: At this point, we mount the almost fully designed interface with previously created elements (buttons, scrolls, frames, etc.), allowing us to iterate without much effort
  5. Prototype: We build a prototype on Figma using our mock-up, simulating interactions with gamepads and keyboard/mouse to show how it will work in a real environment.
  6. User test: Using our previously created prototype, we initiate a user test, validating the needs and goals we identified in Step 1.
  7. Iteration phase: If the user test meets expectations, the design is handed off to the technical team; if not, we iterate further or run additional tests as needed.

Design prototype showing Inventory tab navigation in The Stone of Madness

Technical UI implementation

As mentioned previously, the number of UI elements in The Stone of Madness is huge. Developing a UI engine from scratch is expensive, so we needed a framework that was easy to learn, with decent tools and workflows. After evaluating a range of middleware, we chose Noesis GUI, which follows the Model-View-ViewModel (MVVM) pattern.

We chose Noesis because it’s based on WPF (Windows Presentation Foundation) and follows the MVVM model closely enough that we can reuse most existing documentation, books, and forum entries to troubleshoot the majority of issues. The underlying framework has been around for a while (it’s now 18 years since its first release) and is familiar to a large number of UI devs, which gives our studio a comparatively larger talent pool to hire from when implementing interfaces and tools for our projects. Another important point about Noesis is that we can use the same tools as WPF.

With XAML, our UI creative team was involved in layout work and polishing all the elements with minimal technical involvement. Thanks to the MVVM approach, our technical UI programmers could focus on functionality and provide support to the creative teams in certain areas when necessary.
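The division of labor described above comes from MVVM's change-notification contract: the view binds to ViewModel properties and reacts when they change, so layout work never touches gameplay code. Noesis and WPF implement this in C# and XAML (via INotifyPropertyChanged); the following Python stand-in only illustrates the separation, and every name in it is invented for the example.

```python
class ViewModel:
    """Minimal MVVM-style ViewModel: holds state, notifies bound views."""

    def __init__(self):
        self._listeners = []
        self._gold = 0

    def on_property_changed(self, callback):
        # A "view" subscribes here instead of polling game state directly.
        self._listeners.append(callback)

    @property
    def gold(self):
        return self._gold

    @gold.setter
    def gold(self, value):
        self._gold = value
        for cb in self._listeners:
            cb("gold")  # analogous to WPF's INotifyPropertyChanged

# A bound "view" records updates as they arrive.
vm = ViewModel()
updates = []
vm.on_property_changed(lambda name: updates.append((name, vm.gold)))
vm.gold = 50  # gameplay code mutates the ViewModel; the view reacts
```

Because the view only knows about properties and notifications, designers can restyle or rebuild the XAML layer without the programmers changing a line of logic.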

Testing (or, how not to go mad creating a game with a systemic design)

Raúl Martón, Gameplay Programmer, Teku Studios

Gameplay in The Stone of Madness is based on three fundamental pillars: Player skills, NPC AI, and scene interactions. Each of these three systems is deeply intertwined with the others, which exponentially increases the number of situations the player needs to control – and the number of scenarios we need to test.

As soon as we started the project, we realized that a traditional QA system was going to be insufficient. There were simply too many scenarios that depended on several systems interacting with each other in a particular way, which is hard to reproduce by hand. Moreover, these situations could occur in a window of time that’s just too small for a QA team to test comfortably.

To solve these problems, we created a suite of automatic tests. The idea was that every scenario our development team could think of for a particular system could be accounted for and tested automatically, far more efficiently, in a simulated game environment.

To provide an example, one of The Stone of Madness’s lead characters, Amelia Exposito, has a pickpocket ability. While implementing this skill, we initiated a series of tests to ensure:

  • The basic functioning of the skill is correct: When stealing from an NPC, the pickpocketing mini-game opens and the game pauses until it’s over.
  • Less common situations are also covered: If you try to steal from an NPC while another NPC (like a guard) is watching you, or if the target NPC is running, the action is impossible.
Automated playtesting of the Pickpocket skill, verifying that it behaves as expected across different in-game scenarios.
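The two bullets above boil down to a small rule that a test can pin down exhaustively. Here's a self-contained sketch of such a test in Python; the function and its parameters are invented for illustration, not the studio's real API, which drives the full game simulation instead of a pure predicate.

```python
def can_pickpocket(target_running, watched_by_other_npc):
    """Pickpocketing is only allowed on a stationary, unobserved target."""
    return not target_running and not watched_by_other_npc

def test_pickpocket_rules():
    # Basic case: stationary target, nobody watching -> the mini-game opens.
    assert can_pickpocket(target_running=False, watched_by_other_npc=False)
    # Corner cases from the article: watched or running -> impossible.
    assert not can_pickpocket(target_running=False, watched_by_other_npc=True)
    assert not can_pickpocket(target_running=True, watched_by_other_npc=False)

test_pickpocket_rules()
```

The value of automating this is less in any single assertion than in re-running all of them on every change, so a tweak to guard AI can't silently break the steal rules.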

Creating an integration test

Each integration test we created required setup based on the following requirements:

1. A scene specially prepared to create this particular situation

To test the pickpocket skill, we created a scene with two guards and one player. We positioned each character so they’re facing in the direction needed for the situation to be tested accurately (remember, the player can’t use pickpocket if they’re within the FoV of a guard).

Additionally, the scene should only include the minimum components necessary to test the scenario, as extraneous elements can add noise to the measurement. This is why our example scene has no HUD, manual input system, sound effects, and so on.

  • This step requires that the game structure is well compartmentalized, which can take some effort, but, once achieved, is well worth it! 😉

2. A test code capable of forcing the situation to be tested

Many of the situations we needed to test would be difficult and time consuming to create manually, so they need to be forced through code.

For example, if we want to create a test scenario to ensure our NPCs never step on mousetraps unless the NPC is moving, the chain of instructions would be:

  1. Launch the scene
  2. Wait one second
  3. Spawn a mousetrap under the NPC
  4. Wait another second
  5. Command the NPC to start walking in any direction
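The scripted steps above can be sketched as a tiny self-contained test. The miniature `Npc` class below is illustrative only (the real project drives its own engine components), but it captures the rule under test: a trap spawned under a stationary NPC must not trigger until the NPC moves.

```python
class Npc:
    """Toy NPC for illustrating the mousetrap rule; not the real game class."""

    def __init__(self):
        self.moving = False
        self.triggered_trap = False

    def step_check(self, trap_under_npc):
        # Rule under test: a trap only triggers while the NPC is moving.
        if trap_under_npc and self.moving:
            self.triggered_trap = True

npc = Npc()
npc.step_check(trap_under_npc=True)   # trap spawned under a standing NPC
assert npc.triggered_trap is False    # must NOT trigger while stationary
npc.moving = True                     # command the NPC to start walking
npc.step_check(trap_under_npc=True)
assert npc.triggered_trap is True     # now it triggers as expected
```

When a test like this fails, the assertion message should point straight at which rule broke, which is exactly the "clear feedback" requirement mentioned above.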

This part of the project is very sensitive to any changes during development (dependent on factors like changing game specs and various unexpected scenarios), so it’s critical that both the test code and resulting feedback are as clear as possible.

There’s nothing worse than a test that fails without giving any clear information about what’s actually going wrong.

3. A reliable way of knowing whether the scenario is working as intended, or whether the test has detected an error in logic

Automated testing still requires oversight. As tests grow in number and specificity, they become harder to monitor, and scenarios may not run long enough to be statistically significant. To get around these problems, we created custom tools.

For example, some of our tests involved combined interactions between several NPCs in a scene. To monitor these cases properly, we created a system to log the different AI states that NPCs cycle through during the test.

A guard NPC fails to follow the expected pursuit sequence during an automated test.

We also needed a good API that would give us visibility into the current game state (has an NPC been knocked unconscious? Has an NPC entered a routed state? How many times? Which player character has been captured? And so on).

4. A system to launch all these tests quickly

Unlike unit tests, automated tests must be conducted with the game running in real-time. This can make running these tests very slow.

In these circumstances, we can take advantage of the fact that our game does not use Unity’s standard update system. Instead, all of our components use a Tick() function, which simulates Unity updates but is invoked in a controlled way by our own game loop.

This helped us achieve a couple of different goals with our tests:

  • First, we could speed up their execution with a forcing function that runs several frames of code for every frame of the game.
  • Second, because these tests run in real-time, they’re very susceptible to variations caused by the frame rate of the computer running them. By converting them to a controlled frame rate, we avoid this variance: If a test passes on one machine, it will pass on all machines, and vice versa.
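Both goals above follow from decoupling simulation ticks from real frames: a fixed timestep makes results machine-independent, and running several ticks per real frame speeds tests up. A minimal sketch of that loop, with all names and numbers illustrative rather than taken from the actual engine:

```python
FIXED_DT = 1 / 60  # deterministic timestep, independent of the host machine

class Component:
    """Stand-in for a game component driven by the custom Tick() system."""

    def __init__(self):
        self.elapsed = 0.0

    def tick(self, dt):
        self.elapsed += dt  # real components would run gameplay logic here

def run_test(components, game_frames, speedup):
    """Advance the simulation `speedup` ticks for every real frame."""
    for _ in range(game_frames):
        for _ in range(speedup):  # several sim frames per rendered frame
            for c in components:
                c.tick(FIXED_DT)

c = Component()
run_test([c], game_frames=60, speedup=4)  # one real second, four sim seconds
assert abs(c.elapsed - 4.0) < 1e-6
```

Because every tick advances exactly FIXED_DT, a test's outcome depends only on the tick sequence, never on how fast the host machine renders.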

Here’s the result:

Speeding up automated tests by controlling game updates: The Game Kitchen's custom Tick() system allows tests to run multiple frames per game frame, ensuring faster execution and consistent results across different machines.

How test gating helps us avoid broken builds

With the creation of this test suite, we also needed a safeguard that would automatically interrupt the merge of a branch if it contained bugs. To that end, we created an automatic merge script that launches every time a change is committed to the main project branch.

This script launches all of the tests and monitors their results. If any test fails, it reports the error and interrupts the merge.
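A merge gate like this typically boils down to "run the suite, read the exit code, block on nonzero." The sketch below shows that shape in Python; the command is a harmless stand-in (the real script would launch the Unity test scenes headlessly), and all names are invented for the example.

```python
import subprocess

def run_test_suite():
    """Run the automated tests headlessly; a nonzero code means failure.

    The `echo` command is a stand-in for the real test-runner invocation.
    """
    result = subprocess.run(["echo", "running automated test scenes"])
    return result.returncode

def gate_merge():
    """Return 0 to allow the merge, 1 to interrupt it."""
    if run_test_suite() != 0:
        print("A test failed - merge interrupted.")
        return 1
    print("All tests passed - merge allowed.")
    return 0
```

Wired into the merge automation, the script's exit code is what actually blocks the branch, so a red test can never quietly land on main.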

With this system, we can avoid situations where a change in an apparently isolated system breaks other mechanics it interacts with.

Thank you to The Game Kitchen for sharing this behind-the-scenes look at The Stone of Madness's development. Explore more Made With Unity games on our Steam Curator page and get more developer insights on Unity’s Resources page.