
Learn how PikPok’s programmers and lighting artists built, optimized, and implemented a dynamic, real-time lighting setup for Into the Dead: Our Darkest Days using the High Definition Render Pipeline (HDRP) in Unity 6.
How do you create realistic lighting for a horror game? PikPok’s Into the Dead: Our Darkest Days is a 2.5D survival shelter game set during a zombie apocalypse. Players control groups of survivors attempting to escape the city, exploring abandoned buildings to gather supplies and fortify their temporary bases against nightly undead attacks.
The game’s art direction emphasizes performance-heavy real-time lighting and atmospherics to convey the instability of a city in decay. Each level consists of a detailed 3D environment navigated on a 2D plane, and this 2.5D perspective created technical challenges for PikPok’s lighting and environment artists to solve. The sheer number of dynamic lighting states in each level – from day/night transitions, to fluctuations in the in-game power grid – often meant rendering dozens of real-time lights simultaneously, sometimes with multiple lighting scenarios visible at once.
Learn how PikPok approached these lighting challenges while using HDRP for the first time, and perfected a vintage look for their latest horror game.

PikPok made several calculated game design decisions to reposition Into the Dead for a more “hardcore” audience using more powerful PC hardware. “This third game is a very ambitious introduction to [Into the Dead],” says Karah Sutton, the studio’s chief publishing officer. “It’s much more management-focused, and introduces a lot more lore, more characters, and more depth to the gameplay.”
Previous Into the Dead titles were nebulously set in 1980s Texas, but Our Darkest Days makes it official, with events occurring in August 1980 in the fictional city of Walton, Texas. The art direction aimed to bring this setting to life by evoking an authentic ‘80s film look without relying on post-processing techniques like color grading.
“We wanted the colors to come naturally through the camera, like they did in films back then, versus nowadays where you'd push blues into the shadows or yellows into the highlights in post,” explains E’van Johnston, principal artist at PikPok. “The idea was to use lighting to solidify a point in time, and everything kind of evolved from there.”

Light and shadow have a critical role to play in a horror game where everything is technically on the screen at once. “In a 3D game, what you’re seeing is controlled by the direction you’re facing. With 2.5D, you’re only seeing what we’ve decided you can see,” says Harriet Prebble, creative content strategist at PikPok. “When a zombie’s hiding in the shadows and then revealed in a flicker of light – that single lighting change introduces fear to every unseen area.”
Into the Dead: Our Darkest Days's in-game power system allows lights to flicker realistically in the hospital level
Bringing this ambitious reenvisioning of Into the Dead to PC players required a graphics renderer capable of generating high-quality, physically-based atmospheric lighting. During preproduction, PikPok used the Universal Render Pipeline (URP) in tandem with third-party tools to imitate volumetrics, but the approach became increasingly unsustainable as development progressed.
The team decided to investigate HDRP, which offered volumetric lighting and fog solutions out of the box. And because Into the Dead: Our Darkest Days is being made in Unity 6, artists could also take advantage of new HDRP features like IES light profiles and screen-space global illumination (SSGI) – tools that senior programmer Sam Win-Mason says were “essential” for executing on the look of the game.
“The most impactful decision we made about the lighting in Into the Dead: Our Darkest Days was to make the entire pipeline real-time,” he says. “From a process standpoint, our locations can be visited at different times of the day, and a workflow that required statically baking each level in various lighting conditions would have slowed iteration times with the large number of levels we needed to make.”

Each level in Into the Dead: Our Darkest Days consists of an exterior environment and at least one multi-room building. During missions (or “scavenges” in-game), players scour abandoned houses, churches, record stores, police stations, arcades, and more for supplies, exploring each environment from a side-on view. Transitions between indoors and outdoors are common, with players leaping from windows or scrambling up fire escapes to safety. And in situations where they’re plunged into darkness, players rely on a (real-time) flashlight to illuminate their surroundings.

“Normally in games, you’re only having to light the characters’ immediate environment to a certain degree and you can phase off some of the background, or occlude stuff,” says Sam. “In our case, we’ve got many, many rooms on screen at once, all lit at a very high fidelity, and many, many lights in real-time for things like dynamic shadows, flickering attenuation, and other techniques we’re using to create the horror. This perspective was obviously very challenging for lighting performance.”

Because Into the Dead: Our Darkest Days’s lighting pipeline requirements are so demanding, everything in the game is optimized around them. Before levels are even added to a build, they go through a rigorous series of optimizations to make them as performant as possible. To aid the art team in this task, PikPok’s tech team created a collection of custom Editor tools for optimizing levels and real-time 3D lights.
The level optimization system includes a tool to help identify situations that can lead to shadow atlas overflow. It simulates a game-camera view of a scene while it’s being constructed in the Editor and tags any objects below a certain screen-space size. Anything that doesn’t contribute meaningfully to the lighting in the level can have its shadow disabled with a click, saving valuable shadow atlas space at runtime.
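A stripped-down version of that idea can be expressed with Unity’s standard rendering APIs. The sketch below is not PikPok’s actual tool – the class name, threshold, and camera reference are illustrative – but it shows the general approach: estimate each renderer’s on-screen size from a chosen camera and switch off shadow casting for anything below a cutoff.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Minimal sketch, not PikPok's actual tool: estimate each renderer's on-screen
// height from a chosen camera and switch off shadow casting for anything below
// a threshold. The class name, threshold, and camera reference are illustrative.
public class ShadowCasterAudit : MonoBehaviour
{
    [SerializeField] Camera gameCamera;                        // camera matching the in-game view
    [SerializeField, Range(0f, 0.2f)] float minViewportHeight = 0.02f;

    [ContextMenu("Disable Shadows On Small Casters")]
    void DisableShadowsOnSmallCasters()
    {
        foreach (var meshRenderer in FindObjectsByType<MeshRenderer>(FindObjectsSortMode.None))
        {
            if (meshRenderer.shadowCastingMode == ShadowCastingMode.Off)
                continue;

            // Approximate the object's viewport height from its bounds' vertical
            // extent at its distance along the camera's forward axis.
            Bounds bounds = meshRenderer.bounds;
            float distance = Vector3.Dot(bounds.center - gameCamera.transform.position,
                                         gameCamera.transform.forward);
            if (distance <= 0f)
                continue; // behind the camera

            float frustumHeight = 2f * distance *
                Mathf.Tan(gameCamera.fieldOfView * 0.5f * Mathf.Deg2Rad);
            float viewportHeight = bounds.size.y / frustumHeight;

            // Too small on screen to matter: stop it from filling the shadow atlas.
            if (viewportHeight < minViewportHeight)
                meshRenderer.shadowCastingMode = ShadowCastingMode.Off;
        }
    }
}
```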

Another custom tool, the light optimizer system, also helps ensure performance bottlenecks don't cast a shadow on the player experience. “The most expensive thing with real-time lighting is definitely drawing shadow casters,” says Sam. “The core of our runtime systems is our light optimizer system. When a level is created, artists assign a priority to the lights in a scene and that works with our comprehensive light LOD system to dynamically downscale the shadow map resolution of lights further away from the player character or playspace.”
Thanks to the light optimizer system, lights can be set to only update their shadows when they or their light cone of influence are visible on screen. Shadow map rendering is set to occur on a round-robin schedule, minimizing CPU and GPU load. “None of these systems would have been possible for us without HDRP's granular control over light shadow map caching and rendering,” Sam notes.
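The implementation is PikPok’s own, but HDRP’s public per-light shadow controls are enough to sketch the general shape of such a system. The example below is a simplified illustration, not the studio’s code: it puts lights into on-demand shadow mode, skips lights whose range isn’t on screen, picks a shadow resolution based on distance to the player, and refreshes only a few shadow maps per frame in round-robin order. The distance tiers are made up, and `SetShadowResolution` is assumed to behave like the per-light resolution override in the HDRP light inspector.

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

// Minimal sketch, not PikPok's system: on-demand shadow updates refreshed in
// round-robin order, skipped when a light's range is off screen, with a crude
// distance-based shadow resolution LOD. Distance tiers are illustrative, and
// SetShadowResolution is assumed to match the per-light resolution override
// exposed in the HDRP light inspector.
public class SimpleLightShadowLOD : MonoBehaviour
{
    [SerializeField] Camera gameCamera;
    [SerializeField] Transform player;
    [SerializeField] Light[] managedLights;
    [SerializeField] int lightsRefreshedPerFrame = 2;

    int nextIndex;

    void Start()
    {
        // Take full control of when each light's shadow map is re-rendered.
        foreach (var managed in managedLights)
            managed.GetComponent<HDAdditionalLightData>().shadowUpdateMode = ShadowUpdateMode.OnDemand;
    }

    void Update()
    {
        if (managedLights.Length == 0)
            return;

        Plane[] frustum = GeometryUtility.CalculateFrustumPlanes(gameCamera);

        // Round-robin: only a handful of shadow maps are refreshed each frame.
        for (int i = 0; i < lightsRefreshedPerFrame; i++)
        {
            Light managed = managedLights[nextIndex];
            nextIndex = (nextIndex + 1) % managedLights.Length;

            // Skip lights whose sphere of influence isn't visible at all.
            var influence = new Bounds(managed.transform.position, Vector3.one * managed.range * 2f);
            if (!GeometryUtility.TestPlanesAABB(frustum, influence))
                continue;

            var hdLight = managed.GetComponent<HDAdditionalLightData>();

            // Crude shadow LOD: lights further from the player get smaller shadow maps.
            float distance = Vector3.Distance(player.position, managed.transform.position);
            hdLight.SetShadowResolution(distance < 10f ? 1024 : distance < 25f ? 512 : 256); // assumed HDRP helper

            hdLight.RequestShadowMapRendering(); // re-render this light's shadow map now
        }
    }
}
```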
The power system at work in the hardware store level
PikPok’s artists use a multiscene setup to work on environments for Into the Dead: Our Darkest Days. Each scene has a corresponding day and night version containing high-level prefabs defining its respective lighting conditions, with the idea that more conditions can be added as development progresses.
“These prefabs contain a directional light representing the Sun or Moon, and also carry default lighting profiles to define things like our physically based sky, cloud layers, exposure settings, local fog volumes, as well as any atmospheric effects we want to attribute to a particular lighting condition, like dust blowing across the ground," says E’van.
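One simple way to structure content along those lines – shown purely as an illustration, with hypothetical scene names rather than PikPok’s actual project layout – is to keep each lighting condition in its own additively loaded scene and swap it at runtime, leaving the level geometry scene untouched.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Illustrative sketch of a multiscene lighting setup: each lighting condition
// lives in its own additively loaded scene (e.g. "Hospital_Day", "Hospital_Night")
// containing the directional light and lighting-profile prefabs, so it can be
// swapped without touching the level geometry. Names are hypothetical.
public class LightingConditionLoader : MonoBehaviour
{
    public enum Condition { Day, Night }

    [SerializeField] string levelName = "Hospital";
    string loadedLightingScene;

    public void SetCondition(Condition condition)
    {
        // Unload the previous lighting scene, if any, then load the new one on top.
        if (!string.IsNullOrEmpty(loadedLightingScene))
            SceneManager.UnloadSceneAsync(loadedLightingScene);

        loadedLightingScene = $"{levelName}_{condition}";
        SceneManager.LoadSceneAsync(loadedLightingScene, LoadSceneMode.Additive);
    }
}
```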

Developing Into the Dead: Our Darkest Days in Unity 6 meant the lighting team had access to IES light profiles to help situate the game in the 1980s. IES (Illuminating Engineering Society) light profiles are industry-standard data files that define the precise shape and intensity distribution of light emitted by real-world light sources like lamps, bulbs or fixtures. Many manufacturers provide their own data profiles so lighting artists can accurately replicate the ambience of their products. “They’re basically light cookies on steroids,” says Sam.
“Most of our lights in-game are using IES profiles to create that attenuation effect on the light projected into the scene,” says E’van. “It helps with the physicality and the sense that these are real lights, as opposed to a CG light that doesn't have any detail or nuance in the projected light beam. They affect things like volumetric fog as well.”
“When you have that kind of '80s style lamp in the living room or a type of street light in a sparse environment, you can identify the setting based on the lighting choices – from the quality or type of bulb, to the color of the light,” adds Harriet. “We can almost sell the beauty of Our Darkest Days’s vintage setting by lighting alone.”

Currently, reflection probes are the only form of baked lighting in Into the Dead: Our Darkest Days, and even those have a real-time component. “Our baked reflection probes contain a dynamic component that drives the probe’s multipliers in response to the high-level power system,” says E’van. “If lights go out or fluctuate for some reason, the reflections in the scene need to respond to that. In these situations, we drive the multiplier on the reflection probe down to 0.01, which gives the effect of the reflective light in that scene modulating with fluctuations in Walton’s power grid.”
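A hedged sketch of that behavior is below. It assumes HDRP’s `HDAdditionalReflectionData` exposes its inspector Multiplier as a `multiplier` property, and the fade speed and power-state hookup are illustrative rather than PikPok’s actual power-grid code.

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

// Minimal sketch (assumed names): fade a baked reflection probe's contribution
// down when the in-game power grid drops out, and back up when it recovers.
// The `multiplier` property on HDAdditionalReflectionData is assumed to map to
// the probe's Multiplier field in the inspector; the 0.01 floor follows the
// behavior described in the article.
public class PowerDrivenReflectionProbe : MonoBehaviour
{
    [SerializeField] HDAdditionalReflectionData probeData; // HDRP reflection probe data
    [SerializeField] float poweredMultiplier = 1f;
    [SerializeField] float unpoweredMultiplier = 0.01f;
    [SerializeField] float fadeSpeed = 4f;

    bool hasPower = true;
    float currentMultiplier = 1f;

    // Called by a (hypothetical) power-grid system whenever the state changes.
    public void SetPowerState(bool powered) => hasPower = powered;

    void Update()
    {
        // Ease the probe's contribution toward the target so reflections
        // modulate with the power fluctuation instead of snapping.
        float target = hasPower ? poweredMultiplier : unpoweredMultiplier;
        currentMultiplier = Mathf.MoveTowards(currentMultiplier, target, fadeSpeed * Time.deltaTime);
        probeData.multiplier = currentMultiplier; // assumed HDRP property
    }
}
```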

In addition to navigating the performance hurdles of lighting up Into the Dead: Our Darkest Days in real-time, PikPok’s art team had to tackle the creative challenges of the game’s 2.5D perspective.
Volumetric fog is almost a given for a high-definition horror game, but a cutaway perspective makes its creeping tendrils difficult to contain. “Our cross-section dollhouse view means the fog, in terms of global fog, can’t come forward through that cross-section towards the camera,” explains E’van. “We use local fog volumes to limit fog to that cross-section and forward into the scene. It can be fairly bespoke because we also don’t want fog leaking into interior spaces.”
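As a rough illustration of the idea – assuming HDRP’s `LocalVolumetricFog` component and its `parameters.size` field, with names, axes, and values that are assumptions rather than PikPok’s setup – a small helper could keep a fog volume’s front face pinned to the cutaway plane so the fog only extends away from the camera.

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

// Rough sketch (assumed member names): keep a Local Volumetric Fog volume
// behind the cutaway plane so fog never spills through the cross-section
// toward the camera. Assumes the camera sits on the -Z side looking toward +Z.
[ExecuteAlways]
public class CrossSectionFogVolume : MonoBehaviour
{
    [SerializeField] LocalVolumetricFog fogVolume;
    [SerializeField] float crossSectionZ = 0f;   // world-space Z of the cutaway plane
    [SerializeField] float fogDepth = 10f;       // how far the fog extends behind the plane

    void Update()
    {
        // Size the volume so it only spans from the cutaway plane backward.
        var size = fogVolume.parameters.size;    // assumed public field on LocalVolumetricFog
        size.z = fogDepth;
        fogVolume.parameters.size = size;

        // Center the volume so its front face lines up with the cutaway plane.
        Vector3 position = fogVolume.transform.position;
        position.z = crossSectionZ + fogDepth * 0.5f;
        fogVolume.transform.position = position;
    }
}
```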

Another major challenge was establishing a single exposure setting that could serve both dark, moody interiors and bright, hot exterior environments. “With a normal camera, in a dimly lit room the exposure would change to lighten up the scene, and outside, the exposure’s going to change to keep the values reasonable,” says E’van. “We had to come up with a way to have both spaces seemingly exposed correctly while in the same frame.”
After some trial and error, the lighting artists landed on an approach where exposure is set manually. “The sun has physically accurate intensity, while all interior lights have much higher intensity values, breaking away from physical accuracy,” explains E’van, who noted that this small compromise was necessary for providing a better gameplay experience. “We’ve got zombies outside, we’ve got zombies inside, and we need the player to be able to see these threats and make decisions without the camera or the exposure making it difficult for them.”
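In HDRP terms, a manual exposure like this corresponds to a Fixed exposure override on a Volume. The snippet below is a minimal sketch of setting it from code, assuming the `Exposure` volume component’s `mode` and `fixedExposure` parameters; the EV value is illustrative, not the game’s actual setting.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

// Minimal sketch (assumed parameter names): lock HDRP's exposure to a fixed
// value on a global Volume instead of letting automatic exposure swing
// between dark interiors and bright exteriors.
public class FixedExposureSetup : MonoBehaviour
{
    [SerializeField] Volume globalVolume;
    [SerializeField] float exposureEV = 10f; // hand-tuned per lighting condition (illustrative)

    void Start()
    {
        if (globalVolume.profile.TryGet(out Exposure exposure))
        {
            exposure.mode.overrideState = true;
            exposure.mode.value = ExposureMode.Fixed;   // assumed enum value
            exposure.fixedExposure.overrideState = true;
            exposure.fixedExposure.value = exposureEV;  // assumed parameter name (EV100)
        }
    }
}
```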

Into the Dead: Our Darkest Days is a significant evolution for PikPok’s celebrated franchise, and a massive win for the studio’s art team. They successfully built a real-time lighting pipeline for an ambitious new take on high-fidelity horror, while using HDRP for the first time, without compromising on what they wanted to achieve. “We found HDRP’s unified lighting solutions to be much easier to work with as artists, and we were able to get it up and running quickly,” says E’van. “It really gave us the confidence that we could deliver on the original vision for Into the Dead: Our Darkest Days’s art direction.”
PikPok is currently evaluating Unity 6.1 to take advantage of variable rate shading to further optimize GPU performance. And for players, the darkest days have just begun. Into the Dead: Our Darkest Days’s first major early access update introduces many new features, including a new gameplay mechanic: curveballs – environmental and world events that can positively or negatively affect players' shelters over time.
Sam teases a possible lighting-related curveball that could appear in a future update. “We had this idea for area-wide blackouts. Obviously we need dynamic lighting to simulate all this and make it look good. That’s why we created that system that modulates all the lights and reflection probes. It’s pretty involved, but if we can make it work, it could be really cool.”
Power flickering in the abandoned arcade level
Deliver visually stunning, unique games to more players across more devices with Unity 6's enhanced graphics and lighting performance. Try it today or find new audiences on more than 20 platforms with Unity Pro.