Extreme character detail
As with all character-driven stories, the devil’s in the detail. That’s why we created each character with a whopping 40,000 vertices. This is an extraordinarily high polycount for characters, even by current-gen console and PC game standards. Each character’s face and body have multiple 2048x2048 textures for skin color, reflectivity, normal maps, etc.
Did you notice the characters’ skin? We achieved exceptional realism by implementing sub-surface scattering (SSS) skin shaders, which blur incoming lighting in screen space, a process also known as screen-space diffusion. We took a lot of what we learned in The Butterfly Effect and optimized it for mobile GPUs.
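The core idea of screen-space diffusion can be sketched as a per-channel blur of the lit skin pixels, with red light scattering farthest through skin and blue the least. The sketch below works on a single row of pixels for brevity (a real implementation blurs a 2D screen-space buffer on the GPU); the blur widths are illustrative, not the demo’s actual values.

```python
import math

def gaussian_kernel(sigma):
    """Normalized 1D Gaussian weights out to 3 sigma."""
    radius = int(3 * sigma)
    k = [math.exp(-0.5 * (x / sigma) ** 2) for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def diffuse_skin_row(irradiance_row, sigmas=(4.0, 2.0, 1.0)):
    """Blur each color channel of a row of lit skin pixels with a
    different Gaussian width: red scatters farthest, blue the least."""
    channels = []
    for c, sigma in enumerate(sigmas):
        k = gaussian_kernel(sigma)
        radius = len(k) // 2
        chan = [p[c] for p in irradiance_row]
        blurred = []
        for i in range(len(chan)):
            acc = 0.0
            for j, w in enumerate(k):
                idx = i + j - radius
                if 0 <= idx < len(chan):
                    acc += w * chan[idx]
            blurred.append(acc)
        channels.append(blurred)
    # transpose back to per-pixel RGB tuples
    return list(zip(*channels))
```

Because the red channel is blurred more widely, a sharp highlight ends up with the soft reddish falloff characteristic of skin.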
Textures and shading
Physically-based shading model
We accounted for energy conservation and the Fresnel effect when calculating lighting on dynamic objects, and used a Beckmann distribution for the specular component of both skin and metals.
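The two specular ingredients mentioned above can be written down compactly. This is a minimal sketch: Schlick’s widely used approximation stands in for the full Fresnel equations (the post doesn’t say which form the demo used), alongside the standard Beckmann normal distribution function.

```python
import math

def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation to Fresnel reflectance:
    f0 at normal incidence, rising to 1.0 at grazing angles."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def beckmann_ndf(n_dot_h, roughness):
    """Beckmann normal distribution function for the specular lobe.
    n_dot_h is the cosine between the surface normal and half-vector."""
    m2 = roughness * roughness
    c2 = n_dot_h * n_dot_h
    return math.exp((c2 - 1.0) / (m2 * c2)) / (math.pi * m2 * c2 * c2)
```

Energy conservation then means scaling the diffuse term down by whatever fraction of light the Fresnel term reflects specularly.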
To bring lustre to the hover bike’s metallic paint and other reflective surfaces we adapted the skin system for use with low-diffuse materials.
Local bloom for skin
Instead of using a full-screen post-processing pass to achieve bloom, we piggybacked on our screen-space diffusion approach and evaluated the bloom of the diffuse skin component directly in the skin shader. This let us achieve the desired effect at minimal cost.
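One way to picture this per-pixel bloom: since the diffusion pass already produces a blurred version of the lit skin, the shader can add back the over-threshold part of that blurred signal without any extra full-screen pass. The function below is an illustrative sketch; the threshold and strength constants are assumptions, not values from the demo.

```python
def local_skin_bloom(diffuse, blurred_diffuse, threshold=0.8, strength=0.5):
    """Approximate bloom for one skin pixel by reusing the already-blurred
    diffusion result: add the portion above a brightness threshold,
    scaled by a strength factor. (Constants are illustrative.)"""
    excess = max(blurred_diffuse - threshold, 0.0)
    return diffuse + strength * excess
```

Pixels whose blurred neighborhood is bright glow slightly, and everything else pays nothing.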
Linear space lighting approximation
Pretty much all modern console and PC games do gamma correction, but many mobile GPUs still do not support automatic gamma-corrected (sRGB) reads and writes. To emulate sRGB we used a gamma of 2.0, which let us use square (a multiply with itself) and square-root operations, which are much faster than raising to the power of 2.2. We only applied the correction to skin pixels, and only in the crucial third skin-shading pass.
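The gamma-2.0 trick boils down to two one-instruction conversions. A minimal sketch, showing that the approximation stays close to the true 2.2 curve:

```python
import math

def to_linear_g2(encoded):
    """Decode with gamma 2.0: a single multiply instead of pow(x, 2.2)."""
    return encoded * encoded

def to_gamma_g2(linear):
    """Encode with gamma 2.0: a single sqrt instead of pow(x, 1/2.2)."""
    return math.sqrt(linear)
```

The round trip is exact, and for mid-range values the decoded result differs from the true pow(x, 2.2) by only a few percent, which is imperceptible on skin while keeping the shader cheap.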
Atmospheric scattering
Part of our goal with this demo was to illustrate just how impressive backdrops can be on mobile platforms. To achieve this, we built a mobile-friendly atmospheric scattering system instead of relying on a traditional distance fog. We simulated scattering in the air by taking the angle to the sun and the distance to the object into account. As a result, The Chase delivers an exceptionally large and open viewing environment, with the farthest visible object four kilometres from the camera.
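A common mobile-friendly form of this idea combines exponential extinction with distance and a sun-aligned in-scattering boost. The sketch below is an assumption about the general shape of such a system, not the demo’s actual shader; the density, colors, and exponent are illustrative constants.

```python
import math

def atmospheric_scatter(base_color, distance, cos_sun_angle,
                        density=0.0004,
                        sky_color=(0.5, 0.6, 0.8),
                        sun_color=(1.0, 0.9, 0.7)):
    """Fade a surface color toward a sky tint with distance (extinction),
    shifting the tint toward the sun color when looking sunward.
    All constants here are illustrative."""
    extinction = math.exp(-distance * density)
    # forward-scattering boost when the view direction nears the sun
    sun_amount = max(cos_sun_angle, 0.0) ** 8
    return tuple(
        c * extinction + (1.0 - extinction) * (sky + sun_amount * (sun - sky))
        for c, sky, sun in zip(base_color, sky_color, sun_color)
    )
```

Unlike plain distance fog, the haze brightens and warms toward the sun, which is what sells the depth of a four-kilometre vista.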
New lens flare system
Layering multiple flare elements allowed us to achieve a variety of effects, such as:
- Anamorphic phenomena: where flares move on the horizontal or vertical axis due to the anamorphic shape of the lens.
- Internal shadowing: where the intensity of the flare is partially culled because of the internal occlusion of the head mount or the lens’ uneven reflectance.
- Light leaking: where the flare’s geometry is stretched at the edge of the screen to simulate a very bright light source outside the frame.
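Two of the effects above reduce to simple per-flare geometry tweaks. The helpers below are an illustrative sketch (the function names and the stretch constant are assumptions): the anamorphic layer pins its flares to one screen axis, and the light-leak layer stretches a flare quad as the light nears the screen edge.

```python
def anamorphic_offset(light_x, light_y):
    """Anamorphic streaks slide along one axis only: project the light's
    screen position onto the horizontal axis for the streak layer."""
    return (light_x, 0.0)

def edge_stretch(flare_x, max_stretch=3.0):
    """Stretch a flare quad as its center (in NDC, -1..1) nears the
    screen edge, simulating a very bright source just outside the
    frame. (max_stretch is an illustrative tuning constant.)"""
    edge = min(abs(flare_x), 1.0)   # 0 at screen center, 1 at the edge
    return 1.0 + (max_stretch - 1.0) * edge * edge
```

Internal shadowing would similarly be a per-flare intensity multiplier driven by an occlusion estimate rather than a geometry change.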
How to wow
The Chase is set in an unrealised and distant future, but the techniques used to create it are available in the here and now. If you’re starting out as a mobile developer and you want to learn how to bring high-end graphics to your mobile game, our Practical Guide to Optimization for Mobiles is the place to start.
Google used The Chase at their Nexus 7 / OpenGL ES 3.0 reveal keynote, and it was widely shown and admired in the SIGGRAPH 2013 demo reels of many leading exhibitors.