How Kluge Interactive brought Synth Riders to Apple Vision Pro
Jun 5, 2024
Synth Riders by Kluge Interactive

Synth Riders, an immersive rhythm game, launched on Steam VR and Meta Quest in 2019, and it’s since been ported to numerous devices, including Apple Vision Pro. Here’s what the team learned developing a mixed reality experience for visionOS.

THE CHALLENGE:
Reimagining Synth Riders for Apple Vision Pro
PROJECT STAFF:
Around 80, with a team of 15 handling the release on Apple Vision Pro
PLATFORMS:
PC, HTC Vive, Oculus Rift/Quest, Meta Quest 2/3/Pro, Pico Neo 3, Pico 4, PlayStation®4/5, PlayStation®VR/VR2, Valve Index, Apple Vision Pro
LOCATION:
Marina del Rey, California

How does a studio reimagine a beloved title for Apple Vision Pro, melding its own cyberpunk aesthetic with Apple’s minimalist, colorful design?

Since Apple Vision Pro is a new and evolving platform, the Kluge Interactive team was aware of the potential hurdles they faced and knew they’d have to adapt to succeed. Their main focus was meeting the platform’s day-one launch deadline and polishing the game to a high standard ahead of media previews.


Preparing for a new challenge

“Our team has always been bullish about technology. The opportunity to develop for Apple Vision Pro came at the right time as we wanted to amplify the visual side of Synth Riders and were in the midst of redesigning the game’s UI,” says Arturo Perez, the chief executive officer at Kluge Interactive.

Any Synth Riders release brings platform-specific challenges. Once the team agreed to go for a new port or, in this case, reimagine the game for a new medium, they began preparing to work with the hardware. Over the past year and a half, the team released ports for PlayStation VR2 and Meta Quest 3, then quickly launched on Apple Vision Pro.

“The last 12 to 18 months of game development have been about paying down technical debt and moving our toolsets closer to Unity 2022 LTS,” explains AnnMarie Wirrel Bartholomaeus, a technical producer at Kluge Interactive. “We’d already moved from the Built-in Render Pipeline to the Universal Render Pipeline (URP) and switched to an OpenXR backend. That has helped put us in a much better technical position to work with new technologies.”

An immersive experience playing Synth Riders
Synth Riders by Kluge Interactive

In addition, they took a more modular approach, making a concerted effort to decouple gameplay from graphics and input. Whenever they started a new port, this “Core” package handled tasks like loading music, generating menus, and populating the game’s interactable notes and rails. With that covered, the team could focus on how the data manifested on each platform to get the look and feel they wanted.

Bartholomaeus says, “The more we wanted to explore different concepts of what Synth Riders is, separating those pieces out has been fantastic. We used the Core to move forward with Synth Riders for Apple Vision Pro, and it was extremely helpful.”
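The kind of decoupling the Core provides can be sketched as a platform-agnostic layer that raises events, leaving rendering and input to each platform. The names below are hypothetical, not Kluge’s actual code:

```csharp
using System;

namespace SynthCore
{
    // Platform-agnostic description of a note the player must hit.
    public readonly struct NoteData
    {
        public readonly float BeatTime;   // song position, in seconds
        public readonly int LaneIndex;    // which lane/rail the note rides
        public NoteData(float beatTime, int laneIndex)
        {
            BeatTime = beatTime;
            LaneIndex = laneIndex;
        }
    }

    // The Core raises events; it never touches renderers or input devices.
    public sealed class TrackPlayer
    {
        public event Action<NoteData> NoteSpawned;

        public void Advance(float songTime)
        {
            // Chart parsing elided; fire an event when a note is due.
            NoteSpawned?.Invoke(new NoteData(songTime, laneIndex: 0));
        }
    }
}
```

A visionOS (or Quest, or PS VR2) layer would subscribe to `TrackPlayer` events and translate `NoteData` into platform-specific visuals and input handling.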

The results

  • Created a new, reimagined version of their rhythm game for the day-one launch of Apple Vision Pro
  • Saved the cost of hiring an entire platform-specific team of 15+ artists, developers, and more
  • Received an average rating of 4.7/5 on Apple Vision Pro
A gameplay screenshot of Synth Riders by Kluge Interactive
Synth Riders by Kluge Interactive

Designing a responsive UI

As the Kluge Interactive team began reworking the UI for Apple Vision Pro, they quickly realized that Apple prescribes how users select things and what a standard button should look like. Since eye-tracking data is not directly exposed to developers, they adapted and problem-solved to create a working UI that played by all the rules.

Without access to eye-tracking data, a custom solution was out of the question. In order to work well with Apple Vision Pro’s built-in highlighting system, they recreated their entire UI in 3D.

“Using a standard canvas-based UI could work, but we experienced challenges around the layering of different transparent layers,” explains Miguel Mosquera, a developer at Kluge Interactive. “The main problem was the square shape that highlighted the canvas, the picture, and the 2D elements. It wasn’t aesthetically pleasing, and the highlight didn’t match the object.” The team moved to a 3D UI to avoid that issue, since a 3D element can pair a simple mesh with an alpha texture so the highlight follows its actual silhouette.
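In practice, opting into the system gaze highlight on a 3D element is a matter of giving it a collider and a hover-effect component. A minimal sketch, assuming Unity PolySpatial’s `VisionOSHoverEffect` component (verify the exact namespace and component name against your installed PolySpatial version):

```csharp
using UnityEngine;
using Unity.PolySpatial;   // assumption: where VisionOSHoverEffect lives

public class GazeHighlightButton : MonoBehaviour
{
    void Awake()
    {
        // The OS-driven highlight hugs the collider/mesh silhouette, so a
        // mesh shaped like the button (plus an alpha texture) replaces the
        // blocky rectangle a canvas element would produce.
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        if (GetComponent<VisionOSHoverEffect>() == null)
            gameObject.AddComponent<VisionOSHoverEffect>();
    }
}
```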

Designing a responsive UI for Synth Riders
In-Editor shot of the Kluge Interactive team working on the party mode UI

Recreating the UI in 3D ensured that eye-gaze highlights matched the shape of the buttons, but that alone wasn’t enough to make the UI feel natural on the platform. The highlight effect was always a soft white glow around the borders of an object’s silhouette, and menu designers needed to take that into account. For example, they avoided light colors and blocky borders in favor of softer gradients with darker colors.

Although this added a full, separate workflow for the team, they embraced the new process and are very happy with the results overall. Perez says, “It was worth it. The UI is much cleaner than it is on other devices.”

Gameplay elements for Synth Riders by Kluge Interactive
Synth Riders by Kluge Interactive

Crafting performant effects

The team tried both the Replicate Properties and Bake to Mesh methods to bring Unity’s Shuriken particle effects into their PolySpatial project. With Replicate Properties, the properties of the Unity Particle System are applied to a native RealityKit particle system. With Bake to Mesh, the output of the Unity Particle System is converted to a standard mesh. The latter was considerably more resource intensive, but it supported a wider range of features that couldn’t be replicated with a native particle system.

The team felt that replicating properties was the better option in general, but ultimately they decided on the Bake to Mesh option. “Particle trails were a ‘must have’ feature that our 3D artist really wanted to use to create audio-reactive visual effects for the game, and this wasn’t feasible with the Replicate Properties option,” says Justin Dopiriak, a senior developer at Kluge Interactive.
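Trails are a standard Particle System module in Unity, and capping particle counts is one way to keep Bake to Mesh vertex costs in check. An illustrative sketch (the Replicate Properties vs. Bake to Mesh choice itself is a PolySpatial project setting, not something set in this script):

```csharp
using UnityEngine;

[RequireComponent(typeof(ParticleSystem))]
public class AudioReactiveTrail : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.maxParticles = 64;          // keep baked vertex counts low

        var trails = ps.trails;          // the feature Replicate Properties
        trails.enabled = true;           // could not carry over natively
        trails.lifetime = 0.25f;
        trails.minVertexDistance = 0.05f;
    }
}
```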

Synth Riders by Kluge Interactive
In-Editor shot of the Kluge Interactive team working on real-time feedback with the visionOS simulator and Play to Device

The Bake to Mesh option was pretty taxing, and some of the particle systems they used onscreen had a big overhead. “We ran the system to its limits, and it was interesting trying to identify where the performance problems came from,” continues Dopiriak. “Once we located and worked to fix the issues, we discovered that using pure vertices was a good solution.”

Since each effect had to be baked into a mesh, vertex count had a much larger performance impact than the team saw on other platforms. To manage the overhead and maximize resource savings, they were very deliberate about which effects they allowed themselves to use. The overall effect on visual fidelity was negligible. “When we were more disciplined across the board in what we applied and how we restricted ourselves, we ended up with a near-perfect equivalent,” Dopiriak concludes.


Lighting the stage

To create shaders and lighting, the team turned to the tools and documentation for Unity PolySpatial. They discovered new nodes specifically crafted for visionOS that were helpful in the absence of lighting in the simulator, as well as a list of nodes that showed whether each was compatible with the platform. This proved beneficial when converting tangent spaces, managing reflection probes and view directions, and handling normal mapping.

The team used the PolySpatial Lighting Node as their main output node and plugged everything into place. The base color, normal, metallic, and smoothness stayed the same.

“It acted as a two-pass shader because on the PolySpatial side, the Apple Vision Pro side, the headset added reflections from our environment and some lighting based on the feedback the camera received,” explains Esteban Meneses, a 3D artist. “PolySpatial lighting managed everything that we knew the engine had inside, and there were options for the baked lighting. We debated using lightmaps and light probes, both of which were very useful.”

An in-Editor shot of the Kluge Interactive team working on Synth Riders
In-Editor shot of the Kluge Interactive team profiling particle performance

For the reflection probes, the team had access to simple and blended options. From there, they plugged everything that came from the PolySpatial Lighting Node into the base color and emission to be safe.

“There are passes that use the base color and we didn’t want to have that in black,” Meneses continues. “I added an intensity modifier to crank up values and gain artistic freedom. I used Apple Vision Pro’s reflections variable and turned off the ambient occlusion in the final node, so we didn’t see the reflections of the environment.” This control over reflectivity allowed the team to divide their game neatly between objects that appeared to be in the room with the user and those they saw through a portal.

The gameplay elements in Synth Riders by Kluge Interactive
Synth Riders by Kluge Interactive

Giving hand tracking a go

As the team delved into reimagining the game, they made their first real foray into hand tracking. Although they were initially hesitant about the lack of haptics, they loved the results. “Typically, when you hold a controller, your hand ends at your wrist, but with hand tracking, your fingers become important and you can gesture and interact with the music in a different way. It brought a new dimension to the game,” says Bartholomaeus.

Without controllers, they adapted to a new way of input. They also needed to fill the gap for the connection a player gets with haptics in a non-intrusive way. Bartholomaeus explains, “We worked on audio reactive effects, lighting, and different particles that fostered the connection with the song based on their action. The interaction with the rails was particularly important.”

The team leaned on platform documentation to learn about Apple Vision Pro’s hand-skeleton joint offsets, which differ from other platforms. For gameplay, they tracked the wrist as their sole point of extrapolation. Since they chose a blend of hand tracking for the gameplay elements and the spatial tap gesture for everything else, they didn’t need custom gestures. “We took advantage of what the operating system offered us. I encourage people to use that. It was really performant,” says Bartholomaeus.
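Tracking only the wrist joint can be sketched with Unity’s XR Hands package (`com.unity.xr.hands`); the API names below match XR Hands 1.x, but verify against your installed version:

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;
using UnityEngine.XR.Management;

public class WristTracker : MonoBehaviour
{
    XRHandSubsystem m_Hands;

    void Update()
    {
        if (m_Hands == null)
        {
            var loader = XRGeneralSettings.Instance?.Manager?.activeLoader;
            m_Hands = loader?.GetLoadedSubsystem<XRHandSubsystem>();
            if (m_Hands == null) return;
        }

        XRHand right = m_Hands.rightHand;
        if (right.isTracked &&
            right.GetJoint(XRHandJointID.Wrist).TryGetPose(out Pose wrist))
        {
            // Extrapolate gameplay hit detection from the wrist pose alone.
            transform.SetPositionAndRotation(wrist.position, wrist.rotation);
        }
    }
}
```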

A player immersed in Synth Riders
Synth Riders by Kluge Interactive

Rendering and optimizing performance

The team also had to learn a new rendering process. For other platforms, what they created was typically rendered directly in Unity, but with visionOS, there was a second step in RealityKit. One of the main challenges the team overcame was not using a Line Renderer, a component that takes an array of two or more points in 3D space and draws a straight line between them. This component was a huge asset to the team for their rail system on other versions of Synth Riders.

“When we encountered this obstacle, the Unity PolySpatial team was fantastic to brainstorm with,” says Bartholomaeus. “For the rails, a feature that was not yet supported on the platform, they suggested that our technical artist recreate it from scratch with a standard mesh and vertex displacement, which worked really well.”
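The workaround can be sketched as building the rail from a standard mesh, a flat ribbon extruded through the control points, instead of a Line Renderer. Kluge’s artist-driven vertex-displacement version is more elaborate; this hypothetical sketch only shows the mesh-generation idea:

```csharp
using UnityEngine;

[RequireComponent(typeof(MeshFilter))]
public class RailRibbon : MonoBehaviour
{
    public Vector3[] points;       // rail control points, in local space
    public float width = 0.05f;

    void Start()
    {
        var mesh  = new Mesh();
        var verts = new Vector3[points.Length * 2];
        var tris  = new int[(points.Length - 1) * 6];

        for (int i = 0; i < points.Length; i++)
        {
            // Offset each point sideways to give the ribbon its width.
            Vector3 side = Vector3.right * (width * 0.5f);
            verts[i * 2]     = points[i] - side;
            verts[i * 2 + 1] = points[i] + side;
        }

        for (int i = 0; i < points.Length - 1; i++)
        {
            // Two triangles per segment, winding toward the viewer.
            int v = i * 2, t = i * 6;
            tris[t]     = v;     tris[t + 1] = v + 2; tris[t + 2] = v + 1;
            tris[t + 3] = v + 1; tris[t + 4] = v + 2; tris[t + 5] = v + 3;
        }

        mesh.vertices  = verts;
        mesh.triangles = tris;
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```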

An in-Editor shot of the Kluge Interactive team working on Synth Riders
In-Editor shot of the main stage view in Synth Riders

When it came to performance optimization, one of the main lessons they learned was not to make assumptions based on their experience targeting other platforms. As the team went through the entire game, testing optimization paths piece by piece, they discovered that solutions that had reduced the impact of material instancing in URP on other platforms did not apply to Apple Vision Pro. “Once we located this issue, we changed a couple of lines of code. It was an easy fix and the improvement was dramatic,” says Dopiriak.
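The article doesn’t say which lines changed, but a common material-instancing pitfall of this kind is `Renderer.material`, which silently clones the material per renderer; using `sharedMaterial` (or assigning one material asset up front) avoids the per-object copies. A hypothetical illustration, not Kluge’s actual fix:

```csharp
using UnityEngine;

public class NoteTint : MonoBehaviour
{
    static readonly int ColorId = Shader.PropertyToID("_BaseColor");

    void Start()
    {
        var rend = GetComponent<Renderer>();

        // rend.material.SetColor(...) would clone the material here,
        // multiplying rendering state on every spawned note. sharedMaterial
        // edits the single shared asset instead (affecting all users of it).
        rend.sharedMaterial.SetColor(ColorId, Color.cyan);
    }
}
```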

The team also debugged performance by using the Unity Profiler, Xcode Debugger, and the Play to Device feature. “At the beginning, it was hard to test and find issues,” explains Mosquera. “The Play to Device feature helped us localize problems faster, and, without it, the project would have taken longer to complete.”

The gameplay elements in Synth Riders
Synth Riders by Kluge Interactive

Taking it to the next level

Since its inception as a design studio in 2007, Kluge Interactive has embraced emerging technology and new ideas. Although Synth Riders is in its fifth year, the team foresees a long tail of new content, and they are intent on actively building features and elevating the visual-music relationship on Apple Vision Pro.

“Design is such a big part of our DNA, so seeing how there was so much intention in this device and in the OS of it has continued to inspire us. I know we can reach higher, and I’m excited to see what’s next,” says Perez.

Build for Apple Vision Pro today

Talk to our team to learn how we can help you leverage Unity’s powerful tools and workflows to build compelling spatial experiences.
