Virtual Cinematography

by Digital Monarch Media
The studio
Two worlds colliding

When two veterans of the film and game industries, Wes Potter and Habib Zargarpour, joined forces to create a virtual cinematography company, they set out to fill a void in production: directors and DPs were becoming increasingly removed from their work on CG productions.

The result was Vancouver-based Digital Monarch Media (DMM), where traditional filmmaking ideologies meet real-time game-engine technology via their Virtual Film Tools suite. Their goal was “to bring the creative process back into the shots on set, as if it was all practical shooting, as if it was all live ... bringing directors front and center with their films again,” explains Potter.

They chose to leverage their experience with real-time engines to bring that immediacy to the traditionally slow VFX pipeline. Because real-time engines render so quickly, directors can immediately see the impact of their changes, enabling more spontaneous creativity and quick iteration on the stage floor. As Potter points out, “We go to the movies to see the work of these artists. We want to feel the director’s breath on each shot,” and these tools allow for just that.

Wes Potter, Founder and CEO, and Habib Zargarpour, COO

The project
The future today

With Unity, DMM created tools they call “Expozure,” “Cyclopz” and “Hermes.” Accessed on a customized handheld tablet or VR headset, they give the director ultimate control in CG production, letting them direct camera, lighting and set changes in real time on stage, rather than waiting for the final renders of the shots.

DMM’s Virtual Film Tools are used on some of Hollywood’s biggest features

DMM’s Virtual Film Tools have been employed on some of Hollywood’s most successful recent films. The tools made high-end, real-time filming possible on Disney’s Oscar-winning The Jungle Book and, most recently, on Spielberg’s Ready Player One and the Oscar-winning Blade Runner 2049.

Because teams often have only a few minutes to design a new camera/object rig on the motion-capture floor to cater to the director’s wishes, they appreciate Unity’s flexible and rapid environment, aided by having an engineer on the stage floor. “It really collapses the iteration time,” says Potter.

Virtual cinematography hardware with Expozure as used on Blade Runner 2049

For example, on Ready Player One, Potter had to rebuild a pipeline to accommodate a requirement from Steven Spielberg. Potter was able to quickly redo the entire thing live, on set, and have it ready to suit Spielberg’s timetable. He credits Unity for making this possible, because “it is such an adaptable engine.”

Customized hardware created for use on film sets

The reveal
Magic for teams

Leveraging Unity’s open and flexible architecture, DMM’s virtual cinematography environment lets directors and DPs experiment on the fly, as Potter explains: “On Blade Runner 2049, Villeneuve was able to re-envision shots for some of the digital scenes well after much of the editing was complete, creating a desired mood and tempo for the film.”

Created in Unity, Expozure allows the director to change lighting, lenses and staging

With the DMM tools, directors like Spielberg and Favreau can direct their actors on the mocap stage because they can see them in their virtual forms composited live into the CG shot. They can even judge the final scene with lighting and set objects in detail. “It’s a holistic place for decision-making,” says Zargarpour.

What the director sees, whether on the tablet or inside a VR headset, can be close to the final render thanks to Unity’s graphical quality, a leap light-years beyond what directors had to work with before real-time technology became part of the shoot.

The range of production touched by DMM’s Unity tool

Camera data is streamed from the mocap stage into Unity through DMM’s listening node, Hermes, and on to Expozure.
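That flow, stage tracking data passing through a listening node and on to the downstream tool, can be sketched as a minimal relay. The packet format, field names, and use of UDP here are illustrative assumptions; DMM’s actual Hermes protocol is not public.

```python
import json
import socket
from dataclasses import dataclass

# Illustrative packet layout -- an assumption, not DMM's wire format.
@dataclass
class CameraSample:
    position: tuple       # (x, y, z) in stage space
    rotation: tuple       # (pitch, yaw, roll) in degrees
    focal_length: float   # lens focal length in mm

def encode(sample: CameraSample) -> bytes:
    """Serialize a tracked-camera sample for the wire."""
    return json.dumps({
        "pos": sample.position,
        "rot": sample.rotation,
        "fl": sample.focal_length,
    }).encode("utf-8")

def decode(payload: bytes) -> CameraSample:
    """Parse a wire packet back into a camera sample."""
    d = json.loads(payload.decode("utf-8"))
    return CameraSample(tuple(d["pos"]), tuple(d["rot"]), d["fl"])

def relay(listen_port: int, forward_addr: tuple, max_packets: int = 1) -> None:
    """Listen for tracking packets and forward each one downstream."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("127.0.0.1", listen_port))
        for _ in range(max_packets):
            payload, _ = sock.recvfrom(4096)
            sock.sendto(payload, forward_addr)
```

The relay never interprets the data it forwards; decoding happens at the receiving tool, which applies the sample to its virtual camera each frame.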

All the departments at your fingertips

All these objects are synchronized across all DMM’s tablets and portals (e.g., Vive), so everyone can participate in the experience.

This is magic for teams working on a previs. If one person moves an object, everyone sees the change. The director can animate the camera, move objects and lighting, do almost anything while having every department at their fingertips.
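The synchronization model described above, one shared scene where any device's edit is broadcast to every other device, can be sketched as follows. The class and method names are hypothetical; this is a minimal illustration of state replication, not DMM's implementation.

```python
class SharedScene:
    """One authoritative scene state with every device subscribed to edits."""

    def __init__(self):
        self._objects = {}      # object id -> transform (x, y, z)
        self._subscribers = []  # one callback per connected tablet/headset

    def connect(self, on_update):
        """Register a device; it immediately receives the current state."""
        self._subscribers.append(on_update)
        for obj_id, transform in self._objects.items():
            on_update(obj_id, transform)

    def move(self, obj_id, transform):
        """Apply an edit and broadcast it to every connected device."""
        self._objects[obj_id] = transform
        for notify in self._subscribers:
            notify(obj_id, transform)
```

Under this model, when the director repositions a light on their tablet, every other subscriber's view updates in the same pass, which is what lets the whole crew watch one edit land everywhere at once.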

And because final camera animation generated in Expozure often survives postproduction all the way through to the final edit, filmmakers are starting to see the impacts of real time on the future of their craft.

“It’s completely inevitable that you’ll be able to get finished, quality VFX shots in real time,” says Zargarpour.

Interested in building tools with Unity’s open and flexible architecture?
