Virtual cinematography

by Digital Monarch Media
The studio
Two worlds colliding

When two veterans of the film and game industries, Wes Potter and Habib Zargarpour, joined forces to create a virtual cinematography company, they set out to fill a void in production: directors and DPs were becoming increasingly removed from their work on CG productions.

The result was Vancouver-based Digital Monarch Media (DMM), where traditional filmmaking ideologies meet real-time game-engine technology via their Virtual Film Tools suite. Their goal was “To bring the creative process back into the shots on set, as if it was all practical shooting, as if it was all live ... bringing directors front and center with their films again,” explains Potter.

They chose to leverage their experience with real-time engines to bring that immediacy to the traditionally slow VFX pipeline. Because real-time engines render so quickly, directors can immediately see the impact of their changes, which enables more spontaneous creativity and quick iteration on the stage floor. As Potter points out, “We go to the movies to see the work of these artists. We want to feel the director’s breath on each shot,” and these tools allow for just that.

Wes Potter, Founder and CEO, and Habib Zargarpour, COO

The project
The future today

With Unity, DMM created tools they call “Expozure,” “Cyclopz” and “Hermes.” Accessed on a customized handheld tablet or VR headset, they give the director ultimate control in CG production, letting them direct camera, lighting and set changes in real time on stage, rather than waiting for the final renders of the shots.

DMM’s Virtual Film Tools are used on some of Hollywood’s biggest features

DMM’s Virtual Film Tools have been employed on some of Hollywood’s most successful recent films. They made high-end, real-time filming possible on Disney’s Oscar-winning The Jungle Book and, most recently, on Spielberg’s Ready Player One and the Oscar-winning Blade Runner 2049.

Because teams often have only a few minutes on the motion-capture floor to design a new camera/object rig to suit the director’s wishes, they appreciate Unity’s flexible, rapid environment, aided by having an engineer on the stage floor. “It really collapses the iteration time,” says Potter.

Virtual cinematography hardware with Expozure as used on Blade Runner 2049

For example, on Ready Player One, Potter had to rebuild a pipeline to meet a requirement from Steven Spielberg. He was able to redo the entire thing live, on set, and have it ready on Spielberg’s timetable. Potter credits Unity for making that speed possible because “it is such an adaptable engine.”

Customized hardware created for use on film sets

The reveal
Magic for teams

Leveraging Unity’s open and flexible architecture, DMM’s virtual cinematography environment lets directors and DPs experiment on the fly, as Potter explains: “On Blade Runner 2049, Villeneuve was able to re-envision shots for some of the digital scenes well after much of the editing was complete, creating a desired mood and tempo for the film.”

Expozure, built in Unity, lets the director change lighting, lenses and staging

With the DMM tools, directors like Spielberg and Favreau can direct their actors on the mocap stage because they can see them in their virtual forms composited live into the CG shot. They can even judge the final scene with lighting and set objects in detail. “It’s a holistic place for decision-making,” says Zargarpour.

What the director sees, either on the tablet or inside a VR headset, can be close to final render thanks to Unity’s graphical quality, light-years from where directors were before real-time technology became part of the shoot.

The range of production touched by DMM’s Unity tool

Camera data is sent from the mocap stage to Unity, into DMM’s listening node, “Hermes,” and on to Expozure.
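The article doesn’t publish Hermes’s protocol, so the sketch below only illustrates the general “listening node” pattern it describes: a small process that receives streamed camera transforms from the stage and relays them to the rendering tool. The UDP transport, the 7-float packet layout and the port numbers are assumptions for illustration, not DMM’s actual implementation.

```python
# Hypothetical sketch of a "listening node": receive camera transform packets
# from a mocap/tracking source over UDP and relay them to a rendering tool.
# The packet layout (position + quaternion as 7 floats) and the ports are
# illustrative assumptions, not DMM's Hermes protocol.
import socket
import struct

MOCAP_PORT = 9000                   # assumed port the mocap stage streams to
RENDER_ADDR = ("127.0.0.1", 9001)   # assumed address of the rendering tool

PACKET = struct.Struct("<7f")       # x, y, z, qx, qy, qz, qw

def run():
    listen = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    listen.bind(("0.0.0.0", MOCAP_PORT))
    forward = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    while True:
        data, _ = listen.recvfrom(PACKET.size)
        if len(data) != PACKET.size:
            continue                 # skip malformed packets
        x, y, z, qx, qy, qz, qw = PACKET.unpack(data)
        # A real tool would apply this transform to the virtual camera;
        # here the packet is simply relayed onward.
        forward.sendto(data, RENDER_ADDR)

if __name__ == "__main__":
    run()
```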

Every department at your fingertips

All of these objects are synchronized across every DMM tablet and portal (i.e., Vive) so that everyone can take part in the experience.

This is magic for teams working in previsualization. If someone moves an object, everyone sees the change. The director can animate the camera, move objects and lights, and do nearly everything with every department at their fingertips.
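As a rough illustration of that shared-scene idea (again, not DMM’s implementation), the sketch below keeps one authoritative copy of the object transforms and rebroadcasts any client’s change to every other connected device; the JSON-lines wire format and the port are assumptions.

```python
# Minimal sketch of a shared scene: one process holds the authoritative object
# transforms, and any change a client sends is rebroadcast to every other
# connected device so all tablets/headsets stay in sync. Wire format and port
# are illustrative assumptions.
import json
import socket
import threading

HOST, PORT = "0.0.0.0", 9100    # assumed sync-server address
clients = []                    # open sockets of connected devices
scene = {}                      # object_id -> latest transform update
lock = threading.Lock()

def handle(conn):
    with lock:
        clients.append(conn)
    try:
        for line in conn.makefile("r"):
            update = json.loads(line)          # e.g. {"id": ..., "pos": ..., "rot": ...}
            with lock:
                scene[update["id"]] = update   # update authoritative state
                for c in clients:              # push the change to everyone else
                    if c is not conn:
                        c.sendall((json.dumps(update) + "\n").encode())
    finally:
        with lock:
            clients.remove(conn)
        conn.close()

def serve():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    serve()
```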

And because the final camera animation generated in Expozure usually survives post-production all the way to the final edit, filmmakers are beginning to see the impact real-time technology will have on the future of their work.

"Es completamente inevitable que puedas ver tomas con efectos visuales de calidad terminados en tiempo real", dice Zargarpour.

Interested in building tools with Unity’s open and flexible architecture?
