Virtual cinematography

by Digital Monarch Media
The studio
Two worlds colliding

When two veterans of the film and game industries, Wes Potter and Habib Zargarpour, joined forces to create a virtual cinematography company, they set out to fill a void in production: directors and DPs were becoming increasingly removed from their work on CG productions.

The result was Vancouver-based Digital Monarch Media (DMM), where traditional filmmaking ideologies meet real-time game-engine technology via their Virtual Film Tools suite. Their goal was “To bring the creative process back into the shots on set, as if it was all practical shooting, as if it was all live ... bringing directors front and center with their films again,” explains Potter.

They chose to leverage their experience with real-time engines to bring that immediacy to the traditionally slow VFX pipeline. Because real-time engines render so quickly, directors can immediately see the impact of their changes, which enables more spontaneous creativity and quick iteration on the stage floor. As Potter points out, “We go to the movies to see the work of these artists. We want to feel the director’s breath on each shot,” and these tools allow for just that.

Wes Potter, Founder and CEO, and Habib Zargarpour, COO

The project
The future today

With Unity, DMM created tools they call “Expozure,” “Cyclopz” and “Hermes.” Accessed on a customized handheld tablet or a VR headset, they give directors ultimate control in CG production, letting them direct camera, lighting and set changes in real time on stage rather than waiting for the final renders of the shots.

DMM’s Virtual Film Tools are used on some of Hollywood’s biggest features

DMM’s Virtual Film Tools have been employed on some of Hollywood’s most successful recent films. They made high-end, real-time filming possible on Disney’s Oscar-winning The Jungle Book and, most recently, on Spielberg’s Ready Player One and the Oscar-winning Blade Runner 2049.

Because teams often have only a few minutes to design a new camera/object rig on the motion-capture floor to cater to the director’s wishes, they appreciate Unity’s flexible and rapid environment, aided by having an engineer on the stage floor. “It really collapses the iteration time,” says Potter.

Virtual cinematography hardware with Expozure as used on Blade Runner 2049

For example, on Ready Player One, Potter had to rebuild a pipeline to accommodate a requirement from Steven Spielberg. He was able to redo the entire thing quickly, live on set, and have it ready to suit Spielberg’s timetable. Potter credits Unity for making this possible because “it is such an adaptable engine.”

Customized hardware created for use on film sets

The reveal
Magic for teams

Leveraging Unity’s open and flexible architecture, DMM’s virtual cinematography environment lets directors and DPs experiment on the fly, as Potter explains: “On Blade Runner 2049, Villeneuve was able to re-envision shots for some of the digital scenes well after much of the editing was complete, creating a desired mood and tempo for the film.”

Built in Unity, Expozure lets the director change lighting, lenses and staging

With the DMM tools, directors like Spielberg and Favreau can direct their actors on the mocap stage because they can see them in their virtual forms composited live into the CG shot. They can even judge the final scene with lighting and set objects in detail. “It’s a holistic place for decision-making,” says Zargarpour.

What the director sees, either on the tablet or inside a VR headset, can be close to final render thanks to Unity’s graphical quality, which is light-years from where directors used to be before real-time technology became part of the shoot.

The range of production touched by DMM’s Unity tool

Camera data is sent from the mocap stage to Unity, into DMM’s listening node, Hermes, and on to Expozure.
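
As a rough illustration of that data flow, the sketch below shows how a tracked camera pose arriving over the network could drive a Unity camera each frame. This is not DMM’s code: the UDP transport, the port number and the comma-separated “px,py,pz,qx,qy,qz,qw” packet format are all assumptions made for the example.

// Minimal sketch (assumed transport and packet format, not DMM's actual code):
// a Unity MonoBehaviour that listens for tracked-camera poses on a UDP port
// and applies them to the object it is attached to (e.g. the scene camera).
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

public class TrackedCameraListener : MonoBehaviour
{
    public int port = 9000;                 // hypothetical port for incoming mocap data
    private UdpClient udp;
    private IPEndPoint sender = new IPEndPoint(IPAddress.Any, 0);

    void Start()
    {
        udp = new UdpClient(port);          // open the listening socket
    }

    void Update()
    {
        // Drain any packets that arrived since the last frame.
        while (udp.Available > 0)
        {
            byte[] data = udp.Receive(ref sender);
            string[] v = Encoding.UTF8.GetString(data).Split(',');
            if (v.Length != 7) continue;    // ignore malformed packets

            var pos = new Vector3(float.Parse(v[0]), float.Parse(v[1]), float.Parse(v[2]));
            var rot = new Quaternion(float.Parse(v[3]), float.Parse(v[4]),
                                     float.Parse(v[5]), float.Parse(v[6]));
            transform.SetPositionAndRotation(pos, rot);   // drive the virtual camera
        }
    }

    void OnDestroy()
    {
        udp?.Close();
    }
}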

Every department within reach

All of these objects are synchronized across every DMM portal or tablet (a Vive, for example), so everyone can take part in the experience.

This is pure magic for teams working on a previsualization. If one person moves an object, everyone sees the change. The director can animate the camera, move objects and lighting, and do almost anything with every department within reach.
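
To make that “everyone sees the change” behavior concrete, here is a hedged companion sketch: whenever a shared object moves, its new pose is pushed to every connected device. The peer addresses, port, object identifier and text packet format are assumptions for illustration, not DMM’s actual protocol.

// Minimal sketch (assumed peers and packet format, not DMM's code): broadcast an
// object's pose to every connected device whenever it changes, so all tablets and
// headsets stay in sync with the mover.
using System.Collections.Generic;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

public class SharedObjectBroadcaster : MonoBehaviour
{
    public List<string> peerAddresses = new List<string> { "192.168.0.12" }; // hypothetical peers
    public int port = 9001;                  // hypothetical sync port
    public string objectId = "setPiece01";   // hypothetical shared-object name

    private UdpClient udp = new UdpClient();
    private Vector3 lastPos;
    private Quaternion lastRot;

    void LateUpdate()
    {
        // Only send when the object actually moved or rotated this frame.
        if (transform.position == lastPos && transform.rotation == lastRot) return;
        lastPos = transform.position;
        lastRot = transform.rotation;

        string msg = $"{objectId},{lastPos.x},{lastPos.y},{lastPos.z}," +
                     $"{lastRot.x},{lastRot.y},{lastRot.z},{lastRot.w}";
        byte[] data = Encoding.UTF8.GetBytes(msg);

        foreach (string address in peerAddresses)
            udp.Send(data, data.Length, address, port);   // push the update to each peer
    }

    void OnDestroy()
    {
        udp.Close();
    }
}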

And because the final camera animation generated in Expozure often survives post-production all the way into the final edit, filmmakers are starting to see the impact real-time editing will have on the future of their craft.

"É totalmente inevitável que você seja capaz de obter cenas de efeitos visuais com qualidade, finalizados, em tempo real", diz Zargarpour.

Interested in building tools with Unity’s open and flexible architecture?
