Real-time filmmaking, explained
The real-time revolution has arrived in CG animation, previs, VFX, and more, empowering directors and artists to complete storytelling projects that come closer to their creative vision. Read on to learn how it works.
What does real-time mean?
At its simplest, using a real-time engine means your project is ‘live’ as soon as you start production. You can experiment by making changes to things like characters, lighting, and camera positions as you work, and you don’t need to wait for any render farm to see what your content looks like. It’s all together, all in context—all the time.
The powerful technology beneath this originated in the gaming industry: games have to be rendered in real-time in order for the engine to respond (in milliseconds) to how a player is controlling the game. Unity is the most widely used game development platform in the world, which also makes it the best choice for anyone wanting to harness real-time 3D content creation for other purposes, whether linear media, automotive design, architecture, or even space research.
The revolution is already here
It wasn’t long ago that influencers began predicting a fundamental change was coming. Now, with so many of the top studios and industry leaders already adopting and experimenting with real-time on projects big and small, it’s clear that this isn’t a distant dream or passing trend. This is the future of filmmaking.
Historically, there have been too many unnecessary barriers to creating CG content, whether due to technical limitations in hardware, the difficulty of changing highly customized pipelines and workflows already optimized over many years, or lack of organizational flexibility to try out alternative solutions. Fortunately, we have reached an unprecedented stage of technological advancement where anyone, anywhere can create this content without the slow, laborious, and often intimidating challenges of offline processes. The only barrier now is learning what this shift means for your studio—and the willingness to make the leap.
Forbes’ Charlie Fink wrote in 2017: “There is something very important going on with Unity and Hollywood that is going to impact media production dramatically in the coming years, as game production techniques begin to supplant hundred-year-old narrative entertainment production techniques. This is going to deeply disrupt the film and television industry as we know it. Filmmakers, cinematographers, designers, and technicians have a great opportunity to harness this growing disruption... Film schools not teaching their students how to build worlds and make movies with Unity are committing malpractice.” (Read the full article on how Unity is disrupting film and TV production.)
Always moving forward
Unlike the ‘handoff’ or ‘sequential’ approach, working in real-time means everyone iterates together—from the beginning—and moves continually toward final frame.
A new workflow for CG animation
One of the biggest differences in using a real-time platform as the creative center of your pipeline is that all departments are able to begin work immediately [(1), fig. above], simultaneously, and collaboratively. That in and of itself brings numerous invaluable benefits:
- What you formerly thought of as ‘pre-production’ is actually now just ‘production.’ Because your project is live from the beginning, you can remove and replace temporary animations and shots with updated or refined versions as you iterate [(2), fig. above], whenever they are ready, without needing to throw out work or pre-visualizations entirely and start from scratch in a new phase of production. It’s not unusual for some pre-visualizations themselves to make it into final shots, as they can often be a useful foundation that carries all the way through. Some directors prefer to begin even without storyboarding, since building out a world first and then organically finding the right shots to tell your story often feels more natural and inspiring.
- Review cycles, feedback loops, and speed of iteration are so much improved that productivity can double—allowing you to meet your goals in half the time. Directors, no longer forced to remain several days or even weeks behind artists’ work (by the time renders come in for review), are able to lean over and look at a scene’s progress at any time [(3), fig. above]. This means feedback itself is in real-time. Similarly, artists are empowered with up-to-date direction and the ongoing ability to make sure they are meeting the creative vision of their leadership team. Without time delays between iterations, decisions are made faster, which means less waste of the precious resources that would otherwise be spent going down the wrong creative paths.
- Experimentation is risk-free. Let’s start with an example: when writer/director Neill Blomkamp (District 9, Elysium) and VFX Supervisor Chris Harvey (TRON: Legacy, Zero Dark Thirty) were creating “Adam 2: The Mirror,” they were struggling with an outdoor scene set at midday whose lighting wasn’t quite right for the look they wanted. Finally, Harvey made Blomkamp an offer: give him one hour, and he would come back with more choices. He then worked with his lighting team to light the same 30 seconds of the shot at four different times of day (2:00, 4:00, and so on). He came back to Blomkamp, they agreed on the strongest option, and moved forward with it for the final version. This would never have been possible without real-time technology: creating extra versions of the scene would have taken far too long and cost too much to justify. Yet in Unity, all of this is free, fast, and flexible enough to accomplish even as a quick experiment over lunch.
Finally, the project is always moving forward [(4), fig. above], even if creative direction changes. Unlike traditional workflows, where a significant change like a new camera angle or a character swap might send you back to square one to redo the layout or art from scratch, a real-time production simply adjusts in the moment and departments carry on working. If you need to shoot something from a different angle, change the scale of props in an environment, or even revisit the entire look, feel, or style of your art, it’s all in real-time 3D, with no disruptive ripple effects and no need to start over. Every day of work, and every decision made, propels you and your team toward your ultimate vision and final frame.
The positive impact real-time work has on teams
Another significant difference between teams that create in real-time and teams that don’t is the overall culture boost when artists can truly collaborate, participate, and feel more ownership of the project as a whole. These benefits extend beyond individuals and can result in a studio-wide increase in creative satisfaction:
- What used to be more of a factory-line handoff is now an open seat at a large, round table. It’s not uncommon for artists in offline workflows to feel isolated, lonely, or insignificant. Being stuck in a silo is never as inspiring, desirable, or rewarding as actively partnering with teammates to ideate, brainstorm, or spitball. Because a live project no longer confines workflows to a rigid step ladder, every artist, technical director, specialist, department head, or line producer can show up to add creative value, bringing their own unique eye and experience at any time to any situation that requires problem-solving or creative exploration.
- ‘Upstream’ and ‘downstream’ skillsets formerly separated by time and process can now work simultaneously and even physically together. An animator, for example, can work with a lighting artist [(5), fig. above] to see how various artistic directions for a character would play out in context with the fully lit scene. When artists can complement (and compliment!) each other’s work, it naturally leads to a better quality product in the end—but more importantly, it leads to healthy professional partnerships and deeper relationships. What’s more, it increases the likelihood of those ‘happy accidents’ and magic moments when unplanned creative forces come together in such a way that a brand new story element or clever opportunity emerges that is simply impossible to resist.
- Artists can pitch their own ideas (and do so quickly and easily), which leads to a sense of empowerment. There are two reasons in particular why artists can test out hunches and concepts for themselves or others in a way they can’t in traditional offline workflows. First, as the creative center of your pipeline, Unity allows artists to bring their assets all the way into the contextual scene, and even to create mini-moments themselves using the storytelling tools, which makes it easier to paint the picture for the director. Second, because the task is less time-intensive (and because they aren’t dependent on technical staff for assistance), they can do so without disrupting their daily work or milestones; they can pitch more ideas, which gives the director additional perspectives and raises the value of their artists’ contributions.
- Individuals can see the impact of their work in the larger context of the overall story. This fosters a new kind of work environment: one that becomes collective, motivating, and more meaningful as production progresses. When directors make requests to artists, it’s no longer something that has to be taken on faith or without understanding, because the artist can view the work too and understand why and how it fits together (or doesn’t). When a story is the result of a thriving community of creative, passionate team members, even the audience can tell.
Free yourself from render farms
One of the most powerful aspects of working in real-time is that, when you decide you have reached your vision or when the time has come to deliver your story, a final render is only a click away. Not only can the Unity engine now produce the right visual fidelity to match the aesthetic style of your creative vision, but the frame recording feature allows you to capture footage right away, in a variety of desired formats (even up to 8K).
In a world where directors are used to the idea of single frames taking minutes if not hours to render, the idea of hitting ‘record’ and watching the editor pop out an entire short film before you’re even back from grabbing coffee might seem too good to be true. This is the power of real-time filmmaking, the future of content creation in a world where storytellers are not restricted by technology but empowered by it—a world where the only limits are creativity and imagination. This is how it should be.
It should also be noted that, for studios that prefer to keep their custom or proprietary offline renderers as the final render solution, Unity also offers FBX export functionality to export geometry and record cameras and animations. The editor is highly customizable and can be plugged into your rendering pipeline in various ways. Until that final render moment arrives, however, a real-time workflow can still be leveraged throughout, enabling teams to benefit from all of the above opportunities along the way.
ADAM: Episodes 2 and 3
These two short films were created by Neill Blomkamp (District 9, Elysium), VFX supervisor Chris Harvey (TRON: Legacy, Zero Dark Thirty), and a group of award-winning designers, artists and engineers. In just five months, the Oats team produced in real-time what would normally take a year using traditional rendering.
Mr. Carton
The first cartoon series produced for television to be made in Unity, Mr. Carton uses a unique style of cardboard photography to tell the story of a clumsy driver attempting to reach a mountaintop. The full series, which comprises 13 two-minute episodes, was created by Michaël Bolufer, who served as co-director, designer, co-scenarist, main lighting artist, and animator.