When Disney Television Animation’s R&D organization approached Unity about developing three shorts featuring “Big Hero 6 The Series” fan-favorite character Baymax, with a real-time engine as the foundational production technology, the two companies assembled a team of talented artists and engineers from around the globe. They cast feature-film director Simon J. Smith (Penguins of Madagascar, Bee Movie) to lead the charge.
Using their backgrounds in film, visual effects, games, architecture and virtual reality, the team challenged itself to build these stories in a new way. “We really didn’t know how this was going to go,” says Smith. “None of us had done anything like it with a 3D engine.”
The storytelling process developed by the team resulted in more creative iteration throughout the production, almost like working on a virtual film set. “The flexibility and the immediacy of the engine is, to me, the biggest value,” says Gino Guzzardo, Director of Multi-Platform Content for Disney Television Animation.
“Baymax Dreams” premieres on DisneyNOW and Disney Channel YouTube on September 15, with a sneak peek at SIGGRAPH 2018. Inspired by Disney Television Animation’s “Big Hero 6 The Series” TV show, the shorts explore the content of Baymax’s dreams, whether he’s counting electric sheep or dodging virtual bedbugs.
Enter Timeline and Cinemachine
Each two-minute short required the team to invent and experiment to find the best “dream” to tell. For this, Guzzardo was able to iterate with Smith by employing a combination of Unity’s multi-track sequencer (Timeline) and smart cameras (Cinemachine), allowing the team of 15 to try out ideas and iterate quickly, while still creating high-quality work.
“It’s exciting to see this new tech being used to tell a story about a robot that represents the best in technology,” says Mark McCorkle, Executive Producer of “Big Hero 6 The Series.”
Baymax’s look was developed by John Parsaie, Unity Graphics Engineer, who achieved materials like Baymax’s emissive “night-light” glow using the High-Definition Render Pipeline (HDRP). Artist Keijiro Takahashi and Unity’s Graphics team also collaborated on effects like voxelization.
“At its core, the experiment was meant to explore efficiencies using real-time technology,” explains Andy Wood, “Baymax Dreams” Line Producer. That began with story development.
After each episode was imagined in script form, it was put into Unity, skipping the traditional storyboarding process. “We essentially went straight into previsualization in Unity, but instead of throwing that work away, it became our foundation – our edit – and we used cameras created on day one of production up through the final renders,” says Wood.
The workflow also did away with the render farm, one benefit of Unity’s real-time solution for film. “Compositing was also happening from the start, allowing the team to finish shots in the same environment they were designed in,” continues Wood.
Creating a virtual stageplay
The engine gave the team the freedom to create a “virtual stageplay,” where they could view a scene from any angle at any moment. This gave creatives like Smith and Assistant Director Mark Droste the flexibility to explore and iterate quickly on how to shoot a scene.
“Working in real-time on a project in Unity is akin to working on a live-action set, but more so,” says Smith. “You can change the lighting, adjust the sets, move the characters, blend animation to discover new performances, change the music, the sound effects and the cameras instantly.”
The cameras, for example, would round-trip between Droste and the animators, who would begin polishing the performances as both the cameras and the edit were refined. “By being in the set with cameras sooner, we had a better idea of how a character might react in a space,” says Droste.
In turn, performances created by the animators could be controlled through Unity’s Timeline, allowing comic timing to be adjusted inside a shot in real time.
Notes from dailies could be addressed on the spot by a single artist, instead of waiting days, or even weeks, for a note to travel through the several steps of a waterfall production.
“It’s an interactive story feedback loop that gives you more educated choices much, much sooner,” says Smith.
Disrupting the waterfall
Starting all departments at once meant that no one on the team was left idle. Lighting worked in parallel with Animation, which worked in parallel with Camera and Layout.
“That is the wonderful thing about real-time content creation – it allows multiple departments to work in the same context simultaneously,” says Michael Breymann, “Baymax Dreams” Technical Supervisor.
In fact, as the production evolved, each department’s work was viewed at the same time at every approval stage. Instead of finishing work and handing it off to the next department, work was reviewed holistically in an edit, resembling a bullseye target more than a linear waterfall model.
Multiple artists on the same beat
By setting up each episode with nested Unity Timelines and Prefab assets, multiple artists could work in one beat without overwriting each other’s work.
“Being able to immediately see how your animation looked in the scene with next-to-final lighting was a real plus,” says Bryan Larson, “Baymax Dreams” Supervising Animator. “Adding touches like Baymax squinting when he stepped out into the light were simple to do because of that quick turnaround.”
“It was obvious that the versatility was going to bring creative rewards. It helped us get over major hurdles, enabling us to do more things,” says Smith. While reflecting on the process of creating the “Baymax Dreams” shorts he says, “Appropriately, it was like a superhero discovering his superpowers for the first time.”
Thanks to the power of real-time rendering, the team advanced its storytelling process in a fraction of the time of a typical animation production. “Disney Television Animation is committed to technological innovation and we love the quality that it brings to our stories,” says Guzzardo.