
Deakin Motion.Lab’s Real-Time Virtual Production Magnifies ‘Minibeast Heroes’

Viewers get a bug’s-eye view of fascinating insects, with unforgettable commentary by a pint-sized version of host Carl Smith, in the Australian Broadcasting Corporation’s Minibeast Heroes. Created in collaboration with Melbourne-based Deakin Motion.Lab and produced by ABC R&D, the animated educational series (6 x 2’30″) was realized using a real-time virtual production workflow that allowed content to be captured and turned around in about three months. In addition to the broadcast episodes, the team also produced a 360-degree trailer for the series (directed by the ABC’s Amy Nelson), which puts users inside the world with the enormous insects.

DML, a creative research consultancy based at Deakin University, collaborated with ABC Education and ABC R&D from the outset of the project, providing feedback on storyboarding and scripting in relation to motion capture and real-time virtual production. Once the concept was fleshed out, animation director Stefan Wernick (Armchair Productions) and the ABC worked with Perth-based Pixelcase to generate the series’ main insect models by scanning real insects. DML then rigged the insect models and hand-animated them for the virtual environments, which, along with the host character, were created by Wernick’s Armchair Productions.

The host was prepared for performance capture with a head-mounted camera rig for facial capture and passive optical mocap markers for body capture. His entire performance was solved in real time, streamed through the Unity game engine, translated onto his digital avatar, and integrated into the scene. This real-time workflow allowed the director to visualize how the performance would translate onto the digital character and look in the environment via DML’s proprietary handheld virtual camera and 8m x 9m 3D projection screen.
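
To picture how a solved performance frame gets mapped onto an avatar in an engine like Unity, here is a minimal, hypothetical sketch in Python; the frame format, bone names and the avatar’s set_joint call are illustrative assumptions, not DML’s actual pipeline.

    # Hypothetical sketch: apply one streamed, solved mocap frame to a digital avatar.
    from dataclasses import dataclass

    @dataclass
    class BoneTransform:
        name: str
        position: tuple   # (x, y, z) in metres
        rotation: tuple   # quaternion (x, y, z, w)

    # Mapping from mocap skeleton bone names to the avatar's joint names (assumed names).
    RETARGET_MAP = {
        "Hips": "pelvis",
        "Spine": "spine_01",
        "Head": "head",
        "LeftHand": "hand_l",
        "RightHand": "hand_r",
    }

    def apply_frame(frame, avatar):
        """Copy each solved bone transform onto the matching avatar joint."""
        for bone in frame:
            joint = RETARGET_MAP.get(bone.name)
            if joint is None:
                continue  # bone not used by this avatar
            avatar.set_joint(joint, bone.position, bone.rotation)

In a real-time setup, the equivalent of apply_frame runs once per streamed frame inside the engine’s update loop, which is what lets a director watch the avatar move as the actor moves.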

“When directors and performers are able to clearly see what movements will look like on the final character and in the virtual environment, they can make more informed choices to shape the performance,” said Dr. Jordan Beth Vincent, a Research Fellow at DML who served as the lead on the Minibeast Heroes project. “That means the technology we use needs to be invisible, reliable and not interfere with the creative process.”

DML’s real-time workflow centers on a semi-permanent mocap stage with a 7m x 8m x 6m shooting volume outfitted with OptiTrack Prime 41 and Prime 17W cameras. For the Minibeast Heroes shoot, a Faceware head-mounted camera was used for facial capture. Full-body performance data and facial video recordings were fed into OptiTrack’s Motive software and Dynamixyz’s facial software, then bundled with the audio recording and timecode sync by DML’s proprietary backend.
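
The bundling step is essentially a synchronization problem: each stream (body data, facial video, audio) carries timecode, and converting that timecode to a common clock tells the backend how to line the takes up. Below is a rough sketch with assumed timecode values and frame rates, not anything taken from DML’s proprietary backend.

    # Illustrative only: line up body mocap, facial video and audio by timecode.
    def timecode_to_seconds(tc, fps):
        """Convert an 'HH:MM:SS:FF' SMPTE-style timecode to seconds."""
        hh, mm, ss, ff = (int(part) for part in tc.split(":"))
        return hh * 3600 + mm * 60 + ss + ff / fps

    # Assumed start timecodes for each recording on a given take.
    body_start  = timecode_to_seconds("10:04:12:090", 180.0)  # body data at 180 fps
    face_start  = timecode_to_seconds("10:04:12:30", 60.0)    # head-mounted camera video
    audio_start = timecode_to_seconds("10:04:11:00", 25.0)    # audio recorder timecode

    # Trim offsets relative to the earliest stream so everything starts together.
    t0 = min(body_start, face_start, audio_start)
    offsets = {
        "body": body_start - t0,
        "face": face_start - t0,
        "audio": audio_start - t0,
    }
    print(offsets)  # how much to trim from the front of each recording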

In general, the mocap feed is sent to a game engine or a third-party client, or down a traditional post-render route, depending on the desired end result. In the case of Minibeast Heroes, the data was solved on the characters straight out of Motive and retargeted in Autodesk Maya, where animation refinement was completed.
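
As a generic illustration of that refinement hand-off in Maya (not DML’s actual setup; the character root name below is made up), retargeted motion is typically baked down to keyframes on the skeleton so animators can work on the curves directly.

    # Generic Maya Python sketch: bake retargeted animation onto the joints for cleanup.
    import maya.cmds as cmds

    start = cmds.playbackOptions(query=True, minTime=True)
    end = cmds.playbackOptions(query=True, maxTime=True)

    # All joints under the retargeted character's root (hypothetical node name).
    joints = cmds.listRelatives("host_character_root", allDescendents=True, type="joint") or []

    # Bake the motion into keyframes on every joint so it can be refined by hand.
    cmds.bakeResults(joints, simulation=True, time=(start, end), sampleBy=1)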

“When starting a new mocap shoot for the day, we need everything to move quickly. Our OptiTrack system allows us to jump right into capture, rather than losing time templating talent. Calibration in Motive is also incredibly streamlined; we’re ready to go in five minutes with incredible accuracy,” said DML’s Virtual Production Supervisor, Peter Divers.

Though only one performer was used for Minibeast Heroes, DML is able to accommodate full-body tracking, including fingers and facial capture, for multiple performers and objects. Divers noted: “Last time we held a workshop, we had 12 people being tracked live before we ran out of markers. We’re pushing the limits of virtual production technology, and having the support for research and development within the university environment gives us that freedom to innovate.”

Added Vincent, “As a research and development group, we’re always looking to apply our discoveries back to an industry context. That means we’re able to do things like bring real-time feature film tools and techniques to an animated educational children’s series. We had a lot of fun on this project, and our OptiTrack motion capture system is the foundation of it all.”

DML installed their OptiTrack system shortly before the November 2017 Minibeast Heroes shoot, after extensive research into replacing their existing system. OptiTrack’s ease of use, data fidelity and the accessibility of system components were key motivating factors.

“We employ a lot of cameras to beat occlusions as much as possible, and Motive is able to handle the data with no problem. I’d say our shoots get us 80 percent of the way there, and then we employ our in-house automation tools to finish the job. We’re shooting at 180 FPS, with very little movement between frames, for optimal tracking,” Divers said. “When we go off site, we take the super-wide-angle Prime 17W cameras; you can see half the world with them.”
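
That 180 FPS figure is what makes “very little movement between frames” concrete: the quick arithmetic below (the 3 m/s hand speed is an assumed example, not a measured figure) shows that even a fast gesture travels only a centimetre or two per frame, which keeps marker trajectories easy to follow.

    # Back-of-the-envelope: marker travel between frames at the quoted capture rate.
    capture_fps = 180.0                     # frame rate quoted for the shoot
    frame_interval = 1.0 / capture_fps      # ~0.0056 s between frames

    hand_speed = 3.0                        # m/s, an assumed speed for a fast gesture
    per_frame_travel = hand_speed * frame_interval

    print(f"{frame_interval * 1000:.1f} ms per frame, "
          f"~{per_frame_travel * 1000:.0f} mm of marker travel per frame")
    # roughly 5.6 ms per frame and about 17 mm of travel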

“We use our system to create, research and teach, and data cleanup and post-processing are part of our mocap course. Because data tracking with Motive is so easy, we’ve been able to employ some of our students, who graduate as production-ready trackers and solvers. I also really like how open the OptiTrack hardware is, and that the cameras use one standard PoE cable so we don’t have to wait on proprietary replacements. Our OptiTrack system makes our lives easier and we’re excited to explore using it for new applications,” concluded Vincent.

Check out the show here and get a glimpse behind the scenes in the making-of video below.
