Industrial Light & Magic (ILM) and Epic Games (maker of the Unreal Engine), together with production technology partners Fuse, Lux Machina, Profile Studios, NVIDIA, and ARRI, unveiled a new filmmaking paradigm in collaboration with Jon Favreau’s Golem Creations to bring The Mandalorian to life. The new virtual production workflow allows filmmakers to capture a significant number of complex visual effects shots in-camera, using real-time game engine technology and LED screens to represent dynamic, photo-real digital landscapes and sets with creative flexibility that was previously unimaginable.
Over 50 percent of The Mandalorian Season One was filmed using this ground-breaking new methodology, eliminating the need for location shoots entirely. Instead, actors performed within a massive, immersive LED volume: a 20’-high, 270-degree semicircular video wall and ceiling enclosing a 75’-diameter performance space, where practical set pieces were combined with digital extensions on the screens. Digital 3D environments created by ILM played back interactively on the LED walls and could be edited in real time during the shoot. Pixel-accurate camera tracking allowed perspective-correct 3D imagery to be rendered at high resolution via systems powered by NVIDIA GPUs. The environments were lit and rendered from the perspective of the camera to provide real-time parallax, as if the camera were really capturing the physical environment, with accurate interactive light falling on the actors and practical sets. This gave showrunner Jon Favreau, executive producer and director Dave Filoni, visual effects supervisor Richard Bluff, cinematographers Greig Fraser and Barry “Baz” Idoine, and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and to achieve real-time in-camera composites on set.
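The perspective-correct imagery described above hinges on an off-axis (asymmetric) projection: the engine renders the scene from the tracked camera's position, but with a view frustum whose edges pass through the corners of the LED wall, so the wall behaves like a window onto the virtual environment and parallax shifts correctly as the camera moves. The sketch below is purely illustrative and is not StageCraft's or Unreal Engine's actual implementation; the function name and the simplified axis-aligned, planar wall geometry are assumptions made for clarity.

```python
def off_axis_frustum(eye, wall_center, wall_width, wall_height, near):
    """Compute asymmetric frustum bounds (left, right, bottom, top) at the
    near plane for a tracked eye/camera position looking at a flat LED wall.

    Illustrative simplification: the wall is axis-aligned in the x-y plane,
    and the eye sits on the +z side of it.
    """
    # Distance from the eye to the wall plane along z.
    d = eye[2] - wall_center[2]
    # Project the wall's edges onto the near plane by similar triangles.
    scale = near / d
    left   = (wall_center[0] - wall_width / 2 - eye[0]) * scale
    right  = (wall_center[0] + wall_width / 2 - eye[0]) * scale
    bottom = (wall_center[1] - wall_height / 2 - eye[1]) * scale
    top    = (wall_center[1] + wall_height / 2 - eye[1]) * scale
    return left, right, bottom, top

# A camera centered on the wall yields a symmetric frustum; as the tracked
# camera moves sideways, the frustum skews, which is what keeps the imagery
# on the wall perspective-correct from the camera's point of view.
```

These four bounds are exactly the parameters a graphics API projection such as OpenGL's `glFrustum` consumes, which is why camera tracking data can drive the rendered wall content directly each frame.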
The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of partners such as Golem Creations, Fuse, Lux Machina, Profile Studios, and ARRI with ILM’s StageCraft virtual production filmmaking platform and the real-time interactivity of the Unreal Engine.
“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of real-time, in-camera rendering,” explained Jon Favreau, adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”
“Merging our efforts in the space with what Jon Favreau has been working towards using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” said Rob Bredow, Executive Creative Director and Head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real-time on stage providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”
Richard Bluff, Visual Effects Supervisor for The Mandalorian added, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”