Last evening the National Academy of Television Arts and Sciences (NATAS) presented the inaugural “Excellence in Production Technology” Emmy® Award to The Santa Clauses Season Two. The award was presented as part of the 75th Technology & Engineering Emmy® Awards at the Prince George Ballroom in New York, hosted by David Pogue, Emmy award-winning correspondent, CBS Sunday Morning.
Rachel Rose, ILM Research & Development Supervisor, said, “The team at Industrial Light & Magic are incredibly honored to be recognized by the Television Academy with an Emmy Award for our innovative StageCraft technology and the advancements made for ‘The Santa Clauses, Season Two.’ This recognition is a testament to the hard work and dedication of our exceptional team of technologists, artists, and production crew.”
Stephen Hill, Matthew Lausch, Industrial Light & Magic accepting the inaugural “Excellence in Production Technology” Emmy® Award. [Photo Credit: Joe Sinnott for NATAS]
Read the complete National Academy of Television Arts and Sciences press release here.
The National Academy of Television Arts & Sciences (NATAS) today announced the recipients of the 75th Annual Technology & Engineering Emmy® Awards and introduced a new category, “Excellence in Production Technology.” Three nominees were named in the new category, including ILM for its work on The Santa Clauses, Season 2, along with two non-ILM projects.
NATAS President and CEO Adam Sharp said: “This new category honors innovations that significantly enhanced the experience of broadcast viewers during the competition year. These nominees revolutionized the way television is produced, delivered, or broadcast. Together, they help set the standard for this exciting new track of recognition in the Tech Emmys.”
“As we honor the diamond class of the technology Emmys, this class typifies the caliber of innovation we have been able to enjoy for the last 75 years. Congratulations to all the winners,” said Joe Inzerillo, Co-Chair, NATAS Technology Achievement Committee.
The Technology & Engineering Emmy® Awards are presented to a living individual, a company, or a scientific or technical organization for developments and/or standardization in engineering technologies that either represent so extensive an improvement on existing methods or are so innovative in nature that they have materially affected television.
The ceremony will take place Oct. 9 at the Prince George Ballroom in New York.
For the first time, ILM’s groundbreaking virtual production technology transports fans inside the Star Wars galaxy.
By Clayton Sandell
Patricia Burns gets ready for her closeup on the ILM StageCraft volume at D23.
Patricia Burns steps up to her mark.
Dressed in the sleek all-black uniform worn by the Third Sister Reva Sevander from Obi-Wan Kenobi (2022), she ignites her double-bladed red lightsaber and waits for her cue.
A nearby stagehand counts her down and calls “Action!”
As a crane-mounted camera swoops in, Burns crouches next to R5-D4, a red and white astromech droid, swinging her lightsaber with a fierceness only a Jedi-hunting Inquisitor could conjure. Behind her, a massive wall of LED screens displays the pristine moving image of a busy Rebel hangar.
Monitors around the stage show what the camera sees in real-time: an epic, trailer-worthy shot that makes Burns the star of her own Star Wars story.
“Oh, it was awesome,” Burns tells ILM.com as she walks off the stage, grinning. “A chance of a lifetime.”
For the first time ever, the ILM crew assembled a volume—something normally sequestered on an off-limits studio soundstage—inside the Anaheim Convention Center just for fans attending D23.
ILM’s chief creative officer Rob Bredow and virtual production supervisor Sonia Contreras host a StageCraft workshop.
“I think everybody is blown away by the scale of this, and how immersive it actually is when you get to see it here on the show floor,” says Rob Bredow, senior vice president, creative innovation for Lucasfilm and chief creative officer of ILM.
During the three-day event, a rotating trio of scenes appeared on the volume’s giant LED panels: an Imperial hangar created for The Mandalorian, a Rebel hangar from Ahsoka (2023), and a vibrant city street on the planet Daiyu seen in Obi-Wan Kenobi.
“You’re looking at over 18-and-a-half million pixels of LED wall and a live-tracked camera,” Bredow tells ILM.com. “Wherever the camera looks, we get a high-fidelity version with exactly the right perspective for the illusion of creating an immersive environment. It looks impressive enough here at the convention center, but when we collaborate with the production designer and the art department on one of our productions, that’s when the technology really sings. It’s a powerful tool in the filmmaker’s toolbox that we can deploy when building standing sets on a stage or traveling the cast and crew to a far-flung location isn’t feasible.”
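The perspective illusion Bredow describes can be reduced to a small geometric idea: for each virtual scene point, the wall must display it where the line from the tracked camera to that point crosses the wall plane, so the on-camera image has correct parallax for the camera's current position. The following is only a toy illustration of that idea; `project_to_wall` is a hypothetical helper, not part of any ILM tooling, and the wall is simplified to the plane z = 0.

```python
def project_to_wall(camera, point):
    """Project a virtual 3D point onto the LED wall plane (z = 0),
    as seen from a tracked camera position.

    The wall shows each virtual point where the camera-to-point ray
    crosses the wall plane, so the image photographed by the camera
    carries correct perspective for wherever the camera currently is.
    """
    cx, cy, cz = camera      # tracked camera position (cz > 0, in front of the wall)
    px, py, pz = point       # virtual scene point (pz < 0, "behind" the wall)
    t = cz / (cz - pz)       # ray parameter where z reaches 0
    return (cx + t * (px - cx), cy + t * (py - cy))

# A camera at (0, 0, 4) viewing a point 4 units behind the wall:
print(project_to_wall((0.0, 0.0, 4.0), (2.0, 0.0, -4.0)))  # -> (1.0, 0.0)
```

Moving the camera shifts where the same virtual point lands on the wall, which is why the content must be re-rendered live from the tracked camera pose rather than played back as fixed video.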
For D23, ILM wanted to demonstrate a fully functioning StageCraft volume exactly like the ones used on a real set.
“It’s very fun to not be faking it,” Bredow quips.
Attendees at D23 take in ILM’s StageCraft volume.
ILM virtual production supervisor Ian Milham says transporting the volume from a studio lot to the convention center took a herculean scheduling and logistical effort involving a busy team of artists, engineers, and crew members. And several large trucks.
“Everybody agreed, ‘yes, we’re really going to do it’,” Milham explains. “But that meant we had to get our real gear and our real crew here. It also meant we couldn’t be making a movie with it at that time.”
The challenge was worth it, Milham says, because it gave the filmmakers a chance to finally show off their pride in StageCraft to a wider audience.
“Film sets are amazing places,” says Milham. “But it’s not like there’s a lot of chances to really share our success. So we’re really happy to be able to show the public for the first time the cool results, but also what it takes to pull off something like this and how much teamwork and technology it takes to do it.”
ILM virtual production supervisor Ian Milham demonstrates the volume.
ILM virtual production supervisor Sonia Contreras co-hosted several StageCraft presentations with Bredow. The pair challenged the D23 audience to look at several scenes and guess which elements were created with practical set pieces and props, and which ones were generated by the volume.
“I got about a third of them right,” laughs Ryan Schwartz, who watched the demonstration with his wife Katie and sons Zachary and Jonathan. Katie says she fared slightly better, guessing about half correct.
“I’ve been following ILM for a long time, and I still try and figure it out,” Ryan tells ILM.com. “They’re so amazing in their craft that it’s so hard to really piece together what is real and what is digitally done.”
Contreras says the D23 StageCraft experience is extremely special because even some ILM employees still haven’t been able to see the volume work in person.
“I would hope that people take away that there’s a lot of brains that go into making this happen,” Contreras says, pointing to the setup’s real-time rendering, camera tracking, processing power, and an aptly named “Brain Bar” crew working behind the scenes to help make the scenery so seamlessly realistic.
“The ‘wow’ factor is when you get to see what’s actually happening, all the different things that are getting coordinated in order to make that image work,” Contreras says. “It’s really cool to be able to show it to everybody.”
Lucasfilm senior vice president and executive design director Doug Chiang made a special appearance in front of a packed crowd on Saturday to talk about StageCraft’s contribution to the long history of visual effects filmmaking.
“We rarely get to share it or talk about it, because it’s an evolving technology, and it is just a tool,” says Chiang. “But at an event like this, where we can actually finally get under the hood and share the magic with the audience, it’s just terrific.”
Lucasfilm executive design director Doug Chiang and Lucasfilm Art Department associate producer Michelle Thieme in the volume.
Frequent ILM collaborator Legacy Effects also pulled the curtain back to show how their crew helps create a Star Wars galaxy full of creatures, aliens, and droids.
“When you’ve got leaders like Jon Favreau and Dave Filoni, who just embrace everyone’s contributions, it inspires you to do the best work that you can,” says Legacy Effects co-founder and special effects veteran Alan Scott.
At D23, Scott and the Legacy team explained how they bring life to characters like the silver professor droid Huyang (voiced by David Tennant) and Murley the Loth-cat for Ahsoka. The production relies on a combination of practical puppets along with digital versions inserted later, depending on the requirements of each shot.
“There are things that I think practical can do very well, especially when it comes to the interaction with the performers,” Scott tells ILM.com. “Then there’s a responsibility that says, ‘that would be better if it was done with visual effects.’”
Legacy Effects co-founder Alan Scott (left) demonstrates a character prop with colleagues Dawn Dininger and David Covarrubias.
Bredow hopes that revealing how some of the Star Wars magic is made might inspire others, especially kids, to consider working in visual effects.
“Many people don’t even realize there are these very artistic and very technical and very creative jobs that have to do with working behind the scenes of film and television production,” Bredow explains. “So this is one of the fun things to do. To connect with fans, to connect with people who might want to make this a career.”
Cosplaying as Bastila Shan from the Knights of the Old Republic (2003) video game, Star Wars fan Carly King says she was most impressed by StageCraft’s powerful mix of creativity and engineering.
“It just looked so good on the screen. It’s so interesting to see how this whole conglomeration of electronics and technology comes together. It’s an incredible thing,” King says. “It’s one thing to watch Star Wars, but it’s another thing to be in it.”
—
Clayton Sandell is a television news correspondent, a Star Wars author and longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design.
On February 23, 2024, the Academy of Motion Picture Arts and Sciences will recognize 16 technologies for their impact on filmmaking. Two technologies that ILM played a key role in helping to develop will be among those recognized.
SciTech Awards committee chair Barbara Ford Grant said, “This year, we honor 16 technologies for their exceptional contributions to how we craft and enhance the movie experience, from the safe execution of on-set special effects to new levels of image presentation fidelity and immersive sound to open frameworks that enable artists to share their digital creations across different software and studios seamlessly.”
Former ILM engineers Christopher Horvath and Joe Ardent are being recognized alongside Lucas Miller and Steve LaVietes for the Alembic Caching and Interchange system. Alembic began as a collaborative effort between ILM and Sony Pictures Imageworks to develop algorithms for storing and retrieving baked, time-sampled data, enabling high-efficiency caching across the digital production pipeline and the sharing of scenes between facilities. The two companies open-sourced the project and its interchange library in 2011. Since then, Alembic has seen widespread adoption by major software vendors and production studios.
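As a rough illustration of what “baked, time-sampled data” means, here is a toy Python sketch of a property whose values are stored at fixed sample times and read back by lookup rather than re-simulation. This is purely conceptual and does not reflect Alembic's actual API (the real library is C++ with a much richer schema and archive system); the class name and behavior are assumptions for illustration only.

```python
import bisect

class TimeSampledProperty:
    """Toy sketch of the baked, time-sampled idea behind Alembic:
    values are written ("baked") at specific sample times, and reads
    look up the stored sample instead of recomputing a simulation."""

    def __init__(self):
        self.times = []    # sorted sample times, in seconds
        self.values = []   # value baked at each corresponding time

    def write_sample(self, time, value):
        # Keep the two parallel lists sorted by time.
        i = bisect.bisect_left(self.times, time)
        self.times.insert(i, time)
        self.values.insert(i, value)

    def read(self, time):
        """Return the latest sample at or before `time` (hold-last)."""
        i = bisect.bisect_right(self.times, time) - 1
        return self.values[max(i, 0)]

# Bake a translate channel at 24 fps and read it back:
prop = TimeSampledProperty()
for frame in range(3):
    prop.write_sample(frame / 24.0, (float(frame), 0.0, 0.0))
print(prop.read(1 / 24.0))   # -> (1.0, 0.0, 0.0)
```

Because every downstream department reads the same baked samples, a cached scene behaves identically in any package that can parse the file, which is the interchange half of the system.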
ILM’s Dan Bailey joins Jeff Lait and Nick Avramoussis in being recognized for the continued evolution and expansion of the feature set of OpenVDB. Core engineering developments contributed by OpenVDB’s open-source community have led to its ongoing success as an enabling platform for representing and manipulating volumetric data for natural phenomena. These additions have helped solidify OpenVDB as an industry standard that drives continued innovation in visual effects.
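The core idea behind OpenVDB's volumetric representation is sparsity: only "active" voxels are stored, and everywhere else the grid reports a background value, so mostly-empty volumes like smoke and clouds stay compact. The sketch below is a conceptual toy only; the real OpenVDB is a C++ library built on a hierarchical tree structure, and the `SparseGrid` class here is an assumption for illustration, not its API.

```python
class SparseGrid:
    """Toy sketch of the sparse-volume idea behind OpenVDB: store
    only the voxels that hold data, and return a shared background
    value for the (vast) empty remainder of the volume."""

    def __init__(self, background=0.0):
        self.background = background
        self.voxels = {}           # (i, j, k) -> value, active voxels only

    def set(self, ijk, value):
        self.voxels[ijk] = value   # activating a voxel stores it explicitly

    def get(self, ijk):
        # Inactive voxels cost no memory; reads fall back to background.
        return self.voxels.get(ijk, self.background)

    def active_count(self):
        return len(self.voxels)

grid = SparseGrid(background=0.0)
grid.set((10, 20, 30), 0.75)      # a single wisp of smoke density
print(grid.get((10, 20, 30)))     # -> 0.75
print(grid.get((0, 0, 0)))        # -> 0.0 (background, nothing stored)
print(grid.active_count())        # -> 1
```

A dense 1000³ float grid would need roughly 4 GB; a sparse structure like this stores only the occupied region, which is what makes production-scale effects volumes practical.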
Unlike other Academy Awards® to be presented this year, achievements receiving Scientific and Technical Awards need not have been developed and introduced during a specified period. Instead, the achievements must demonstrate a proven record of contributing significant value to the process of making motion pictures.
Before this announcement, 34 ILM technological achievements had been recognized with Scientific and Technical Achievement Awards. This latest recognition continues a legacy of technical innovation dating back to the mid-1970s.
The end is only the beginning. KISS have been immortalized and reborn as avatars to rock forever. Created by Industrial Light & Magic (ILM) in collaboration with the band and Pophouse Entertainment Group, the avatars portray each of the four band members in an idealized, and at times superhuman, form. Months before the supergroup’s final show, which took place on December 2, 2023, KISS joined ILM’s visual effects team at its San Francisco headquarters to get measured, scanned, and photographed before slipping into sleek motion capture suits so the crew could record every nuance of their final performance. ILM’s StageCraft virtual production team then captured each band member’s performance, from facial expressions to fingertips, as they played “God Gave Rock ’N’ Roll to You II” in unison.
The KISS avatars showcase ILM’s unique creative expertise and artistry using the company’s advanced performance-capture technology. The team was led by Academy Award®-nominated Visual Effects Supervisor Grady Cofer, who has over 20 years of experience supervising groundbreaking visual effects projects. Cofer is currently nominated for an Emmy Award for Outstanding Special Visual Effects in a Season for his work on The Mandalorian. Previously, he served as Overall Visual Effects Supervisor on Space Jam: A New Legacy and earned Academy Award and BAFTA nominations for his contributions to Steven Spielberg’s Ready Player One. Cofer’s three-year collaboration with Spielberg utilized cutting-edge virtual production tools to bring the OASIS, the project’s vast virtual world, to the big screen.
“This is the sneak peek as the band crosses over from the physical world to the digital. We want to give fans a sense of the many forms this band could take in the future.”
Grady Cofer, ILM visual effects supervisor
Cofer’s ILM team leveraged the company’s decades-long experience to push the capabilities of performance capture, gathering every nuance of the KISS band members’ face and body performances in exacting detail. This data would in turn become the basis for the motion of the band’s virtual avatars. The raw facial capture data was processed in real time by ILM’s advanced machine learning algorithms for instantaneous feedback on stage. It was later passed through the ILM pipeline to be augmented by artists, ensuring the resulting performances were exactly as the band intended for their new digital personas and enabling KISS’s creative output to continue to enthrall audiences well into the future.
As the band’s final concert drew to a close, lead singer Paul Stanley’s avatar proudly exclaimed “KISS Army, your love, your power has made us immortal! A new KISS era starts now.” The digital group then performed its hit single “God Gave Rock ’N’ Roll to You II” to the delight of the concertgoers who filled the sold-out Madison Square Garden.
The filmmaker and Lucasfilm legend talks to ILM.com, reflecting on what drew him to tell the story of the hit Disney+ series, “Light & Magic.”
Screenwriter and director Lawrence Kasdan.
How did you get involved with Light & Magic? Several years ago my wife and I made a short documentary about a little diner that we used to eat at all the time that suddenly closed. Making that documentary with her, and cutting it with terrific people, made me realize how much I liked the documentary format. I had never done that. We set out to meet some documentary people and I met Justin Wilkes at Imagine Entertainment. He asked me what I was interested in doing and I suggested a history of visual effects, because even though I had been around visual effects throughout my career, it occurred to me that I didn’t know much about them. The second thing that interested me was the people of Industrial Light & Magic that I had been working around for over forty years. So we both agreed that that would be a great story to tell: the history of visual effects, and the personal stories of these people. What drove these people, what was their life like, what made them want to stay at ILM as long as they did? Everyone loved the idea, so we went to work.
Lawrence Kasdan, center, on the set of Star Wars: The Empire Strikes Back.
What was your vision for the documentary? From my very first film until today, I’ve always considered myself a humanist filmmaker. I’m interested in what happens between people, and why people make certain decisions in their lives. What chance is involved? What fate? What luck? So from the very beginning of this I was interested in learning what brought these people to this work. What were the relationships that they made when they arrived? Why did they continue to work there much longer than they expected, some for nearly half a century? What has all that meant to these amazing advancements in technology? It’s about people, and their gifts, and out of those gifts came technological advancements that boggle the mind.
Dennis Muren, left, and Phil Tippett, right, review images with Joe Johnston.
Why did you think this story should be told? Because it’s great to see artists at work. The commitment of great craftsmen. I love to see people that have mastered a skill, and try to make it better, and don’t settle. I think it’s great to see expertise and this pure devotion to discipline, and that is always a good story to see.
John Dykstra and a fleet of miniature TIE, X-wing, and Y-wing starfighters.
How did you approach the research, and what resources did you use? We had a fabulous team that Imagine Documentaries put together, some internal to the company, and some that were freelancers. They really knew their stuff, so it was a great luxury for me as a director. There were so many things that I wanted to ask during interviews, but the input from this incredible group of producers and writers and editors stimulated me all of the time to go in different directions during interviews.
ILM’s Paul Huston and Larry Tan on the set of Star Wars: Return of the Jedi.
For those who have yet to watch it, can you tell readers what the timeline of the series is? Over the six hours we see the very birth of ILM, what happened as it came together during the production of Star Wars: A New Hope, and then, off the success of that film, how it was launched into a nearly fifty-year enterprise. We mainly follow it chronologically, but we do jump around a bit to serve the story. Part of the kick for me was that we had such a trove of archival footage, so these people might be talking about something from forty or fifty years ago, and we had stills from that moment in their career. It was incredible to be able to cut from one to the other across time, to hear them talking about a problem, and then see footage of them finding a solution. A huge part of ILM’s legacy is finding solutions to problems.
Peter Kuran, Rose Duignan, and George Lucas review effects shots for Star Wars: A New Hope.
How did you select the filmmakers that were featured in the documentary? They are all giants, and they have all used ILM in the most expressive and innovative ways. They put pressure on themselves and then turned to ILM and said, “can you do this? Can you create something for me that I have never seen before?” ILM would always say yes. And sometimes it might be a struggle, and sometimes it might be a long process, and sometimes it might be an instantaneous solution where one of these genius people that work there would say, “I know what we could do”. These are major filmmakers that have contributed to the zeitgeist. Jim Cameron, Steven Spielberg, Bob Zemeckis, J.J. Abrams, and at the heart of it, of course, is George Lucas.
Lawrence Kasdan and J.J. Abrams on the set of Star Wars: The Force Awakens.
What was the most interesting thing you learned throughout the process of creating Light & Magic? I think I learned what goes into creating something new, working with people you respect and depend on, and how this personal relationship then impacts the professional work. There is something beautiful about the generosity of the people that work at ILM, and through that generosity they are able to discover new frontiers and break new ground that no one has been able to before.
All episodes of Light & Magic are streaming now on Disney+.
ILM | A legacy of innovative and iconic storytelling.
The Television Academy today announced the recipients of the 74th Engineering, Science & Technology Emmy® Awards honoring an individual, company, or organization for developments in broadcast technology. Industrial Light & Magic is proud to be the recipient of an Emmy Award for its StageCraft™ virtual production tool suite. StageCraft has been used on such series as The Mandalorian, The Book of Boba Fett, How I Met Your Father, Obi-Wan Kenobi, and The Old Man.
“Innovation is a vital part of television production; and the talented engineers, scientists and technologists we have recognized are essential to the growth of our industry,” said Frank Scherma, chairman and CEO of the Television Academy. “These pioneering companies and visionaries have leveraged the power of technology to elevate television and storytelling in fundamental ways.”
ILM StageCraft is an end-to-end virtual production tool suite that bridges the gap between practical physical production methodologies and traditional digital post-production visual effects by providing the ability to design, scout and light environments in advance of the shoot and then capture that vision in camera during principal photography. StageCraft brings together a real-time engine, a real-time renderer, high-quality color management, physical camera equipment, LED displays, motion-capture technologies, synchronization methodologies and tailored on-set user interfaces to digitally create the illusion of 3D backgrounds for live-action sets.
“Earlier this year the Academy formed the Science & Technology Peer Group representing members who are involved in the strategy and development of technologies that enable or advance the storytelling process for the television industry,” said Committee Chair John Leverence. “Under the leadership of the new peer group’s governors and co-chairs Wendy Aylsworth and Barry Zegel, this year’s newly constituted Engineering Emmy Awards Committee honors a wide range of innovative solutions to once seemingly intractable technical problems.”
Six-Part Docuseries Debuts Exclusively on Disney+ July 27
Disney+ released the trailer and key art for Lucasfilm and Imagine Documentaries’ “Light & Magic,” an immersive series that chronicles the untold history of world-renowned Industrial Light & Magic (ILM), the special visual effects, animation and virtual production division of Lucasfilm.
Granted unparalleled access, Academy Award®-nominated filmmaker Lawrence Kasdan takes viewers on an adventure behind the curtain of Industrial Light & Magic. Learn about the pioneers of modern filmmaking as we go on a journey to bring George Lucas’ vision to life. These filmmakers would then go on to inspire the entire industry of visual effects.
The series is directed by Lawrence Kasdan, and the executive producers are Ron Howard, Brian Grazer, Justin Wilkes, Lawrence Kasdan, Kathleen Kennedy and Michelle Rejwan.
All six episodes of “Light & Magic” premiere on July 27, exclusively on Disney+.
Disney+ is the dedicated streaming home for movies and shows from Disney, Pixar, Marvel, Star Wars, and National Geographic, along with The Simpsons and much more. In select international markets, it also includes the new general entertainment content brand, Star. The flagship direct-to-consumer streaming service from The Walt Disney Company, Disney+ is part of the Disney Media & Entertainment Distribution segment. The service offers commercial-free streaming alongside an ever-growing collection of exclusive originals, including feature-length films, documentaries, live-action and animated series, and short-form content. With unprecedented access to Disney’s long history of incredible film and television entertainment, Disney+ is also the exclusive streaming home for the newest releases from The Walt Disney Studios. Disney+ is available as a standalone streaming service or as part of The Disney Bundle that gives subscribers access to Disney+, Hulu, and ESPN+. For more, visit disneyplus.com, or find the Disney+ app on most mobile and connected TV devices.
San Francisco and Vancouver–Production is underway in Vancouver on the ambitious upcoming Disney+ Original series Percy Jackson and the Olympians, based on Rick Riordan’s best-selling novels, on a newly built, state-of-the-art StageCraft LED stage, the first of its kind in Canada. The stage was built through a partnership between Industrial Light & Magic (ILM) and 20th Television, which is producing the eagerly anticipated Disney Branded Television series for Disney+.
As explained by executive producer and author Rick Riordan, “The story of Percy Jackson has such an epic scope, I was crossing my fingers we would be able to partner with Industrial Light & Magic. That was really the only way to do the adaptation justice and bring our visions to life. I am over the moon that we have forged such a great relationship to give this show such a cutting-edge look and feel. I’m sure the Olympian gods would expect nothing less!”
“The 20th Television team and the series producers clearly saw the value that ILM StageCraft brings to a production and understood it to be a perfect fit for a series like Percy,” said Chris Bannister, executive producer, ILM StageCraft. Jeff White, creative director for ILM’s Vancouver studio, added, “With ILM’s StageCraft technology we allow filmmakers to design, light, and shoot the digital world as they would in the practical world all integrated in front of the cast and crew on stage. It opens up an amazing range of possibilities right before their eyes.”
“Working with the team at ILM has been a dream,” said 20th executive vice president of Production Nissa Diederich. “The fans of this franchise have high expectations for the series and we knew that we needed the most advanced production technology available, and who better to partner with than Industrial Light & Magic? The stage we have built will be home to Percy and potentially dozens more of our most ambitious series. It really says to our creators, the sky’s the limit – if you can dream it, we can shoot it.”
Based on Disney Hyperion’s best-selling book series by award-winning author Rick Riordan, “Percy Jackson and the Olympians” tells the fantastical story of a 12-year-old modern demigod, Percy Jackson, who’s just coming to terms with his newfound divine powers when the sky god Zeus accuses him of stealing his master lightning bolt. With help from his friends Grover and Annabeth, Percy must embark on an adventure of a lifetime to find it and restore order to Olympus.
The series will star Walker Scobell as Percy Jackson, Aryan Simhadri as Grover Underwood and Leah Sava Jeffries as Annabeth Chase. Previously announced guest stars include Virginia Kull as Sally Jackson, Glynn Turman as Chiron aka Mr. Brunner, Jason Mantzoukas as Dionysus aka Mr. D, Timm Sharp as Gabe Ugliano and Megan Mullally as Alecto aka Mrs. Dodds.
Riordan and Jon Steinberg serve as writers of the pilot, and James Bobin directs. Steinberg oversees the series with his producing partner Dan Shotz. Steinberg and Shotz also serve as executive producers alongside Bobin, Riordan, Rebecca Riordan, Bert Salke, Monica Owusu-Breen, Jim Rowe, Anders Engström, Jet Wilkinson and The Gotham Group’s Ellen Goldsmith-Vein, Jeremy Bell and D.J. Goldberg.
Gareth Edwards on the set of Rogue One: A Star Wars Story.
Join the Publicity Group at Industrial Light & Magic as we look back with Gareth Edwards at his time directing Rogue One: A Star Wars Story. Gareth discusses the cutting-edge virtual production used for the film and the ways in which George Lucas inspired him as a filmmaker.
Tell me about the freedom you found in the virtual production aspects of Rogue One. John Knoll was very crucial for this, because he and the team at ILM devised a virtual environment where we could go in and look for shots. My entryway into filmmaking was through visual effects, so I understand it a bit, but a lot of VFX is kind of dark arts, which causes clients to come to visual effects companies and see VFX as magic, because no one understands what they do. The downside to that is that they can ask for things or approach scenarios in a way that is really back-to-front, and doesn’t produce the best result. I find that storyboarding shots is really useful, but at a certain point it becomes somewhat limiting, because you’re having to invent every single detail about that shot. Whereas, in the real world, what you tend to do is you have a space, because it already exists. The light hits objects in this space a certain way, and going in, you knew you’d do a close-up of someone’s face, but if you were to have them look down a little bit, and maybe move to the right, suddenly you have this beautiful composition that you wouldn’t have found with storyboarding. The trick with VFX is having that opportunity, and going, “this was the plan, but now that the ingredients are here in front of us, doing this would actually be better.” So figuring out a seamless way to do that without it being painful for the artists is important. There’s lots of ways to achieve that, but when you’re in space with spaceships, the only real way to do it—unless you’re doing what George did, which was taking footage of WWII aerial combat to represent the final shot—is what John Knoll and Industrial Light & Magic were pushing for.
John Knoll and Alan Tudyk, in his mocap costume, on the set of Rogue One: A Star Wars Story.
And what was John pushing for? It was pre-viz animation of each section of the battle sequence. They then figured out a setup where they had an Apple iPad with a game controller attached to it. When you moved the iPad, it could tell where it was in 3D space. They would then just loop these twenty or thirty-second chunks of animation, and I would get to hang out in these spots and just film it again and again, generating hours of footage. Then I’d go home on my MacBook and select my favorite takes, and then try to cut something together. It would be very jittery, and handheld, and not perfect. For one shot we’d smooth it out, as if filming from another spaceship, and for another we’d keep some of that handheld look. It felt like the process of getting those virtual shots was how we were getting the live action shots, which was, “light a space and find the shot,” versus, “tell us the shot, and we’ll invent all the pieces to create it.” It always feels more real with the first approach.
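The smoothing step Edwards mentions, taking a jittery handheld camera path and steadying it, can be illustrated with a simple moving-average filter over the recorded camera positions. This is only a stand-in sketch under that assumption; the filtering the production actually used is not described in the interview.

```python
def smooth_path(positions, window=3):
    """Smooth a jittery camera path by averaging each position with
    its neighbors inside a sliding window (a generic stand-in for
    whatever filtering a real virtual-camera pipeline would apply)."""
    half = window // 2
    out = []
    for i in range(len(positions)):
        lo, hi = max(0, i - half), min(len(positions), i + half + 1)
        span = positions[lo:hi]
        # Average each axis (x, y, z) independently over the window.
        out.append(tuple(sum(axis) / len(span) for axis in zip(*span)))
    return out

# A camera path that jitters up and down around y = 0 while moving
# forward in x; smoothing pulls it back toward the straight line.
path = [(float(x), (-1.0) ** x * 0.2, 0.0) for x in range(7)]
print(smooth_path(path, window=3)[3])
```

Averaging trades responsiveness for stability: a wider window removes more jitter but also dulls intentional camera moves, which is why Edwards describes choosing per shot whether to smooth or to keep the handheld feel.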
The Death Star’s Mk I Superlaser is set into place.
Like that shot of the Star Destroyer emerging from the shadow of the Death Star. Exactly. That was a great example of John Knoll and ILM pushing for this new technology. If I remember correctly, we needed a shot of the Death Star for the trailer, but they needed it in only a few days. I remember that John got the iPad, and set up a model of the Death Star and some Star Destroyers. The important thing when devising a shot like that is that the sense of scale is only relative to something bigger in the shot. A typical thing you do in matte paintings is, when something needs to look really big, you paint a little human in there. It’s a trick that they used to great effect in The Empire Strikes Back. When you want something to feel big, you need to set something up that feels really big, and then show a new thing that’s even bigger than that. The idea was to have a ship that you know the scale of, like a TIE fighter, and then reveal the Star Destroyer, which feels huge, and then you reveal the Death Star, which feels impossibly massive. I remember asking them, “can you do real-time shadows on this?” Once I learned that that was possible, it became so fun to reveal and conceal the ships in shadow, and find that moment where the dish slides into place. Within a few hours, we had the shots that went into the trailer, and that never would have been possible without the real-time technology that ILM was using.
An Imperial I-class Star Destroyer emerges from the shadow of the Death Star’s Mk I Superlaser.
Since Rogue One, you’ve gotten to visit ILM’s StageCraft volume in person. Having had that experience, did it make you think about how you might have captured any shots differently? I think that’s always true of technological advances in filmmaking, so yes. For sure. It feels like filmmaking in general is an archaic process. It’s over a century old, and in some ways, it has hardly changed at all – and yet this digital revolution, which is happening all around us, should drastically change the ways we make movies, but it’s been a slow process. There is so much we could do to utilize the technology we have to be more creative, and allow us to do things we couldn’t have done before. StageCraft, though, is a massive leap. It’s game-changing. It’s moving the industry forward.
ILM’s StageCraft in use on The Mandalorian Season Two.
What do you think will stay the same? Storytelling, regardless of the medium. From the time of early humans, a million years ago, until today, we have had an innate need to sit around a campfire and listen to a story, whether it’s about something interesting that happened that day, or theorizing about why the world is the way it is. It’s absolutely hardwired into us. There’s this little glowing light, and the need to paint a picture or sing a song about another person or another place. I don’t believe that is going away anytime soon. We’ll always have an appetite for storytelling. What I find funny is that a movie is around two hours long, and that’s about the length of time it takes for a campfire to burn out. It’s so embedded in us. That’s what cinema is: it’s us being able to dream out loud, or watch another person’s dream in real-time. I hope that stays with us. I hope in five hundred years, people are still watching Star Wars.
The DS-1 Orbital Battle Station prepares to fire on Jedha City.
In getting into filmmaking and working with Lucasfilm and Industrial Light & Magic, were there ever moments where you needed to pinch yourself, because you were given the ability to create nearly anything you could dream up? Funny enough, it almost felt like we had too much power, so we needed to be careful to limit ourselves so it felt like the original trilogy. One of the early things that we did with John Knoll and ILM was the kitbashing, using the original models just like the original films. We went and bought some of the original model kits – lots of WWII, vintage-collector stuff – and started scanning those model parts in so that we could stick them onto the models we were building. What’s interesting is the subsurface scattering that goes on with those model pieces. It’s not like metal. So we were trying to recreate that model-kit feeling on our ships. What’s funny, though, is that when you get into that scenario, you realize that your memory is a little bit better than reality with some of the sets and props. The golden rule became, “let’s not do it how it was, let’s do it how we remember.” We wanted it to look and feel like how you “thought” those models looked. Also, going through the Lucasfilm Archives, the models are everywhere, and there were some designs in there that looked really cool. Those were good keys for us, because we were making something before A New Hope, but what did that mean, stylistically? Doug Chiang had a really hard task before all of us while doing Star Wars: Episode I – The Phantom Menace. They took a real leap on that film from both a timeline and a stylistic standpoint. It was much more Art Deco. The streamlined nature of the ships felt much more like The Rocketeer and Flash Gordon; the types of serials that inspired George Lucas to make Star Wars in the first place. And that makes sense, because it takes place over 30 years before A New Hope.
But for us, aesthetically, it was just before A New Hope, yet we still wanted it to feel distinct. For inspiration, we looked at the Ralph McQuarrie artwork, and the early models that had been made for A New Hope; things that had been either abandoned or improved. We sort of “reversed the car” back into that space, and where they went left, we went right. One example is that really slender, aspirational Stormtrooper that Ralph painted. He made that without a care in the world about how it would actually be realized by the costume department. We kept pointing at that, because traditionally when you put someone in armor, it can start to feel a bit bulky. We wanted something that looked like it could sprint and cause some serious damage. We tried to make armor that was slightly bendable, so it could sit just over the skin, and we tried to cast toward that look for the troopers: someone tall and lanky. What that eventually became was the Death Trooper. That’s where the Death Troopers started. To answer your original question, we tried not to be kids in a candy store; we tried to temper that and work off the design language that existed. “With all of those limitations, what would they have done?”
The UT-60D U-wing, ‘LMTR-20’, heads for Eadu during Operation Fracture.
Did you and Greig Fraser try to match some of the old-school camera moves seen in the original trilogy? We did. We tried to keep the vocabulary of any shots featuring Krennic and the Empire the same as that of those old-school cinematographers. The way, when someone walks by, they would push in again to recompose based on the actors’ new positions. All of these little things that were common in the late 1970s, things we don’t do so much now.
Darth Vader (voiced by James Earl Jones) from a scene in the trailer for Rogue One: A Star Wars Story.
This year being the 50th anniversary of Lucasfilm, I wanted to know: what does George Lucas mean to you? A lot of things come to mind. As crazy as it sounds, I think he’s underrated. I know that sounds crazy with all of his accolades. When I was a little kid, I didn’t ever really watch THX 1138 or American Graffiti. But as I got older, I would revisit those films constantly. THX is an incredible debut. It’s just an absolutely fantastic film, and, in terms of how bold and completely brand new it was, one of the strongest first films in all of cinema. So many things made their way to Star Wars too. That fuzzy comms chatter. The clinical corridors. The look and feel. There was no Ralph McQuarrie, but it felt so much like George. He has such a great aesthetic and an amazing eye. It took a lot for him to make that film. Then he goes and makes the films that have inspired you, and me, and everyone like us. Even if that was all he ever did, it would have been enough. But then he goes and pushes harder, and pushes the digital technology further. Because of him, I was able to make my first film on a digital camera. I wouldn’t have been able to make my first film if it required film stock. I wouldn’t have been able to make my first film without George. He pushed HD, and all the work he did on the technology for the prequels, and on digital camera technology – I got into digital effects because of that, and I wouldn’t have been able to if it wasn’t for George building up everything at Industrial Light & Magic. He inspired me to want to become a filmmaker, and he gave me the tools to do it. At the end of that journey, I got to make a Star Wars film. He gave me Rogue One.
George Lucas on the set of Star Wars: Episode IV – A New Hope filming a scene aboard the ‘Tantive IV’.
Have you gotten to spend time with him? A handful of times. This next story I tell with the utmost love and admiration for the entirety of the Star Wars catalogue. When I had my office at Pinewood, I was putting a lot of pressure on myself to make this film. Everyone had The Empire Strikes Back and A New Hope posters in their offices. To help alleviate that pressure, and to remind myself that this was Star Wars, that I was making a Star Wars spinoff film, and that I needed to have fun, I put up framed prints from The Holiday Special and Caravan of Courage: An Ewok Adventure. Those were the first spinoff films. [laughs]. Well, one day, George came to Pinewood, and he was sweet enough to come up to my office. I worked really hard to distract him while we spoke so he wouldn’t see the posters. I was really animated, and tried to lead him through the back. It was like a comedy skit as I tried to keep him away from the posters. I didn’t want him to get the wrong idea about Rogue One. [laughs]. We got to hang out for a few hours that day, and I got to tour him around. I got to spend time with my hero. It was a surreal experience. He’s the Paul McCartney of film.
Gareth Edwards on location for Rogue One: A Star Wars Story. Photo courtesy of Greig Fraser. All Rights Reserved.
And you got to spend time at Skywalker Ranch, yeah? Yes. When I was there, the projectionist was so sweet. They said, “When we would project the Star Wars reels, George would sit right there,” and they pointed over at a seat. “Would you like to sit there?” I got to sit there throughout the sound mix on Rogue One. It felt like I was sitting on the throne of the film world. The funny thing is, if you’re so intimidated by it, it can paralyze you. You have to let that fall away. But let me tell you, that was the best job in the world. That beautiful drive through the trees and hills on the way to Skywalker Ranch. Past Lake Ewok. It was so utopian. We were making a Star Wars movie. It was everything I’d ever dreamed of. It’s surreal to think it even happened to begin with. You dream about this stuff as a kid, but it shouldn’t actually happen. What’s funny is, when it comes to Industrial Light & Magic, and Lucasfilm, and the team at Skywalker Sound, you see it in everyone that works there. We all have the same story. You and me, we grew up with the same story. The trinkets on your desk are the same ones I have at home. Those Ralph McQuarrie prints behind you. I feel like we all have a lot in common. I feel like if I was going to hang out with people outside of work, it would be with the people at ILM. Everyone is a mini-filmmaker, and even though we grew up in different places all around the world, if we went to the same school as kids, we’d be mates – and then suddenly all of these people wound up at Skywalker Ranch and Industrial Light & Magic. When Covid is through, I hope everyone can come together and see each other again.
Gareth Edwards on location for Rogue One: A Star Wars Story. Photo courtesy of Greig Fraser. All Rights Reserved.
I love that. Last question: John Knoll and Hal Hickel wanted me to ask you about Area 51? [Laughs]. So I was in Las Vegas watching John Knoll, Hal Hickel, and Matthew Wood, from Skywalker Sound, during a Star Wars panel at NAB. After it was through, I told them all, “we are only a few hours away from Area 51. We will never get this chance again.” [Laughs]. We drove several hours in the dead of night, through Rachel, Nevada, and walked right up to that fence where you couldn’t go any further. We went to Area 51. We stayed just long enough to scare ourselves, and then we got out of there. [Laughs].
Gareth Edwards on location for Rogue One: A Star Wars Story. Photo courtesy of Greig Fraser. All Rights Reserved.
Greig Fraser on the set of Rogue One: A Star Wars Story. All Rights Reserved.
The cinematographer of Denis Villeneuve’s Dune and Matt Reeves’ The Batman joins Industrial Light & Magic’s Publicity Group to discuss his work on Rogue One: A Star Wars Story. Greig shares how the early Kenner action figures inspired his love of Star Wars, and the influences he found in 1970s cinema, the works of Andrei Tarkovsky, and The French Connection.
What was your introduction to Star Wars? If I think back about how I was first introduced to Star Wars, it had to be through the toys. I genuinely think it was the toys that got me going there. I was two years old when Star Wars came out, and five when The Empire Strikes Back premiered. You couldn’t really call me a “film fan” at that point, but the franchise definitely existed in my universe. I read some of the comics later on, but the thing I loved the most back then were the toys. A few years after that, I think in ’82, Star Wars came to Betamax and VHS, and then the year after, in 1983, I finally saw Return of the Jedi in theaters. It was mind-blowing, because the visual effects that ILM did for it were so revolutionary and groundbreaking. Then over the course of the next ten or fifteen years, I think I watched A New Hope, The Empire Strikes Back, and Return of the Jedi literally hundreds of times.
A selection of Star Wars Kenner action figures available in the early 1980s.
How did the experience of watching the original trilogy influence your work on Rogue One? The funny thing is, when it comes to Star Wars, there is a very particular visual language to the way the films are made – from the way they climb aboard the Millennium Falcon, to the wide shots of the Falcon going past the camera – and unless you’re studying it, you don’t really notice it. That occurred to me when we started Rogue One, when Gareth basically told me, “we’re not remaking Star Wars. We’ll make this movie the way we would want to make this movie.” But what was great about that is that we could channel Star Wars. Normally you try to hide your influences; you don’t wear them on your sleeve when you make a movie. You try to be a little more nuanced, a little more “clever” about fooling people as to what your influences are. “No, I didn’t actually watch Steven Spielberg films to make this ‘Spielbergian’ movie.” Those sorts of things. But what was great about Rogue One is that we were making a film that connected directly into Star Wars: Episode IV – A New Hope, by design. So if we wanted to reference anything from Episode IV, Episode V, or Episode VI, we could. We were actively encouraging ourselves to do it. For me that was a huge revelation, because normally, on any other film, you wouldn’t do that. For example, when we went back and watched Obi-Wan’s sequences aboard the Death Star, we would study how Sir Alec Guinness would move through the corridors, and it was very influential in the way that we did some of our movement through the Imperial security complex on Scarif. We leaned on the fact that it was such a big place, that the Imperials would be minding their own business and doing their own thing, and that you could have these Rebel spies actively infiltrate this heavily fortified complex.
Obi-Wan Kenobi uses a Jedi trick to distract a pair of TK Stormtroopers aboard the DS-1 Orbital Battle Station.
Were there a lot of conversations around trying to match the aesthetic of A New Hope? There were. Growing up, you got used to watching Star Wars on Betamax and VHS, in a home television format. For research for this film, I was able to watch a 4K scan of one of the earlier films, and the conversation turned to, “is that our North Star? Do we make it look exactly like that? Do we shoot it on film, with those same lenses?” Sometimes your memory of something can be slightly different from reality, so what we did for Rogue One is try to match it to the aesthetic of our “mind’s eye,” and what we remembered from Star Wars growing up. For us, thinking about that look – it wasn’t super sharp, but it had depth and clarity. It was soft at times, but not defunct. That is why we chose the format that we did, the ARRI ALEXA 65, paired with these old lenses. For Gareth and me, it felt like it was showing us the film that we remembered as kids.
Director Krennic is confronted by Darth Vader at Fortress Vader on Mustafar.
Did you find other advantages to shooting digital? Was there ever a conversation about shooting it on film? There were a number of factors. The look we were trying to achieve was one, but the other thing we had to balance against was the fact that Gareth Edwards is a very hands-on filmmaker. He loves to operate the camera. Watch his film Monsters, which, coincidentally, was the whole reason I wanted to meet Gareth in the first place. When I was called up to interview for Rogue One – and of course, I was so excited for the opportunity – I thought, “even if I don’t get this job, I will get to meet the guy that made Monsters. I’ll get to shake his hand, and tell him about the mad respect I have for him and his film.” So when he explained that he wanted to make Rogue One with the same spirit he used to make Monsters, I got really excited. That decision was also part of the reason we chose the ALEXA 65. It had all the film qualities of a much bigger camera, but in a bite-size package that you could throw around, and put in cockpits, without having to destroy too many things to get the shot you needed. It was a series of factors, but it all worked in our favor.
A shot from Gareth Edwards’ film, MONSTERS. Photo courtesy of Magnet.
Gareth has a unique style of shooting, where he’ll go from one take to the next without slating. How did your style integrate with that? I found it very exciting. In some ways, even though Gareth was my director, he was also my camera operator. I loved helping him build a world where he could achieve anything that he wanted to achieve; be that handheld shots, or very specific tracking shots. That’s what I loved about Rogue One, and how Gareth wanted to make it. There were considerations, of course, but there were moments of freedom – both in freedom of movement, and freedom of camera. It kept everyone on their toes. He would pick up these small moments, maybe something an actor was doing, and he would get the camera in there and capture it.
Gareth Edwards shoots a scene of Jyn Erso (Felicity Jones) on the set of Rogue One: A Star Wars Story.
Greig, your photography has such a distinct style. What influences did you pull from in designing the palette of Rogue One? I’m a big fan of world cinema, and I’m a big fan of ‘70s cinema. I love Andrei Tarkovsky. I think the way that he makes movies is so beautiful, and so strong. But I also love the way that Kathryn Bigelow shoots her films. I love The French Connection, and the way that it was shot. For Rogue One, we mined the depths of our interests, and the types of films that we loved to watch. Lawrence of Arabia was another influence. These massive, David Lean-style battles. These big frames, and tracking shots, and static shots. Then you combine that with modern-day filmmaking, which, if you look at the evolution of cameras, has changed drastically. Back in the 1950s and ‘60s, the cameras were much larger than they are today, and harder to move around. Therefore, films looked a certain way. When you get into the 1970s, when George Lucas was shooting Star Wars, there was not a lot of handheld in that film either. The cameras were not really malleable, and, stylistically, that wasn’t really what he was after anyway. What was good for us though is that we were able to combine our interests and influences. Gareth and I clearly love Star Wars, but that is not the only thing we’re influenced by. French cinema, documentaries, all of that played a part for us.
An image of Baze Malbus (Jiang Wen). Photo courtesy of Greig Fraser. All Rights Reserved.
Tell me about the early conversations around virtual production and LED walls on Rogue One, and how that got us to today with ILM’s StageCraft? This is where having amazing partners like Industrial Light & Magic and John Knoll was integral. What we were pitching was not a common thing. Emmanuel “Chivo” Lubezki had played around with something similar on Gravity, putting actors in an LED box, but we were talking about putting people into ships and big environments. It all stemmed from a lighting problem, and the problem goes like this: you’ve got somebody in an X-wing above a planet – we’ll use Earth as our stand-in for Scarif. You’ve got a sun source, you’ve got ambient light bounce from Earth, and then you have black space. When you’re in the atmosphere, you have all of this beautiful light coming from above, and below, and from your sun source. That type of scenario is really easy to light. But what happens when you’ve got no ambience above, some ambience below, and then a sun source? Now, imagine those lighting conditions, and pretend you’re in the cockpit of that X-wing, and you do a barrel roll. As you spin around, your face and the cockpit transition from light to shadow. To try and do that in a studio environment, with the lighting we have, is very difficult. You have to put diffusion on all sides to make it nice and soft, so when you sequence the lights over the top, you get the illusion of camera and lighting movement. But what happens when you push light through the diffusion? It bounces back from the other side. So I needed a black side and a light side, but, of course, that wouldn’t have worked for the barrel rolls, because the light would have needed to move. The one thing we had at the time that could account for all of this was LED screens. When the light turns off on an LED screen, it’s pitch black. It’s the perfect lighting tool for that type of thing.
That then progressed into the next question: “if we’re going to use that tool for that one instance, can it work for other scenarios? Like flying across Jedha, or soaring through the atmosphere of Scarif?” That’s where this tool, this LED volume, became immensely helpful. People like John Knoll, and the people at ILM, are extremely integral to getting the quality right for something like this. Good VFX can live or die by bad lighting. That’s why ILM’s StageCraft is such a powerful tool for DPs, because DPs know, if you can get the lighting right, you’re halfway to a good final image.
The partial hull of a T-65B X-wing starfighter used for shooting on the set of Rogue One: A Star Wars Story.
That must have been exciting to figure out? It was such a great project, because it really upheld the vision that George Lucas had for the future of filmmaking, the “stage of the future.” George theorized that, years down the road, there might come a time when a filmmaker could walk onto a stage and project whatever they wanted up onto the walls, or that those walls could have color-changeable light. They wouldn’t have to light for it; they’d only need to flick a switch. That was the hopeful future George was thinking about, and now, years later, ILM has made that a reality with StageCraft. Filmmakers now have the ability to put any high-fidelity, real-time image up on the LED volume. Rogue One was the proof of concept for lighting, and that evolved into what ILM, Jon Favreau, and the Lucasfilm team are doing on The Mandalorian, along with so many other exciting projects.
An early LED volume used on the set of Rogue One: A Star Wars Story.
George referenced a lot of things for his aerial combat, including old WWII gun camera footage. How did you approach the ships flying in Rogue One? While we were shooting, it became obvious where the camera could be, and where it couldn’t be. In Star Wars, there were never any mid-shots of people sitting in cockpits. You don’t have Han Solo in a mid-shot, shooting from outside of the cockpit. You never had a camera floating in space for a shot like that. The camera was always fixed inside the cockpit, or super-wide. There was no in-between. It would never go from a super-wide, into a mid-shot, into a closeup. The only example of that might be the final shot of the Millennium Falcon, just before Lando departs the Medical Frigate, at the end of The Empire Strikes Back. So we tried to maintain those parameters for Rogue One, and we didn’t want audiences to have to think about it. I haven’t spoken to George Lucas about it personally, and maybe if he had had infinite resources he might have shot it differently, but we wanted our film to match A New Hope, and we loved the look. It built our visual understanding of what a Star Wars film should be.
Jon Vander’s “Gold Squadron” forms up as they prepare for their assault on the Shield Gate during the Battle of Scarif.
There’s something intimate about it. When I think about old WWII air combat movies, they did the same thing. Exactly. And they were forced to shoot like that. You either had a camera in the cockpit, or a camera on another plane. You couldn’t get a plane in close enough to get a reaction from a pilot, or you’d have planes crashing into each other. It was either super-wide, or close. It was purely pragmatic.
Red Twelve (Richard Glover) participates in the Battle of Scarif.
You did have a unique shot I loved that was used a few times: the camera fixed on the X-wings and Y-wings, directly behind the astromech droid. Gareth was clever, because even though we had these rules on how we would shoot the ships, we would work off moments from the earlier films to devise new things. There’s that shot of R2-D2 getting blown up by Vader in the Death Star’s meridian trench in A New Hope, and this was kind of an evolution of that shot, while still keeping one foot planted in that A New Hope aesthetic.
A T-65C-A2 X-wing starfighter drops out of lightspeed at the Battle of Scarif.
How did it feel with The Force Awakens shooting alongside your film, and to a degree, The Last Jedi too, when you were shooting pickups? It was fun. We were all sharing buildings and in each other’s worlds. I’m such a big fan of Star Wars, and I could have walked on set and spoiled everything for myself, but I chose not to. I just wanted to enjoy them as a fan. I did have one thing spoiled for me… someone walked up and told me the scene regarding Han Solo, and my first reaction was, “how dare you do that to me! I wanted to see that in theaters!” [laughs]. We shared some crew from time to time, but we generally had blinders on for Rogue One. While they were making their films in the Skywalker Saga, decades in the future, we were leading right into A New Hope, so ours was almost the equivalent of a period film, in our language. I found that to be very exciting.
Greig Fraser on the set of Rogue One: A Star Wars Story.
What’s your favorite shot, moment, or sequence in the film? One of my favorites is that wide tracking shot of Jyn Erso (Felicity Jones) making her way through the Massassi outpost on Yavin 4 after she’s “rescued” from the Wobani Labor Camp. I also love the final sequence with Vader aboard the ‘Tantive IV’. When Gareth rang to tell me we were going to do that, I was ecstatic. It’s such a wonderful sequence. We had the time to prepare it properly. We had the time to rehearse all the action, and to do the lighting tests. We also spent a lot of time figuring out how best to light Vader. As a kid in a grown man’s body, I was blown away. Vader, this dark “shape,” terrified us as kids. It was a dream come true to add to his iconography. I felt very honored and very blessed. Another moment I loved was seeing the full-sized X-wing props in person for the first time. I was transported back to being a kid again, playing with my toy X-wings, but then, of course, my filmmaker brain would kick in, and let me tell you, moving full-sized X-wings around on a set is pretty difficult [laughs].
Vader ignites his lightsaber in an attempt to capture the stolen plans to the Death Star aboard Admiral Raddus’ star cruiser.
I love the sequence you shot in Iceland of Orson Krennic and the Death Troopers making the long trek up to the Erso homestead from the shuttle. His cape flapping in the wind, it was incredible. I love that shot too. An interesting thing about that sequence is how we found the location. In that part of Iceland, there’s all of this black sand, so they plant this weed to prevent it from blowing onto the roads and destroying the cars. It’s basically useless beyond keeping the sand from blowing about. We found that location on Google Earth while we were driving around, location scouting. I thought it looked so unusual and interesting. As soon as we dropped the moisture vaporators in, those weeds started looking like crops that the Ersos were farming, and it instantly became Star Wars.
Director Krennic and his personal attachment of Death Troopers storm the Erso homestead on Lah’mu.
Hal Hickel climbs aboard a T-65C-A2 X-wing starfighter on the set of Rogue One: A Star Wars Story.
Hal Hickel speaks with ILM.com about the process of bringing to life a reprogrammed KX-series security droid, along with his work on the pivotal space battle featured in Rogue One: A Star Wars Story.
Tell me how K-2SO came to be? It was a whole journey. When we started the project, even before Alan Tudyk was cast, Gareth was very keen to explore the idea of a droid with an expressive face. Strictly speaking, that would generally be outside of the style book of Star Wars. Droids in Star Wars typically are industrial, with very simplistic designs. Even ones that are anthropomorphic, like Threepio, don’t have moving face-parts. Even their eyes don’t move. That’s just Star Wars. But we thought, “hey, this is interesting. Let’s look into this.” We did some tests and things. Part of the problem though is that if you’re going to go down the road of an expressive metal droid face, you’re dealing with hard-surface pieces, not rubber skin. You really have to get quite detailed before you can get into something that can express emotion. You have to have, at minimum, eyebrows that can make expressions. A mouth that can do the same. There’s more than that, but those are the basic things. You also can’t just go in a little bit, you have to go quite a ways down the road. With that said, some of those movements can start to feel overly complex, and not very Star Wars.
Jyn Erso (Felicity Jones), and K-2SO (Alan Tudyk), during a scene together in Rogue One: A Star Wars Story.
When was Alan Tudyk brought in? He came on during that period when we were figuring out K-2’s facial movements. I took an audio clip of Alan from his web series, Con Man, and did a quick animation with just the minimum of some eye blinks and eye rotations. It looked good, and it became a talking point, so I traveled out to the UK where shooting was being prepped. Alan was there, and he was getting fitted with the K-2SO stilts by the Creature Shop. We had a production discussion about the animation, and we felt that the blinks pushed it a little too far into “cartoon animation” territory, where you had expressive elements that don’t have an otherwise mechanically practical reason for being there. You could argue that there would be little shutters to protect the eyes from dust, but the idea of them blinking from an expressive standpoint pushed it into a different realm. What we did love, though, was that the eyes could rotate. While it didn’t communicate emotion, it did communicate a cognitive function. The character is thinking, their eyes are kind of darting around in a quiet moment, and you can see the wheels turning. So we kept that. Again, it went from “let’s make him really expressive” to going the other way and saying, “well, let’s move him back towards a Star Wars aesthetic.” We did really want him to have a face part, though, that the protocol droids didn’t have. There was a droid, EV-9D9, who was Jabba the Hutt’s chief of cyborg operations at his palace. She also had an appearance in The Mandalorian Season One and Season Two as a bartender at Chalmun’s Spaceport Cantina in Mos Eisley on Tatooine. She had a little flap for a lower jaw that would move up and down. We tried that on K-2. He had a little block that was part of his design, and we tried animating that. It would basically click open when he was talking, and then click closed again when he was done.
But again, it didn’t make him expressive, and it didn’t add anything to the performance. We knew it was K-2 speaking, we’d recognize his voice, so we did away with that.
Cassian Andor (Diego Luna), Jyn Erso (Felicity Jones), and Alan Tudyk in his K-2SO mocap costume.
We ended up bringing Alan to Industrial Light & Magic, and we spent several hours with him on our motion capture stage. It was the first time he got to wear and walk around in the stilts. It was also where we did a real-time retarget to a simplified version of the asset. Alan could see himself on the screens, and it was a bit like an actor trying on a costume, or looking at themselves in a mirror and figuring out how to carry themselves. He was able to spend a lot of time figuring out how robotic to act versus how natural. It was super useful because he could stow that experience away. He wouldn’t be able to see that on set, or on location, but what he could do was build that experience into his performance, into who the character was for him, just by doing it for a few hours on our mocap stage. Two weeks later we were shooting in Jordan for his first scene, and he was able to tap directly into that.
Alan Tudyk, standing on stilts in his K-2SO mocap costume, chats with Cassian Andor (Diego Luna).
I also understand that you had two shots of K-2SO in the film that were rendered in real-time, is that correct? That’s correct, we did. We shot two scenes of K-2 from behind, because at the time, we had yet to find a solution for his luminous eyes, and the refractions, and those sorts of things, but with a little more time we could have sorted that out. Those two shots though were rendered in real-time. There was no demand from the film that we do that, but instead we did it because it was technology that we really wanted to push forward. With the convergence of games and offline traditional visual effects, we knew we wanted to keep pushing into that space, so we worked really hard to do that in a few shots.
Alan Tudyk sits in the practical cockpit of the UT-60D U-wing, ‘LMTR-20’, at Eadu during Operation Fracture.
The empty plate photography of the U-wing cockpit with the background comped in.
The final shot with K-2SO included, and atmospheric effects comped in.
Tell me more about your work on the Battle of Scarif? That was a great experience because I got a rare opportunity to contribute to the story. Working in visual effects, we are involved in projects from beginning to end, but we are mostly thought of from a post-production standpoint. John Knoll is an outlier on Rogue One because he conceived the story, but generally, we aren’t involved in the story aspect. Sometimes yes (it depends on the project), but quite often, no. What happened though is that John and I started to have these story meetings with Kiri Hart, Pablo Hidalgo, and Dave Filoni, to figure out what would be happening in this battle. “What are the stakes? Who’s doing what? What is Admiral Raddus up to? How is he communicating with the Rebels planet-side?” That was super fun for me, because as I said, not every project affords me the ability to contribute at the story level. Out of that, John had a concern about the specifics of the battle that were outlined in the opening crawl of Star Wars: Episode IV – A New Hope. It says “Rebel spaceships, striking from a hidden base, have won their first victory against the evil Galactic Empire.” We started to ask, “What was that victory? Is it just getting the plans, or was it a large-scale military victory as well?” We felt that we needed to make good on that line.
The Alliance High Command, led by Chancellor Mon Mothma (Genevieve O’Reilly), discuss the threat of the Empire.
The Alliance Fleet drops out of lightspeed above Scarif.
Baze Malbus (Jiang Wen) fires his HH-12 rocket launcher at an approaching All Terrain Armored Cargo Transport on Scarif.
General Antoc Merrick and Blue Squadron come to the aid of Rogue One at the Empire’s security complex on Scarif.
So how did you do it? Well, we felt there needed to be a moment in that battle where the Rebel Alliance lands a significant blow against Commander Cassio Tagge’s starfleet. There’s that great moment from the Death Star conference room scene in A New Hope, where Admiral Motti reminds Tagge that the Alliance is “Dangerous to your starfleet, Commander. Not to this battle station.” We wanted to honor that too. John came up with this great idea of colliding a pair of Star Destroyers into the Shield Gate and knocking the planetary defense shield out. So if you think about it, the Rebels took down two of the fleet’s prized Star Destroyers, destroyed their shield generator and space station, forced Tarkin to obliterate his own security complex on Scarif, and they also managed to steal “Stardust”, the technical readouts to the Death Star. We felt that that was a satisfying victory for the Alliance.
General Cassio Tagge (Don Henderson) warns Admiral Conan Antonio Motti during Star Wars: Episode IV – A New Hope.
How did you realize the sequence of the Star Destroyers, ‘Persecutor’ and ‘Intimidator’, colliding? I handed that off to a great Animator by the name of Euising Lee. He’s a terrific artist, and he’s especially gifted in his ability to realize spaceship action, and also camera work. He was able to take the idea and just run with it. He made a mini feature out of it; just a huge reel of all these shots. We showed it to Gareth, and kind of boiled it down into the shots that we wanted and made a shorter version of it. But he really took John’s idea and pushed it forward visually into a really terrific series of shots. That was a really fun process. The whole space battle was tough, because, as John previously mentioned, there were a lot of moving parts. “When are we with the action on Scarif? When are we with Raddus? When are we with the fleet in battle?” It was tough because you can’t kick out little iterations of shots. It’s laborious to reanimate an entire chunk of space battle to see how it plays. So we tried to repurpose old pre-vis that sort of fit the bill, and then we’d quickly do temporary versions of shots to fill holes and give editorial something to plug in, to figure out where we were going to be and when, and more importantly, what the goals were. The whole thing with Admiral Raddus waiting for the transmission from the squad on Scarif seems obvious, but there were versions of the space battle that didn’t play out like that, where instead he was directing the fleet and just trying not to get blown up. A lot of this gelled out of story meetings, and the work that Gareth Edwards was doing, and Tony Gilroy too. The battle took a while to come together, but the centerpiece, the great moment where it all goes silent as the ships are falling and they puncture the Shield Gate, turned out perfectly.
‘Persecutor’, an Imperial I-class Star Destroyer is pushed into ‘Intimidator’ by a Sphyrna-class Hammerhead corvette.
How did it feel to finally see that sequence? The thing is, sometimes late in post we might get to hear music that’s being developed, but more often than not we just hear the scratch sound design, and we don’t hear the music until it’s in cinemas. Let me tell you, sitting in the theater at the premiere, at that moment in the scene when it goes quiet, and Michael Giacchino’s score swells, and the Star Destroyers are plunging down into the Shield Gate, I just thought, “my god! What a moment.”
Hal Hickel holds a prop of the data-tape containing “Stardust”, the codename for the technical readouts to the DS-1.
John Knoll in the cockpit of a crashed Partisan X-wing fighter on Jedha from Rogue One: A Star Wars Story.
John Knoll, Executive Creative Director at ILM, and the Senior Visual Effects Supervisor on Rogue One: A Star Wars Story, sits down with ILM.com to discuss the film’s five-year anniversary.
John, the whole idea of Rogue One started with you. How long had you been thinking about this idea before it was greenlit? I started thinking about this all the way back on Star Wars: Episode III – Revenge of the Sith. I was on set when we were shooting in Sydney, and I think we were waiting for some set-up to happen. I started chatting to Rick McCallum, who was producing the film, and he mentioned that he and George Lucas were developing a Star Wars live-action TV series, and that they were working on scripts. I started thinking about all of the interesting tales you could tell in a show like that, and one of the first things that popped into my head was, “what about a Mission: Impossible-style operation to break into the most secure facility the Galactic Empire had, to steal the plans for the Death Star?” I started toying with that idea, along with a few others, and I approached Rick again to learn more about the time period they wanted to set the show in, and I realized that none of my ideas would apply to that period, so I shelved it.
John Knoll works on a miniature of Sheev Palpatine’s private viewing box in Star Wars: Episode III – Revenge of the Sith.
When did it pick back up again? Well, flash-forward to 2012, after Lucasfilm’s acquisition by The Walt Disney Company, when George selected Kathleen Kennedy to lead Lucasfilm, and we announced the continuation of the Skywalker Saga. What we also announced then, and what really intrigued me, were the spinoff films. The first one we announced internally was Solo, and I got so excited about where these spinoff films could go, because the possibilities were endless. As a bit of a joke, I started pitching an updated version of my story that went, “picture a SEAL Team Six in the Star Wars universe, and they’re going on this desperate, high-stakes mission to break into the most secure facility in the Galactic Empire to steal the plans for the Death Star. What about that?” People would go “oh… actually, that sounds pretty cool…” [Laughs].
Early concept art from Rogue One: A Star Wars Story.
You can’t not love that pitch! [Laughs] So I started having this conversation with a number of people, and every time I pitched it, I would start to add more detail, and it would get bigger and bigger. Kind of as a mental exercise, I asked myself, “well, if I were serious about this, who are the main characters? What are their arcs? What is the plot structure? How does this start, and how does this end?” I remember a specific moment where I was at this annual charity trivia game that we do, and I was on a team with a couple of friends, and we had about an hour over dinner while we were waiting for it to begin, and Kim Libreri said “tell me about this Star Wars story idea you have?” So I pitched him the half-hour version of it, with every detail, and at the end he goes, “you have got to go pitch this to Kathy.” At that point I realized I had to do this, because if I didn’t I would always wonder what could have been. So I called up Kathy and made an appointment, and I think it took maybe six weeks to find a time to meet with her and Kiri Hart from the Lucasfilm Story Group. I spent those six weeks writing up a really detailed treatment with all of the character descriptions. When the day came, I brought my treatment, sat down with Kathy and Kiri, and just dove into the pitch and the characters. They listened very politely to the whole thing, Kathy told me she was impressed with the story, and that was basically it. I didn’t hear anything for a few days, and at that point I was like, “well, I did it, at least now I don’t have to wonder.” A week later I got a call from Kiri, and she goes, “Kathy and I have been discussing your story a lot, and I think we want to proceed with this.” I was so elated, and one of the crazy things was that the first spinoff was supposed to be Solo, but Larry Kasdan got pulled into the development of The Force Awakens, and out of all of the spinoffs that they were tinkering with, Rogue One got slotted up to take its place in the queue. 
It was pretty surreal.
The first image released of the cast from Rogue One: A Star Wars Story.
Tell us about the one-off cockpit shots in the film. I understand they were made from foam core, and then the CGI was built up around it? That’s right. I got talking with our Director of Photography, Greig Fraser, and our Production Designer, Neil Lamont, about some of these one-off sets – like the cockpit for Leia’s CR90 Corvette, the ‘Tantive IV’, or the interior of Admiral Raddus’ MC75 Star Cruiser, ‘The Profundity’. We looked at the set budget that we had, and realized that it was tough to justify an extensive build like that for something that would only be on screen for a handful of shots. The cost of entry for a Star Wars movie is expensive, because you can’t shoot a single frame of film without having to build almost everything in front of the camera. With that said, we were under a lot of pressure to trim wherever we could, so those limited-use sets hit the chopping block pretty early on. When asked how we could save money, I suggested that we could likely do them as virtual sets, where we just build a fragment of it where the actors were going to be. Greig Fraser had the same concern I did though, and that was that these types of sets are really hard to light well; not to mention that standing bewildered on a blue screen makes it hard for both the actor and the Director of Photography. Greig and I came to the conclusion that we could use foam core – just enough to provide something to light, and something for the actors to get their bearings against. We felt it was a good way to go. Greig could get what he needed out of it, I could get what I needed out of it, and the actors could get what they needed out of it. That’s essentially how we did a couple of those sets, but the Art Department could not just make it out of foam core [laughs]. I told them it could be the sloppiest, slapdash thing, but they went ahead and added these nicely beveled corners, and it was all beautifully painted with lots of detail.
The foam core set standing in as the cockpit onboard the ‘Tantive IV’ from Rogue One: A Star Wars Story.
A digital recreation of the cockpit.
The final shot as seen just before the ‘Tantive IV’ jumps to hyperspace to escape Vader’s Star Destroyer, ‘Devastator’.
In conversations with Gareth Edwards early on, I heard that there were concerns that virtual sets might not look realistic enough. How did you convince him otherwise? I did some fairly elaborate set recreations with some nicely rendered walkthroughs to build up Gareth’s confidence in the technology. Starting off, he wanted to build sets for everything, and that’s all fine on a production, until, of course, you can no longer afford to continue building them. I felt like we had reached a point where we had built a number of really good virtual sets for other projects, so we shouldn’t be so afraid of it. The topic of, “how do you light the actors?” became a talking point between Greig Fraser and me. We both wondered what might prevent Gareth from embracing this, because he wouldn’t want an actor walking around on a blue stage – and I don’t want that either. Those conversations around lighting actors in virtual environments kind of led to what we did on the Blockade runner and Raddus’ ship. For some scenes that didn’t end up making their way into the film, I modeled the Death Star conference room, made famous in A New Hope, where Vader has his confrontation with Admiral Motti. The renders looked really good. I also modeled the corridor of the Blockade runner. Gareth felt strongly that we should do that one as a practical set, and I agreed, because it would be difficult to light the actors meaningfully because of all of the white surfaces. Resource-wise, the difference between building the foam core version of it, versus the practical set, was fairly insignificant, so it was hard to make the case that we should do it all virtually.
The foam core bridge of Admiral Raddus’ MC75 Star Cruiser, ’The Profundity’.
The final shot as seen during the Battle of Scarif.
Going back to the prequels, did you also build a physical set in the corridor shot of Bail Organa’s ship, the ‘Tantive III’, for Revenge of the Sith? We did, yeah. The Blockade runner corridors are pretty limited spaces with a lot of repeating patterns, so that shot in Revenge of the Sith, for example, wasn’t a budget-buster. But for Rogue One, as soon as you turn the corner and go into the cockpit, you have elaborate instrument panels with screens, and levers, and complex seats, and all of those sorts of things: that’s an expensive set, so it’s more cost-effective to do it digitally.
John Knoll stands in the corridor of the ‘Tantive III’ while working on Star Wars: Episode III – Revenge of the Sith.
Did you create any other test environments for Rogue One? I did make one final one, and that was a Death Star Docking Bay. I was really happy with the way the virtual set turned out, and the dynamic walkthrough. We added comms chatter, the sound of mouse droids, a bunch of small details to make the walkthrough immersive. It was a lot of fun to create. If we were to shoot Rogue One today, this type of environment, given its immense scale, would be the perfect candidate for Industrial Light & Magic’s StageCraft platform.
A shot of ILM’s StageCraft platform today, in use on the set of The Mandalorian Season Two.
Speaking of StageCraft, what technologies were you and the team exploring on Rogue One that acted as a proving ground for what Industrial Light & Magic is doing today? ILM’s LED volumes today certainly have their roots in what we were doing on Rogue One, and that actually came from a collaboration with Greig Fraser. About six months before principal photography, I had a really wonderful private session with him, where it was just him and me—no equipment or stages had been booked yet—and we sat down and I gave him my perspective on the top five obstacles that inevitably come up on tentpole movies.
Greig Fraser on the set of Rogue One: A Star Wars Story. All Rights Reserved.
I basically said, “sooner or later, someone in the production will want to shoot a daytime exterior scene on a soundstage. Here are the reasons why we need to push back on that.” I showed examples of it being done wrong, and examples of it being done right. I gave him a few other possible obstacles that we talked through, and one of those was scenes that take place in a moving vehicle. I knew that that was going to come up in Rogue One. Usually when we have a vehicle flying through a dynamic lighting environment, one of the commonly used gags is having grips put flags in front of the lights, which I find to be a little lackluster and unconvincing. It was at this point that Greig brought up, “well, what about using LED screens?” I actually had an experience with this a few years prior on Mission: Impossible – Ghost Protocol, where we did our car driving scenes by using the background plates we shot in Prague, which would then be comped in outside the windows. We put them on LED screens and had them playing, and that provided the lighting for the scene, and it looked really nice. There was lots of lighting complexity as the car drove by different environments; neon signs, for example. You would then see the light move across the actors’ faces.
The jump to hyperspace on the LED panels used on Rogue One: A Star Wars Story.
At this point, Greig and I conspired to scale this idea up and build a stage, which was like a proto version of our current LED volumes, and what we now know as StageCraft. It had the big cylindrical screen, the ceiling piece, and some ring panels to help surround it. The only difference was that we were not driving it with real-time content like we do on StageCraft. We did pre-recorded content that was animatic-level CG, but it was photographically accurate, and the ratios were correct. So anytime you were looking over someone’s shoulder, and you saw what was on the screen, we had to replace it in post – but what we got out of it was very nice lighting. A lot of things seem obvious in hindsight, but it wasn’t until I was standing on the stage, seeing the light bounce off the shiny helmets and the cockpits, that I realized how big of a deal this was.
An early look of what would eventually become Industrial Light & Magic’s StageCraft platform.
How did the actors react? It was immensely popular with them. Instead of standing lost in a sea of blue, with someone saying “the bad guys are over there where that white ‘x’ is,” it was now all representative, so they could see it. It was really fun, it got them into character, and we got better results. That experience then carried over: when Greig Fraser was helping plan the first season of The Mandalorian with Jon Favreau and the Lucasfilm team, he was able to bring it all with him. “Let’s do this super LED volume. Let’s do what we did on Rogue One and then take it to the next level.”
The cylindrical screen used on Rogue One: A Star Wars Story.
What was the genesis behind the Shield Gate situated above the Outer Rim planet of Scarif, and where did the idea come from to crash a pair of Star Destroyers into it? The Shield Gate at the Imperial security complex changed a number of times. In my original treatment it was an Imperial drydock and refitting facility for Star Destroyers, and the Rebellion would have mounted an audacious assault on the facility meant to act as a distraction. As the story developed, Gareth really felt that we needed the Shield Gate to prevent the Alliance Fleet from assisting the Rebels that were planet-side. As the rewrites were coming together, and the edit was taking shape, editorial was really focusing on the live-action elements they were balancing, while the space battle would be further refined in post. They had a lot of placeholders in the edit at this point: “the Rebels arrive, they can’t get through, the action escalates, they take out some Star Destroyers, and then they open a hole in the Shield Gate.” The clock was ticking, so they asked us to mock something up based on the broad beats, and I came up with the idea to have the Rebels call up one of the Hammerhead corvettes to push a pair of Star Destroyers into each other, disabling them, and destroying the Shield Gate in the process. I wanted it to be really unique, so I started looking at footage online of what happens when container ships wait too long to brake and hit the dock, causing millions of tons of mass to just start plowing and plowing. Or when ships scrape into each other; just this kinetic energy. So scaling that up to a ship that is supposed to be a mile long, we asked, “how would that work?” If they were to push into each other, you start one going and just pour a bunch of energy and mass into that momentum. I wanted it to be all about mechanical damage; not just fireballs.
Our Animation Supervisor, Hal Hickel, and his team, took that and ran with it, and what resulted was something that was really visually spectacular.
A layout shot of the Imperial I-class Star Destroyers crashing into each other.
An animation shot as the team at ILM plans out the collision.
A progression shot as the sequence is further refined.
The final shot in the film.
Is it true that the escape pods are visibly jettisoned during the shot of the Hammerhead corvette plummeting into the Shield Gate as it’s embedded in the Star Destroyer? [Laughs] That is true. They survived! In my head canon, the crew survived. We put lifeboats on the Hammerhead in a pretty prominent way, and at one point we did have a shot of the escape pods jettisoning, but it became a bit distracting from the point of the shot. The idea is that they got out before it hit the Shield Gate.
The Sphyrna-class Hammerhead corvette, ‘Lightmaker’, under the command of Kado Oquoné, disables the ‘Persecutor’.
The Hammerhead corvette originated in Star Wars Rebels, correct? It did, yeah. In fact, at one point, I went and met with Pablo Hidalgo from Lucasfilm’s Story Group, and basically asked him what types of ships might make up the Rebel Fleet at this point in history, so that we could start building them. If you think about it, there would be a lot of ships that would comprise the fleet at this point that you wouldn’t have seen in The Empire Strikes Back, or Return of the Jedi, for obvious reasons. A lot of them didn’t make it out. Pablo suggested the Hammerhead corvette, and I thought it looked great.
A layout shot of the Sphyrna-class Hammerhead corvette in Rogue One: A Star Wars Story.
A progression shot as the sequence is further refined.
The final shot in the film.
You also brought Hera Syndulla’s ship, the ‘Ghost’, over from Star Wars Rebels. What was the process like to bring a ship from animation into live-action? We got the geometry for both the Hammerhead corvette, and the ‘Ghost’ from the animation folks. Owing to the medium, they were built at a much simpler level of detail, with some aspects being a bit caricatured for the animation style. We slimmed the ‘Ghost’ down, and made “the movie” version of it, with a fairly extensive detail pass.
A render of Hera Syndulla’s modified VCX-100 light freighter, the ‘Ghost’.
A final image of the ‘Ghost’ parked in the upper left at the Massassi outpost on Yavin 4 in Rogue One: A Star Wars Story.
Tell me about the kitbashing revival you did for this film? We had a lot of model shop veterans at ILM. John Goodson and Paul Huston were really helpful. Basically, I pitched this idea of building a Star Wars parts library, where we could essentially scan all of these model parts into a digital collection. Then we started to ask, “well, what are the right pieces? What are the right kits to pull from to give us the best bang for our buck?” The next step was actually sourcing these model kits. We set aside a budget and just bought a bunch of them on eBay; a lot of old vintage stuff. The Big Bertha howitzer, the Flak Wagon artillery gun, and a bunch of others that were used on the original films. We then photographed all of the sprue trees, and John Goodson went through and circled all of the ones we needed. We then laser scanned them all, and a partner of ours, Virtuos, then built really nice, optimized versions of all those pieces. That then became the basis of our Star Wars kitbash library, which we have gone on to use throughout the rest of the Star Wars projects we’ve done. For The Last Jedi, Roger Guyett’s team expanded the library even further with more model kits scanned in.
Modelmakers at ILM’s original location in Van Nuys working on Star Wars: Episode IV – A New Hope.
Some of the vintage model kits acquired for Rogue One: A Star Wars Story.
Model sprue trees ready for scanning at ILM.
Scanned model pieces that make up ILM’s Star Wars kitbash library.
Tell me about how you captured the look of the original miniatures? A lot of it had to do with kind of making the model pieces look wonky, if that makes sense. If you’re putting a bunch of greebles in a row, don’t make them all perfectly straight. Turn one two degrees, and give them all a bit of jitter. Maybe break some of the edges so they’re a bit crooked and less perfect. The tendency is to think that a mile-long ship should be precision-made, but if you look at a real aircraft carrier up close, the hull has a little wobble in it, because it’s hard to make big stuff like that super precise. You can see it in the miniatures at the Lucasfilm Archives too, lots of wobble, and things where the greebles are misaligned. If you don’t have that imperfection in there, your eye will see it.
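The "wobble" Knoll describes is easy to approximate procedurally. As a purely illustrative sketch (this is not ILM's pipeline; the function name and data layout are invented here), a per-greeble jitter pass might look like:

```python
import random

def jitter_greebles(greebles, max_rot_deg=2.0, max_offset=0.01, seed=42):
    """Break up machine-perfect alignment: nudge each greeble's rotation
    and position by a small random amount, like a hand-glued model part."""
    rng = random.Random(seed)  # fixed seed so the "imperfection" is repeatable
    jittered = []
    for g in greebles:
        jittered.append({
            **g,
            "rot_deg": g.get("rot_deg", 0.0) + rng.uniform(-max_rot_deg, max_rot_deg),
            "x": g["x"] + rng.uniform(-max_offset, max_offset),
            "y": g["y"] + rng.uniform(-max_offset, max_offset),
        })
    return jittered

# A perfectly straight row of hull greebles, before jitter
row = [{"x": i * 0.1, "y": 0.0} for i in range(8)]
wonky = jitter_greebles(row)
```

The point is exactly Knoll's "turn one two degrees": a couple of degrees of rotation and a hint of positional drift is enough to keep the eye from reading the hull as impossibly precise.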
Jon Vander’s “Gold Squadron” forms up as they prepare for their assault on the Shield Gate during the Battle of Scarif.
An Imperial I-class Star Destroyer loiters over Jedha City, while Imperial forces strip the settlement of its Kyber crystals.
Were there conversations around trying to match the lighting of A New Hope on the miniatures? Oh yeah. In fact, there was a philosophical discussion around to what degree we would match how the miniatures were lit. For the original trilogy, the miniatures were lit with stage instruments that were maybe twenty feet away, and that implies a certain spread angle on the light, and the size of the penumbra on the shadows. So the question was, “do we want to light these ships like there is a correctly scaled up 10K light that’s three miles away from them, preserving that original look, or do we want to light them from a sun that is one hundred million miles away, so the rays are more parallel?” Though it was tempting at times to go the other way, we ended up pushing more into the realism; same goes for the planets. We did very realistic renderings of the planets, which gave it the look and feel of the photography you might see aboard the International Space Station in orbit.
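The spread-angle question Knoll raises can be made concrete with simple geometry: the softness of a shadow's penumbra scales with the angle the light source subtends at the subject. A back-of-the-envelope sketch (the lamp size is an illustrative assumption, not a production figure):

```python
import math

def angular_diameter_deg(source_diameter, distance):
    """Full angle a light source subtends as seen from the subject;
    a larger angle means a softer, wider shadow penumbra."""
    return math.degrees(2 * math.atan((source_diameter / 2) / distance))

# A ~0.5 m stage instrument twenty feet (~6.1 m) from a miniature...
lamp = angular_diameter_deg(0.5, 6.1)        # roughly 4.7 degrees
# ...versus the Sun (about 1.39 million km across, 149.6 million km away)
sun = angular_diameter_deg(1.39e6, 1.496e8)  # roughly 0.53 degrees
```

Under these assumed numbers the stage light subtends nearly an order of magnitude more angle than the Sun, which is why the original trilogy's miniature shadows read softer than true sunlight, and why matching that look versus going physically realistic was a genuine choice.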
A GR-75 medium transport is obliterated while a Braha’tok-class gunship, two Y-wings, and an X-wing peel away.
An Imperial I-class Star Destroyer emerges from the shadow of the Death Star’s Mk I Superlaser.
Tell me about how you brought Red Leader (Garven Dreis) and Gold Leader (Jon “Dutch” Vander) from A New Hope, played by Drewe Henley and Angus MacInnes, respectively, back into the film? If you think about it, we’ve brought back a number of characters, and we used every technique to do it. From A New Hope, Cornelius Evazan, Hurst Romodi, and Jan Dodonna were recast. Mon Mothma, from Return of the Jedi, was a recast (though Genevieve O’Reilly played Mon Mothma previously in a deleted scene from Revenge of the Sith). The creature shop brought Ponda Baba back from A New Hope. Anthony Daniels played C-3PO again. We brought R2-D2 back. Chopper was brought from animation into live-action. Jimmy Smits returned to play Bail Organa from the prequels, and James Earl Jones voiced Vader again. We of course had Grand Moff Tarkin and Princess Leia, which were full computer graphics. Incorporating the unused footage of Red Leader and Gold Leader, though, was that final technique. Very early on, we discussed how this film was going to meet right up with A New Hope. We thought about how a lot of these people in the Battle of Scarif would have participated in the Battle of Yavin too, so we should see some familiar faces. It was hard to avoid Red Leader and Gold Leader, so we decided to look through all of the dailies from A New Hope to see if we could find anything. It was super fun to do, but it was harder than it sounds, because the lighting style was pretty different back then. The footage was really grainy and had faded, so trying to take a shot from forty years earlier and drop it into the film without it looking “off” was a challenge. We de-grained it, and did loads of rotomasks. The orange color of the flight suits had to be boosted.
Red Leader (Garven Dreis), played by Drewe Henley, leads Red Squadron during the Battle of Scarif.
Gold Leader (Jon “Dutch” Vander), played by Angus MacInnes, leads his squadron of BTL-A4 Y-wing assault starfighters.
[Laughs] One thing I discovered during this process, was that the digital X-wing cockpits we created were true and faithful to what the exterior looked like; I never noticed it before, but the cockpits that George had created had some huge cheats in their interior dimensions. The back window, for example, was made very tall so that George could get the shots looking over Luke at R2-D2. Ours didn’t match that at all, and it would have been jarring to jump between the digital cockpits and the archival cockpits. We had to rotoscope around Red Leader, and then insert him into the digital cockpit.
The archival footage of Red Leader.
The digital T-65B X-wing cockpit from Rogue One.
The final image from the film.
Gareth Edwards sits atop the practical hull of a T-65B X-wing starfighter shooting interior cockpit shots.
We actually did it in reverse for Gold Nine, piloted by the character of Wona Goban, and played by Gabby Wong. She was shot in an X-wing, but we really wanted the Y-wings to be firing the ion torpedoes. Instead of creating a Y-wing cockpit from scratch, we simply rotoscoped her into the archival footage of the Y-wing cockpit that we had.
Gold Nine (Wona Goban), played by Gabby Wong, in an archival BTL-A4 Y-wing cockpit from A New Hope.
Red Five made an appearance too. He did. There had to be a reason Luke was assigned Red Five in A New Hope [laughs].
Cadet Pedrin Gaul, played by David Forman, piloting his T-65B X-wing starfighter under the call sign Red Five.
Was there a discussion about including Wedge Antilles at the Battle of Scarif? We discussed whether or not he should be in the battle, yes. But Wedge has that great line in A New Hope, “Look at the size of that thing!”, so it’s implied that he’s never seen the Death Star in person. We wanted to preserve that. Matthew Wood and Skywalker Sound actually brought in David Ankrum, who overdubbed the voice of Wedge Antilles (played by both Denis Lawson and Colin Higgins in A New Hope), to voice Wedge again. If you listen to the comms chatter when the fleet is being scrambled, that’s Wedge telling the flight personnel to report and redirect to Scarif.
Wedge Antilles and “Fake Wedge” (Col Takbright in the canon universe), were both overdubbed by David Ankrum.
Tell me about Grand Moff Tarkin and Princess Leia. Were they always intended to be a part of this specific story? Tarkin was in my first treatment, and that conversation happened really early. I also really wanted to end the film with Leia. As the script was progressing, Kiri Hart told me that they wanted to prominently feature Tarkin, and asked how I felt about doing it. For Mon Mothma, the recast was perfect, because everyone remembers her lines from Return of the Jedi, but they don’t have a super clear picture of her in their head. They remember her robe, and the medallion she’s wearing, and her red hair, but if you recast and match those things, someone who looks a lot like her could fit into the role. Tarkin and Leia are so iconic that you couldn’t do that with them. If they’re going to show up, you have to really match their likeness, and the only way to do that is with computer graphics.
John Knoll reviews dailies of the fearsome Wilhuff Tarkin in one of ILM’s View Stations.
Step inside the film — with Sprite, an Eternal, as your guide. Go on an epic Augmented Reality adventure through time and space to discover the truth about humanity. Enter the world, learn the backstory, and meet the characters, in Marvel Studios’ first Immersive Story Experience. This mini prequel lets you explore the story like never before and become a part of the action.
Marvel Studios, Industrial Light & Magic, and the Technology Innovation Group at Disney Studios Content have teamed up to bring the Eternals to your living room through an exciting augmented reality experience.
Industrial Light & Magic is thrilled to announce an exciting partnership with Disney’s Technology Innovation Group on Marvel Studios’ Eternals: AR Story Experience for the iPhone® and iPad®. In this augmented-reality app, the characters, world, and stories of the Eternals film have been brought to life like never before.
“I was so excited to get the call to come work on the Eternals: AR Story Experience,” said Danielle Legovich, Visual Effects Producer at ILM’s London studio. “What I found wonderful throughout the process was the incredible collaboration with the team at Disney, along with all of the content creators on the project. They were all such lovely people, and we were able to combine talent in such a profound way. With Disney coming from that background of games and apps, and ILM coming from the visual effects point of view—and having worked on a large portion of the Eternals film—it made for a really wonderful partnership. At ILM, we’re used to creating images that people view in a darkened cinema, so to be able to work on images that people would then bring into their home through AR was so much fun. The experience became so immersive for me personally during the creative process, that I would imagine this Deviant just exploding up from my kitchen floor. I really loved that.”
Film-quality VFX assets used in the app. Image courtesy of Disney Studios Content / Marvel Studios.
As you might expect, what makes this experience so unique is the augmented reality aspect. You’re able to literally step into that world and meet these characters without having to leave your home. That level of detail brought a host of exciting challenges. Edmund Kolloen, Computer Graphics Supervisor on the project, explains, “What was thrilling for me was trying to get the same quality that we would push out of a final render, and get that to look and act the same in a real-time application. The challenge, of course, was getting that data from our render package and pushing it to the pipeline at Disney. There was a lot of really great cross-collaboration on both sides, building it as we went along. The results were amazing. You can walk right up and have a look at these characters, because they’re the exact same digi-doubles from the film. We worked diligently to ensure that every facial expression and every emotion comes through.”
Recording actress Lia McHugh using over 100 cameras. Image courtesy of Disney Studios Content / Marvel Studios.
On the Disney front, the Technology Innovation Group developed a host of new cutting-edge tools and techniques during the iterative process of translating that data from ILM. Evan Goldberg, Manager, Technology Innovation Research at Disney Studios Content recounts, “We had a small but mighty team here at Disney to put this project together. Daniel Baker was the Producer, and was the beating heart and metronome of the project. I’ve been here at Disney for sixteen years, with a history of feature film production, animation, and VFX experience. My role on this project straddled the line between Tech Supervisor and VFX Supervisor. Both my team and Industrial Light & Magic really wanted to come from a place of authenticity for the experience, and to be as faithful as we could to the source material. When you see a still from the AR experience, it should feel like a still from the film, and we were able to do that by working directly with ILM. They were very open to adapting their pipeline to conform to what we needed on our end. That allowed us to collaborate more quickly, and make something that had a visual fidelity on par with the film, but rendered in a fraction of a second. It was so incredible to see new technologies born out of that process.”
Pre-visualization of in-app scene. Image courtesy of Disney Studios Content / Marvel Studios.
With all of the innovation, the teams still had a daunting undertaking. They had to create a cinema-quality AR experience, and one that would carry the Marvel Studios name on top of it. “We knew that we had to match the quality of what people see in the cinema,” explains Daniel Baker, Senior Producer and Manager, Technology Innovation at Disney Studios Content. “The iterative design process with ILM was so helpful, because it ensured that we were always working with the latest assets. The pre-visualization work, along with that review and iteration process, was really exciting. Since we were working from home, and across multiple time zones, we had to really make the most with the time we had. So to get everyone to go out into their backyard to play with the experience, and really give it that high level of scrutiny and pixel-by-pixel accuracy, was a lot of fun.”
Kolloen summarizes the overall Eternals: AR Story Experience perfectly: “One of the exciting things for me was to see the Deviants in that augmented reality environment. Nothing prepares you for the moment you walk outside with your iPad® and see this creature that’s the size of your house.”
Visual effects and virtual production pioneer Industrial Light & Magic today announced plans to expand its StageCraft LED virtual production services to Vancouver, British Columbia. The stage, at over 20,000 square feet, is expected to house one of the largest StageCraft LED volumes in North America. This will be ILM’s fifth permanent volume, supporting the growing demand for its state-of-the-art StageCraft services and facilities around the globe. In addition to Vancouver, ILM runs three stages in Los Angeles as well as one at Pinewood Studios in London.
“Vancouver is a fantastic production hub where we’ve received a lot of interest from our clients who want a StageCraft volume in British Columbia. It was a natural next step for our expansion,” noted Janet Lewin, SVP and General Manager of ILM, adding, “Our existing Vancouver visual effects studio is perfectly positioned to support virtual production and the new stage.”
A behind-the-scenes image of the ILM StageCraft LED volume from season 2 of The Mandalorian.
“Our Canadian artists have contributed to digital set construction as well as Virtual Art Department work on a number of StageCraft projects so it’s really exciting that we will soon have a volume in our own backyard,” said Spencer Kent, Executive in Charge of ILM’s Vancouver studio. “I’m also proud that we are actively recruiting with an eye towards hiring people from underrepresented communities who we can train in the field of virtual production, strengthening our commitment to diversifying our industry. This effort will be bolstered when we launch our upcoming Jedi Training Academy in January.”
ILM won two Emmy Awards for their innovative real-time Visual Effects work for the first two seasons of Lucasfilm’s hit Disney+ series, The Mandalorian. They are currently in production on three episodic series and a feature film on their existing StageCraft volumes, in addition to daily bookings for smaller productions including commercials, music videos, and product marketing pieces.
ILM also builds bespoke StageCraft volumes for projects with unique requirements such as George Clooney’s The Midnight Sky and Taika Waititi’s upcoming Thor: Love and Thunder, which were built on stages in London and Sydney respectively. In addition, Disney Television Studios and ILM announced recently the opening of an additional StageCraft LED volume on the Disney lot in Burbank, built for episodic television production.
“We have seen explosive demand for StageCraft over the last few years since unveiling our real-time virtual production services in 2018. Our creative and technical teams have led the development of the most robust virtual production tools and techniques in the industry,” explains Rob Bredow, SVP and Chief Creative Officer of ILM. “Productions can leverage our proprietary Helios cinema engine or Unreal Engine within our StageCraft toolset, and have the flexibility to create photoreal content in real time for a wide range of shows.”
Carrie Underwood on the ILM StageCraft Volume at Manhattan Beach Studios during the filming of NBC Sports 2021/22 Sunday Night Football opening.
ILM intends for the Vancouver StageCraft volume to be operational in the spring of 2022. The company is actively recruiting for numerous virtual production roles including talent specializing in Virtual Art Department, digital environments, technical operations, and other real-time visual effects positions leveraging the StageCraft toolset.
In a new “Behind the Magic” video released on YouTube and Instagram, enjoy a glimpse behind the virtual production of NBC Sports’ Sunday Night Football show opening, featuring country music star Carrie Underwood.
“This was yet another successful demonstration of the end-to-end services available through ILM’s virtual production platform, ‘StageCraft™’,” says Chris Bannister, Executive Producer of Virtual Production at Industrial Light & Magic. “Partnering with the creative team from art concept all the way through principal photography, we were able to offer both the creative resources and real-world virtual production experience that maximized the scope and results for the project in a way that only ILM StageCraft can deliver.”
Shot on the StageCraft LED volume by Industrial Light & Magic.
The show opening is the key introduction each week to NBC’s flagship sports broadcast, and each year the creative team at NBC looks for innovative ways to top itself. 2021 was no exception, as they ideated ways to push the boundaries beyond the green screen and inject a new layer of authenticity and integration into the opener. That’s where ILM StageCraft came in.
“It was particularly important to Tripp Dixon and his creative team at NBC Sports to celebrate NFL fans coming back together,” notes Jonathan Howard, Associate Virtual Production Manager at ILM. “This unique opportunity allowed ILM to showcase both the agility and production-hardened scalability of StageCraft 2.0, evident in the team’s ability to adapt the platform to the compressed schedule of a broadcast package.”
Shot on the StageCraft LED volume by Industrial Light & Magic.
Across the entire production, ILM was able to find unique ways to match the energy and excitement that Sunday Night Football fans are used to, while also expanding upon it in distinct ways. “This was such a rare creative project for me, because I’m typically working with creatures, droids, and spaceships,” said Hal Hickel, Animation Supervisor at ILM. “It was fun in that way though, because it got me out of my wheelhouse, while also allowing me to craft some exciting elements in a grounded production.”
What makes StageCraft’s application for the Sunday Night Football show opening different from previous uses of the technology is that this project was designed to look both indistinguishable from the real world and fantastical in its execution. Hayden Landis, Visual Effects Supervisor at ILM, explains, “We had some incredible streaming elements like fireworks, along with dynamic moving components that we’ve never used before on the volume. Between the creative and technical wizardry that the StageCraft crew conjured up on the day, and the passionate support of the NBC Sports team, I think we really created something special.”
Shot on the StageCraft LED volume by Industrial Light & Magic.
Even with all the magic happening on screen, it can be easy for viewers to miss StageCraft’s sleight of hand because it is so convincing. Hal Hickel elaborates, “To let them backstage in a creative way, we came up with the idea to have Carrie enter the studio in one take, walk up onto the set, and then have the entire StageCraft Volume power-up around her. That small addition really drove home the magic of StageCraft.”
Check out the “Behind the Magic” video below, and don’t miss the show opening this Sunday, October 17 at 5:20pm PST on NBC as the Seattle Seahawks face off against the Pittsburgh Steelers.
Behind the Magic – Sunday Night Football
In a new video released by ILM on our YouTube channel, join Visual Effects Supervisor, Richard Bluff, as he shares a peek behind the curtain of the effects of The Mandalorian: Season 2, winner of 7 Emmy® Awards including Special Visual Effects, Sound Mixing, Cinematography, Prosthetic Makeup, Stunt Coordination, Stunt Performance, and Music Composition.
For its sophomore outing, Lucasfilm’s hit Disney+ series built upon the groundbreaking technical and artistic achievements of season one, combining traditional methodologies with ever-advancing new technologies. The team also increased the physical size of the ILM StageCraft™ LED Volume, which would again be used for over half of all scenes. This season also marked the debut of ILM’s state-of-the-art real-time cinema render engine, Helios. The high-resolution, high-fidelity engine was used for all final-pixel rendering displayed on the LED screens and offers unmatched performance for the types of complex scenes prevalent in today’s episodic and feature film production.
Practical creature effects have been a vital part of the aesthetic and charm of the Star Wars universe since 1977, and for season two the effects team realized over 100 puppeteered creatures, droids, and animatronic masks, including the beloved Tatooine Bantha, built as a ten-foot-high rideable puppet.
Practical miniatures and motion control photography were used once again for scale model ships, as well as miniature set extensions built for use in ILM’s StageCraft LED volume. Stop-motion animation was also utilized for the Scrap Walker at the Karthon Chop Fields. The greater Krayt dragon on Tatooine was realized as a six-hundred-foot computer-generated creature that would swim shark-like through the sand environment by way of a liquefaction effect, wherein the sand would behave like water.
We would like to acknowledge the care and dedication that the team here at ILM put into the show, along with our partners at Legacy Effects, Hybride, Image Engine, Important Looking Pirates, Ghost VFX, Lola, Stereo D, Tippett Studios, Base FX, Raynault, Virtuous, and Yannix.
We hope you enjoy this look inside The Mandalorian: Season 2.
The Jedi Academy is a unified, global, 12-week paid internship and trainee program for junior talent at Lucasfilm, Industrial Light & Magic, and ILMxLAB, created for students and graduates. The program is a once-in-a-lifetime experience to learn in a dynamic and creative production environment, focused on developing the next generation of diverse talent across art, public relations, and technology.
“After playing Vader Immortal, I knew that I wanted to help make those kinds of games and tell those kinds of stories,” said Gary Walker, intern at ILMxLAB. “So if you want to do something, go for it. Ask how you can get there, because there are people willing to help you if you’re willing to go out and you’re willing to do it.”
Jedi Academy interns are able to gain valuable, real-world experience through hands-on training and mentorship across day-to-day production work. Trainees also gain valuable skills through intensive classes and immersive learning modules taught by industry experts from a variety of disciplines. The trainees are exposed to fundamental artistic concepts as well as key business skills that support their transition into the industry.
“Coming into this I was very interested in a lot of things; VR, animation, video production, film production,” said Jared Tan, Video Production Intern at Lucasfilm. “And now coming out of the internship, I know what skills I need to polish so hopefully one day I can come back here to work and help this ecosystem of filmmakers and creative people at this amazing company.”
Lucasfilm is committed to improving the diversity of our studios, and programs like our Jedi Academy help us provide opportunities to a broad range of applicants at the start of their careers. The experience is perhaps best described by Alexandria Frank, Studio Talent Group Intern at Lucasfilm, “Just the sheer intention and passion that comes with everyone working here, it radiates through everything.”
The most recent Jedi Academy interns for ILM focused on virtual production and the company’s StageCraft technology, an ever-growing part of the company’s business. The company is preparing to launch another Jedi Academy focused on the San Francisco and Vancouver studios soon.
Would you like to Join the Force? Keep an eye on our Careers page for when we announce our next Jedi Academy.
Industrial Light & Magic (ILM) continues to push the boundaries of what’s possible in virtual production with StageCraft. In a new video released by ILM on our YouTube channel, we shed light on the process and the advances made for the second season of The Mandalorian, with interviews with a variety of the key filmmakers involved. “[StageCraft] gives us the opportunity to bring that production scale, size, scope; that kind of expansive storytelling that we have never been able to shoot before,” said Kathleen Kennedy, President of Lucasfilm Ltd.
Rob Bredow added, “It really is an end-to-end solution; everything from the early stages of previz and working with the art department, all the way through motion capture, and now, working with live pixels on set. What you see on that LED wall is actually what goes straight into the show.” This toolkit also enables filmmakers like Jon Favreau, and Dave Filoni, to realize their vision faster and more efficiently. “We could actually get in-camera, finished visual effects that would really help us with the quick turnaround that television requires,” said Favreau.
Now with StageCraft version 2.0, advancements have been made across the platform, including the addition of a suite of tools specific to filmmaking and ILM’s groundbreaking real-time cinema render engine, Helios. Filmmakers are now able to capture high-fidelity visual effects that are rendered in real-time, and indistinguishable from the physical environment. This provides a deeply immersive experience for both filmmakers and actors alike, enabling them to harness seamless interactive light on the physical environment, and thousands of in-camera visual effects finals.
“The DPs, the directors, and the actors can now see the world they’re in, and interact with that world the way they would if they were out in a real location,” said Richard Bluff, Visual Effects Supervisor on The Mandalorian, adding that filmmakers can leverage traditional techniques such as “focus pulling, pushing a real camera, getting the interaction from the lighting from the LEDs onto the actors.”
This incredible immersion also extends to how these environments are scouted and blocked. Bryce Dallas Howard, Director of The Mandalorian, explains, “Locations are created, because they’re digital assets, essentially, so the way that you scout is you do it in VR; basically blocking out your scene in this virtual world.” Taika Waititi, voice actor for IG-11 and Director of The Mandalorian, adds with a laugh, “It’s just a fresh new way of looking at stuff, and the way that you can decide on the landscape as well. It’s a great throwback to a time when people would make decisions before post.”
For more information about StageCraft or to discuss your project, contact us at: contact-StageCraft@ilm.com
Interested in joining our Virtual Production team? Visit our Careers page to see the current opportunities.
ILM StageCraft 2.0 and Helios combine to bring unprecedented fidelity, power, and flexibility to the filmmakers on the second season of Lucasfilm’s Emmy Award-winning hit Disney+ series.
Industrial Light & Magic today announced the next phase of its global expansion plan for the company’s virtual production and StageCraft LED volume services. This expansion of services is tied to a proactive initiative for increasing diversity in the industry by combining ILM’s growth in this innovative methodology with a global trainee program geared for underrepresented VFX talent.
ILM’s existing StageCraft volume at Manhattan Beach Studios (MBS) was used for the Emmy-nominated series The Mandalorian and will soon be joined by a second permanent StageCraft volume at the studio, servicing a variety of clients in the greater Los Angeles area. In addition, ILM is building a third permanent StageCraft volume at Pinewood Studios in London, and a fourth large-scale custom volume at Fox Studios Australia to be used for Marvel’s highly anticipated feature Thor: Love and Thunder, directed by Taika Waititi. ILM will also continue to provide “pop-up” custom volumes for clients, as the company recently did for the Netflix production The Midnight Sky, directed by George Clooney.
An end-to-end virtual production solution, ILM StageCraft is a production-hardened technology that provides a continuous pipeline from initial exploration, scouting, and art direction, through traditional and technical previsualization and lighting, to real-time production filming itself on the innovative StageCraft LED volumes. Lucasfilm’s hit Disney+ series The Mandalorian and a highly anticipated feature film took advantage of the full complement of ILM StageCraft virtual production services. Other projects, such as Avengers: Endgame, Aquaman, Jurassic World: Fallen Kingdom, Battle at Big Rock, Rogue One: A Star Wars Story, Kong: Skull Island, Solo: A Star Wars Story, Ready Player One, and Rango, have utilized aspects of the toolset as well.
By every measure, the new stages are vast improvements over the original groundbreaking LED volume developed for the first season of The Mandalorian in 2018. Physically, the new stages are larger, utilizing substantially more LED panels than ILM’s original stage while offering both higher resolution and smooth wall-to-ceiling transitions; this directly results in better lighting on set as well as many more in-camera finals. ILM’s proprietary solutions for achieving groundbreaking fidelity on the LED walls at scale allow for higher color fidelity, higher scene complexity, and greater control and reliability.
“With StageCraft, we have built an end-to-end virtual production service for key creatives. Directors, Production Designers, Cinematographers, Producers, and Visual Effects Supervisors can creatively collaborate, each bringing their collective expertise to the virtual aspects of production just as they do with traditional production,” explained Janet Lewin, SVP, GM ILM. Rob Bredow, CCO, ILM, added, “Over the past 5 years, we have made substantial investments in both our rendering technology and our virtual production toolset. When combined with Industrial Light & Magic’s expert visual effects talent, motion capture experience, facial capture via Medusa, Anyma, and Flux, and the innovative production technology developed by ILM’s newly integrated Technoprops team, we believe we have a unique offering for the industry.”
Alongside the new stages, ILM is rolling out a global talent development initiative through the company’s long-standing Jedi Academy training program. The program, which is part of the company’s larger Global Diversity & Inclusion efforts, offers paid internships and apprenticeships on productions with seasoned ILM Supervisors and Producers who serve as mentors. The program is intended to fill roles across the virtual production and VFX pipeline with those from traditionally underrepresented backgrounds; ILM has posted expressions of interest for jobs across the spectrum, from virtual art department teams and production management to engineering and artist roles. The goal of this initiative is to attract diverse junior talent and create a pipeline for them to become future Visual Effects artists, technicians, and producers who will be “ILM trained” and uniquely qualified to work in this new, innovative way of filmmaking.
“There is a widespread lack of diversity in the industry, and we are excited to leverage our global expansion in this game-changing workflow to hire and train new talent, providing viable, exciting, and rewarding jobs across many of our locations,” noted ILM VP, Operations, Jessica Teach, who oversees the company’s Diversity and Inclusion initiatives. “We believe this program can have a multiplier effect, attracting even more diverse talent to the industry and creating a pipeline for visual effects careers. We know that bringing more diversity into the industry is a critical part of strengthening and expanding our storytelling potential.”
ILM expects to have the new stages up and running for production in London in February of 2021 and in Los Angeles in March, with a mix of projects from features to commercials in line to take advantage of them. The company is currently fielding inquiries for future bookings by studios and filmmakers. For more information or to express interest in the Jedi Academy program visit our careers site.
OpenEXR, a widely-adopted HDR image file format, and OpenCue, a recently launched render manager, join the growing roster of Academy Software Foundation projects.
We’re thrilled to announce that the Academy Software Foundation (ASWF), a neutral forum for open source software development in the motion picture and media industries, today announced that OpenEXR and OpenCue have been accepted by the Technical Advisory Committee (TAC) as Academy Software Foundation projects alongside OpenVDB and OpenColorIO.
Initially developed by ILM, OpenEXR is an Academy Scientific and Technical Award-winning high-dynamic-range (HDR) image file format for use in computer imaging applications. It is a widely adopted standard in computer graphics for linear and interactive media.
OpenCue is a fully featured, open source render manager for media and entertainment that can be used to break down complex jobs into individual tasks. Developed in collaboration by Google Cloud and Sony Pictures Imageworks, OpenCue is an evolution of Sony’s internal queuing system, Cue 3.
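The core of that job breakdown is easy to picture. The toy Python sketch below is a generic illustration of how a render manager like OpenCue chunks a frame range into independently dispatchable tasks; it is not OpenCue's actual API, and all names here are hypothetical:

```python
def split_job(job_name, frame_start, frame_end, chunk=10):
    """Break a frame range into dispatchable task chunks.

    Each chunk can be sent to a different render host, which is the
    basic trick a render manager uses to parallelize a job.
    """
    tasks = []
    frame = frame_start
    while frame <= frame_end:
        last = min(frame + chunk - 1, frame_end)
        tasks.append({"job": job_name, "frames": (frame, last)})
        frame = last + 1
    return tasks

# A 96-frame shot split into 24-frame chunks yields 4 tasks.
tasks = split_job("shot010_comp", 1001, 1096, chunk=24)
print(len(tasks))          # 4
print(tasks[0]["frames"])  # (1001, 1024)
```

A real manager layers scheduling, retries, and inter-task dependencies on top of this breakdown, but the divide-and-dispatch idea is the same.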
“This announcement marks a new phase for the Academy Software Foundation. We’ve achieved our initial goal of accepting OpenVDB, OpenColorIO, and OpenEXR (projects which greatly influenced the Foundation’s formation), and we are now ready to support and drive collaboration around newer projects like OpenCue,” said David Morin, Executive Director of the Academy Software Foundation. “Studios and developers are finding value in having a neutral home for the open source projects that our industry relies on, and we look forward to growing our projects and continuing to find new ways to support the broader open source community.”
OpenEXR and OpenCue join OpenVDB and OpenColorIO as projects in the incubation stage at the Academy Software Foundation. All newly accepted projects start in incubation while they work to meet the high standards of the Academy Software Foundation and later graduate to full adoption. This allows the Academy Software Foundation to consider and support projects at different levels of maturity and industry adoption, as long as they align with the Foundation’s mission to increase the quality and quantity of contributions to the content creation industry’s open source software base.
Cary Phillips, Lucasfilm Research & Development Supervisor and Academy Science and Technology Council member noted, “The Academy Software Foundation was created with OpenEXR in mind, recognizing that there’s a natural life cycle to software projects: original architects and developers move between companies, expertise spreads throughout the industry, and the entire VFX technology ecosystem rapidly evolves. The ASWF has brought together virtually every major company in the industry, and it provides a vital forum to discuss sensible, practical solutions that should ensure that OpenEXR continues to serve the industry as a stable and reliable standard.”
OpenEXR
One of the foundational technologies in computer imaging, OpenEXR is a standard HDR image file format for high-quality image processing and storage. It features higher dynamic range and color precision than existing 8- and 10-bit image file formats, and the latest version of OpenEXR supports multiple image compression algorithms, stereoscopic workflows, multi-part files, and deep data.
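Much of that extra range comes from OpenEXR's 16-bit "half" float pixel type. As a rough illustration (using only Python's standard struct module, not the OpenEXR library itself), a half-precision value above 1.0, such as a bright highlight, survives a round trip that an 8-bit format would clip:

```python
import struct

# An HDR pixel value brighter than "white" (1.0), e.g. a specular highlight.
hdr_value = 3.5

# 8-bit storage clips anything above 1.0 before quantizing to 0-255.
clipped = min(max(hdr_value, 0.0), 1.0)
eight_bit = round(clipped * 255)     # stored byte: 255
restored_8bit = eight_bit / 255      # 1.0 -- the over-range detail is gone

# Half-precision float (struct's 'e' format) keeps the over-range value
# in just 2 bytes, like an EXR half pixel.
packed = struct.pack('<e', hdr_value)
restored_half = struct.unpack('<e', packed)[0]

print(restored_8bit)   # 1.0
print(restored_half)   # 3.5
```

This headroom is what lets compositors re-expose, grade, and combine rendered elements without baked-in clipping.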
“For us, the single most important thing we create are the images that we put on screen, and we’ve all come to trust the OpenEXR format with our most precious data. ILM’s decision over 15 years ago to make EXR available as an open source project for the filmmaking community arguably set in motion an industry-wide trend that fostered collaboration and shared advancement, eventually culminating in the creation of the Academy Software Foundation. We’re proud to contribute OpenEXR to a new home to ensure it remains a robust and stable project for years to come,” said Francois Chardavoine, Head of Production Technology, Industrial Light & Magic.
OpenEXR was developed in 1999 by ILM in response to the demand for higher color fidelity in the visual effects industry. It was released to the public as an open source library in 2003, and it has since been widely-used and maintained through code contributions from companies including Weta Digital, Walt Disney Animation Studios, Sony Pictures Imageworks, Pixar Animation Studios, Autodesk, and DreamWorks, among others. OpenEXR was honored with an Academy Scientific and Technical Award in 2007.
OpenEXR is ILM’s main image file format and has been used on every motion picture ILM has contributed visual effects to since 2000. The first movies to employ OpenEXR were Harry Potter and the Sorcerer’s Stone, Men in Black II, Gangs of New York, and Signs. Recent films include Solo: A Star Wars Story, Avengers: Infinity War, Black Panther, and Star Wars: The Last Jedi.
Developers interested in learning more or contributing to OpenEXR can visit the OpenEXR GitHub page.
Lucasfilm and Industrial Light & Magic (ILM) announced the inaugural Open Source release of the MaterialX Library for computer graphics. MaterialX is an open standard developed by Lucasfilm’s Advanced Development Group and ILM engineers to facilitate the transfer of rich materials and look-development content between applications and renderers.
Industry-leading companies including Autodesk, Inc. and Foundry have voiced support for MaterialX.
Chris Vienneau, Director of Media and Entertainment at Autodesk noted, “Autodesk is very pleased to be a contributor to the MaterialX project and we are looking forward to adding native support for MaterialX workflows to our digital content creation tools. As with other open formats, MaterialX is going to improve collaboration and help make production pipelines more efficient, so we are thankful that Lucasfilm have chosen to share their technology with the community through open source software.”
“Foundry is happy to see the MaterialX project reach this latest milestone,” said Jordan Thistlewood, Senior Product Manager for Look Development and Lighting at Foundry. “The possibilities for smoothing the transfer of look development information between our own applications are exciting. The broader principles of open source projects and multi-vendor data exchange are important for the industry as a whole, and we look forward to including MaterialX-powered workflows in future releases of our applications.”
Originated at Lucasfilm in 2012, MaterialX has been used by ILM in feature films such as Star Wars: The Force Awakens and Rogue One: A Star Wars Story, and in real-time immersive experiences such as Trials on Tatooine. The MaterialX team will host a “Birds of a Feather” meeting at the ACM SIGGRAPH conference in Los Angeles, CA on Monday, July 31, 2017, from 9:30 to 11:00 a.m. in room 511BC of the Los Angeles Convention Center.
Workflows at computer graphics production studios require multiple software tools for different parts of the production pipeline, and shared and outsourced work requires companies to hand off fully look-developed models to other divisions or studios that may use different software packages and rendering systems. High-quality solutions already exist for exchanging scene hierarchies and geometric data between tools (e.g., USD and Alembic), but there has been no effective solution for exchanging rich material content.

MaterialX addresses this lack of a common, open standard for representing the data values and relationships required to transfer the complete look of a computer graphics model from one application or rendering platform to another, including shading networks, patterns and texturing, complex nested materials, and geometric assignments. It provides a schema for describing material networks, shader parameters, texture and material assignments, and color-space associations in a precise, application-independent, and customizable way.
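MaterialX documents are XML-based. As a rough illustration of the kind of content the schema describes (a material with a shader parameter override, and a look that assigns it to geometry), the sketch below builds a tiny MaterialX-style document with Python’s standard library. The element and attribute names reflect our reading of the published specification and should be treated as illustrative; consult materialx.org for the authoritative schema.

```python
import xml.etree.ElementTree as ET

# Root element; MaterialX documents carry a schema version attribute.
root = ET.Element("materialx", version="1.35")

# A material that references a shader and overrides one input value.
material = ET.SubElement(root, "material", name="Mplastic")
shader = ET.SubElement(material, "shaderref", name="sr1",
                       node="standard_surface")
ET.SubElement(shader, "bindinput", name="base_color",
              type="color3", value="0.8, 0.1, 0.1")

# A look that assigns the material to geometry by path.
look = ET.SubElement(root, "look", name="default")
ET.SubElement(look, "materialassign", name="ma1",
              material="Mplastic", geom="/robot/armor")

print(ET.tostring(root, encoding="unicode"))
```

Because the format is plain XML, documents like this can be generated, diffed, and versioned with ordinary text tooling, which is part of what makes the interchange between applications and renderers practical.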
Lucasfilm is no stranger to open source, having developed and released key projects over the years. The company played a key role in developing Alembic, co-developed with Sony Imageworks and released in 2012, and OpenEXR, which was developed by ILM in 2000. Both have become industry standards and continue to be developed by the open source community.
MaterialX is an open source project released under a modified Apache license. For more information, visit the MaterialX website at www.materialx.org and follow MaterialX on Twitter @MaterialXCG for the latest news.