This morning the Hollywood Professional Association unveiled the HPA Award Creative Category nominees and ILM received seven nominations across three categories. In the Outstanding Visual Effects – Live Action Feature category, ILM received four of the five nominations.
For the Live Action Feature category, ILM was nominated for Alien: Romulus (Nelson Sepulveda-Fauser, Ale’ Melendez, Sebastian Ravagnani, Nicolas Caillier, Steven Denyer), The Creator (James Clyne, Trevor Hazel, Keith Anthony-Brown, Danielle Legovich, David Dally), Deadpool & Wolverine (Vincent Papaix, Georg Kaltenbrunner, Alexander Poei, Ziad Shureih, Russell Lum), and A Quiet Place: Day One (Malcolm Humphreys, Jordan Harding, Charmaine Chan, Michael Lum, Steve Hardy).
For Outstanding Visual Effects – Animated Feature, ILM received a nomination for Ultraman: Rising (Hayden Jones, Stefan Drury, Sean M. Murphy, Mathieu Vig, Kyle Winkelman). This nomination was one of just two in this category; our friends at Pixar received the other nomination for Inside Out 2.
In the Outstanding Visual Effects – Live Action Episode or Series Season category, ILM received two of the five nominations: Loki – Season 2 (Steve Moncur, Christian Waite, Jeremy Sawyer, Ben Aickin, Pieter Warmington) and Percy Jackson and the Olympians – Season 1 (Erik Henry, Matt Robken, Jeff White, Jose Burgos, Donny Rausch).
The HPA Awards Gala will take place on November 7, 2024, at the Television Academy’s Wolf Theater in Hollywood.
Nominations for the 76th Emmy® Awards were announced today from the historic El Capitan Theatre in Hollywood, recognizing remarkable programs, extraordinary performances, and impactful storytelling across multiple platforms. Three projects that ILM contributed visual effects and animation work to were among the nominees.
The nearly 22,000 voting members of the Academy nominated an abundance of inspiring talent and a robust selection of diverse program offerings. In a year marked by significant challenges and changes in the television landscape, the nominations recognize the excellent work of performers, producers, writers, directors, craftspeople, and professionals above and below the line on television programs from the 2023–2024 eligibility year.
Among the honored projects for Outstanding Special Visual Effects In A Season Or A Movie was Lucasfilm’s hit Disney+ series Ahsoka.
Nominees include:
Richard Bluff, Visual Effects Supervisor, Production
Jakris Smittant, Visual Effects Producer, Production
Paul Kavanagh, Animation Supervisor, Production
TC Harrison, Associate Visual Effects Supervisor
Scott Fisher, Special Effects Supervisor
Enrico Damm, ILM Visual Effects Supervisor
Justin van der Lek, ILM Associate VFX Supervisor
Rick O’Connor, ILM Animation Supervisor
J. Alan Scott, Legacy Effects Supervisor
In the Outstanding Special Visual Effects In A Single Episode category, episode 4 of Netflix’s All The Light We Cannot See made the list.
Nominees include:
Paolo Acri, Visual Effects Supervisor
Laurence Berkani, VFX Producer
And finally, in the Outstanding Special Visual Effects In A Season Or A Movie category, Marvel Studios’ Loki received a nomination.
Nominees include:
Steve Moncur, VFX Supervisor
The 76th Emmy Awards will air live on ABC on Sunday, September 15 (8:00-11:00 PM EDT/5:00-8:00 PM PDT) from the Peacock Theater at L.A. LIVE and stream the next day on Hulu. The 76th Creative Arts Emmy Awards take place at the Peacock Theater over two consecutive nights on Saturday, September 7, and Sunday, September 8, with an edited presentation to air on Saturday, September 14, at 8:00 PM EDT/PDT on FXX.
Join creator Dave Filoni, Production Visual Effects Supervisor Richard Bluff, Animation Supervisor Paul Kavanagh, and Visual Effects Supervisor Enrico Damm for a roundtable discussion on the visual effects of Lucasfilm’s hit Disney+ series ‘Ahsoka’.
After 25 years at ILM, Cooper has earned a reputation for seeking out the most efficient solutions to creative problems.
By Lucas O. Seastrom
Back in 2002, Industrial Light & Magic’s Jay Cooper was a compositing sequence supervisor on Master and Commander: The Far Side of the World (2003). For a time, director Peter Weir joined the ILM crew at their offices on Kerner Boulevard in San Rafael, California. “We had a shot when the mast of one of the ships falls over,” Cooper tells ILM.com. “There’s all this gunfire. It’s completely enshrouded in smoke. As I’m working on it, Weir comes to my desk and he says, ‘I want it to look like a beautiful nightmare.’ I was like, ‘Wow, that’s cool. Now what does that look like?’ [laughs]”
Over the past two decades, Cooper has moved into the visual effects supervisor role, working on projects as varied as Eternals (2021) and Babylon (2022). Most recently, he partnered with writer/director Gareth Edwards on The Creator (2023), a science-fiction tale with an unconventional visual effects methodology. As he and the ILM crew navigated the challenges of integrating effects into location photography with minimal reference data, Cooper managed to connect with Edwards in a way that reminded him of his experience with Peter Weir.
“Normally, as a visual effects supervisor, you’re being much more granular in your notes, lots of technical conversations,” Cooper says. “You don’t usually engage with artists in an emotional way. That’s what is really wonderful when you’re exposed to working with directors. That’s my favorite part of being a supervisor: you’re not always in the weeds talking about those details, you’re trying to engage with it at a story level. That’s the part that artists love. Gareth partnered with us in that way, and people got really excited about the project. Fun things happen when people get excited. They sneak in extra takes. They devote themselves in a huge way. We asked people to do really hard stuff without all of the support materials. If they know what we’re trying to achieve and we’re all pulling together, it can help make up for those shortcomings.”
At the beginning of the project, ILM’s chief creative officer Rob Bredow asked Cooper to meet with Edwards and producer Kiri Hart. “Gareth said, ‘Hey, I’ve got this movie and I hear you’re the guy who likes to cheat,’” Cooper says with a laugh. “He said that probably in the most affectionate way. I’m not really a devotee of any sort of process. I worship at the altar of whatever we can do as quickly and as simply as we can do it. As an artist, that was my forte. I did lighting and compositing, and I would try to navigate as many shortcuts as I could. I guess my reputation as a visual effects supervisor was that I’d work on shows with really small budgets and we’d try to wring out whatever production value we could. I think that’s why Rob put us together.”
Director Gareth Edwards operating the camera on location in Asia during production on The Creator.
Edwards’ vision and Cooper’s style were well aligned. In terms of workload, The Creator would be Cooper’s biggest project to date as a visual effects supervisor. “One of the best pieces of advice that [ILM executive creative director] John Knoll ever gave me,” Cooper notes, “was that you take big problems, break them into smaller problems, and smaller and smaller. So we created teams to hit different problems. We knew that we were going to be behind the 8-ball. We knew that Gareth had a smaller-than-desired budget, and he came to us wanting to partner in a different way.”
Edwards had been a visual effects artist himself before taking the director’s chair full-time. In his 2010 feature directorial debut, Monsters, he famously created many of the visual effects on his own. For Cooper, this practical experience helped define ILM’s approach to crafting visual effects with a “scrappy” sensibility. Shooting primarily on location in Thailand, Edwards focused on capturing his actors and the dramatic landscapes where they played out their scenes. Traditional effects tools like bluescreens and tracking markers would be almost completely avoided, and ILM would need to integrate their CG elements without the normal reference tools.
Looking into the ILM StageCraft volume during production on The Creator.
“Most of the time doing visual effects work, it’s very much a spreadsheet problem. You have seven robots at this amount of money, or fifteen environments at this scale at this amount of money. Even at the bidding stage for The Creator, we were instead asking what we could do for a certain amount of money. Just as a scrappy filmmaker, Gareth wanted to know what was possible in visual effects if we used different techniques and structured the show differently.
“If we take a whole sequence,” Cooper continues, “Gareth would explain how there’s only so much information you can take in during one shot, so let’s put everything together, bring it all up, and water the one element that’s dying. If you didn’t feel like there were enough robots here, how much do you need to add? Where’s your eye going to go? If a frame feels empty, what can we add? Is there a way to add something that avoids a roto-nightmare? Can we structure it so we don’t see the element in one shot but we do see it in the next two shots so that you sort of complete what the image is? Loosely, that’s how we went off and did the work.”
Much of that questioning and analysis was open to the larger visual effects crew. Initially, Edwards had planned to embed himself within ILM’s studio to personally oversee the work. Although pandemic concerns ultimately scratched that idea, he still welcomed artists from deeper in the ranks to present their work directly and share ideas.
Gareth Edwards discusses a scene with John David Washington in the ILM StageCraft volume.
“It takes a rare person to be comfortable enough to share your feedback openly with artists on the production,” Cooper notes. “It’s really wonderful. You get a level of engagement that you may not always find. Sometimes working on blockbusters, you can feel like you’re just punching numbers. But if you expose the artists to the reasoning behind something, the filmmaking intent, you get a huge level of engagement.”
As visual effects supervisor for the entire production, Cooper was busy overseeing work not only at ILM’s studios in San Francisco, London, Sydney, and Vancouver but also at the assortment of smaller vendor studios enlisted to assist on the project. The initial shot count estimate had more than doubled by the time Edwards shared his initial cut. As Cooper points out, ILM contributed “about 95% of the asset work and the lion’s share of the shot work” with the support of the vendors.
“As a supervisor, I’m sort of tapping the boat,” Cooper says. “You can’t be in every single file to model the rivets. You can’t go into every composite to add the elements. You’re asking for degrees of one thing or another, and there are a lot of places where people are volunteering an idea. They’re doing it in a way that they understand what the stylistic or aesthetic goal is.”
Overall, Cooper’s experience on The Creator felt like a return to an earlier era in visual effects, one that speaks directly to ILM’s can-do spirit. “ILM tries to find projects that are outside of the comfort zone of what has happened previously. It must have been wonderful in the late ‘80s or early ‘90s when the question wasn’t ‘can you do this?’ It was, ‘is this even possible?’ Those times have ended in many different ways. You do it enough times, and there’s a cost structure around it. So it’s interesting to be on a project where you chuck a lot of that away and get back to the basest level. We have a pot of money and a director with some big ideas. That’s the launching point. It’s cool and exciting to be in that world again.”
Lucas O. Seastrom is a writer and historian at Lucasfilm.
Wednesday evening, the Visual Effects Society (VES), the industry’s global professional honorary society, held the 22nd Annual VES Awards, the prestigious yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games, and special venues.
ILM’s work on The Creator earned six nominations and won four awards, including the coveted top prize, Outstanding Visual Effects in a Photoreal Feature, along with Outstanding Created Environment in a Photoreal Feature, Outstanding Model in a Photoreal or Animated Project, and Outstanding Effects Simulations in a Photoreal Feature, with a partner company winning a fifth award for the film for Outstanding Compositing and Lighting in a Feature.
ILM’s work on Darren Aronofsky’s groundbreaking film Postcard from Earth won for Outstanding Visual Effects in a Special Venue Project, while The Mandalorian (Season 3) won for Outstanding Effects Simulations in an Episode, Commercial, Game, Cinematic or Real-Time Project.
“It’s a true testament to our amazing global teams that our work was honored by our industry colleagues on the winning shows noted above as well as the ILM shows that were nominated, including Indiana Jones and the Dial of Destiny, Ahsoka, Willow, Mission: Impossible – Dead Reckoning Part One, Dungeons & Dragons: Honor Among Thieves, Napoleon, Killers of the Flower Moon, and Guardians of the Galaxy Vol. 3,” said Janet Lewin, ILM General Manager. “I couldn’t be more proud of our teams.”
Visual effects supervisors Jay Cooper, Andrew Roberts, Charmaine Chan, and Ian Comley take us behind the scenes of an unusual visual effects challenge.
By Lucas O. Seastrom
Ever since George Lucas and John Dykstra sat down in 1975 to discuss the former’s vision of capturing dynamic aerial dogfights between miniature spaceships in Star Wars: A New Hope (1977), Industrial Light & Magic (ILM) has made an art of solving creative problems in close partnership with filmmakers. As Lucas’ vision challenged ILM’s capabilities nearly 50 years ago, The Creator (2023) writer/director Gareth Edwards proposed an unconventional approach to filmmaking that would keep the visual effects crew on their toes.
Proof of Concept
Edwards first collaborated with ILM on Rogue One: A Star Wars Story (2016), channeling the same rebel spirit of Lucas’ A New Hope. Envisioning his own science-fiction tale in The Creator, he would also channel Lucas’ audacity for pushing the limits of ILM’s capabilities. It began some years ago when he asked ILM’s executive creative director John Knoll (who supervised the visual effects for Rogue One) if the company could assist with a test reel that would demonstrate Edwards’ vision for a movie about a futuristic Earth where humans and artificial intelligence lived side by side.
“Gareth and his producer [Jim Spencer] went to Asia on what he described as a scout, but he also brought a camera along,” explains Jay Cooper, who would become The Creator’s overall visual effects supervisor for ILM. “He shot in a number of different locations to create a sort of think-piece, very documentary-style footage. Then he came to us asking to put some 50 shots together, which John supervised.”
Edwards provided his footage only. There was no accompanying data, no lidar scans, no HDRI captures of environments, none of the usual resources that visual effects artists rely upon. The challenge was to integrate digital elements – characters, vehicles, and locations – into the existing footage, including the replacement of real people, or components thereof, with robotic technology. “We tried to create rapid prototypes of what shots could look like by doing them in a more heavily-2D way,” Cooper explains. “We’d take frames, do a draw-over with James Clyne, who became the film’s production designer, and with a bit of fast projection work, get them into shots. We got a really convincing look with a modest amount of effort. Gareth explained that shots that usually take two or three months of work could be seen in three or four days.”
The proof of concept not only convinced Edwards’ backers to make the film, but also gave ILM a model for developing effects on a feature-length scale in this unusual, after-the-fact method. The Creator would be shot primarily on location in Thailand with a small crew and fewer resources. “Gareth wanted to shoot this ambitious movie,” says Cooper. “The artwork was phenomenal, but the catch was that we’d be really uncomfortable because we wouldn’t be given the things we were used to. We wouldn’t stop for a clean pass every time. We wouldn’t always know what all the shots were going to be because those would be determined in the edit. There were enormous designs for the scope of the movie. This was a big swing, it was sink or swim. So off we went.”
On the Ground in Thailand
Edwards remained committed to maintaining a fast, improvisational shooting style, often handling the camera himself. He would not inhibit his ability to engage in the moment with his actors, who included John David Washington as the protagonist Joshua, a world-weary soldier in search of his lost love, and Madeleine Yuna Voyles as Alphie, an artificial simulant in the form of a young girl who acts as both the story’s heroine and MacGuffin. Instead of the usual small team of visual effects personnel, ILM would send just one representative to Thailand, visual effects supervisor Andrew Roberts.
Roberts would be responsible both for consulting with Edwards and the crew, including cinematographers Greig Fraser and Oren Soffer, and for capturing as much data for each respective shot as he possibly could. “I was there to help make sure that things were filmed in a way that would give ILM the best chance of producing great, photoreal work,” Roberts says. “I wasn’t going to get in Gareth’s way.
“Early on, we had scenes with robots and humans existing together,” Roberts continues. “I asked Gareth which of the actors would be made into robots so I could mark them. Even if we’re not putting them in the motion-capture suits, I could take measurements and make a turntable, all to give the team information. Gareth looked at me and said, ‘Don’t know.’ It wasn’t something he wanted to focus on. He would pick actors to make into robots later. I wasn’t sure how aggressively he was going to create negative space with these characters. It turned out that their bodies were more or less the same, and you’d mainly see the mechanism when it came to their arms and their heads. But I still didn’t know at the time, so I recorded the information about where Gareth was pointing the camera and determining what backgrounds I needed to capture to reconstruct a clean plate.”
Another major challenge involved the simulants, A.I. characters who appear human, save for the aft portion of their heads, which features a bold mechanical structure. Edwards and Clyne had created initial concept art, but it was left to Roberts and Cooper to determine the best means to track the live actors’ facial movements on set in order to integrate digital components during post-production.
“I think the movie doesn’t work at all if you can’t get a convincing Alphie,” says Cooper. “It’s where your eye is looking. There are 400 shots of her. In prep, I pitched the idea of putting a sock over her head and dressing the edges of where the contours are so that we know exactly how to define the delineation point between where her mechanical components connect to her skin. I asked about doing makeup to address the edges, and we could fill in the rest. Gareth said, ‘No, we’re not going to do that because I need her.’ When you’re working with a child actress, there’s only so many hours you can work. He wanted her on set for every minute she could be.
“Then we had to figure out what we could do in terms of tracking dots that were low impact and didn’t interfere with the acting,” Cooper says. “I explained the ask to [layout supervisors] John Levin and Tim Dobbert, and said that I didn’t know exactly what the designs were going to be, but they said, ‘Well, let’s put some tracking dots on the bridge of her nose, one on the temple, a couple on her neck, and we think we can figure that out.’ So that’s what we did! [laughs] It’s a leap of faith.” Roberts then collaborated daily with the makeup department to place tracking dots on the simulant actors, each of whom required a unique arrangement because of their varying physiques.
“The benefit of having someone like Gareth is that he used to be a visual effects artist and he has a clear idea of what the end result will be,” explains Roberts. As an example, he explains how Edwards shot an early moment in the film when Joshua watches the suborbital ship NOMAD launch a missile at a group of small vessels just offshore. “Gareth knows that he wants NOMAD to be in frame, so he’ll frame for it and then tilt down to Joshua watching from the beach. Another director might be focused on the action in front of them, and in post they’ll ask if we can extend the frame and create a digital move. The majority of directors don’t think about those things in advance. So when I’m observing a shot like that where Gareth is tilting the camera, I’ll wait for the cut and ask him, ‘During that tilt, what are you seeing?’ Then I’ll make notes.”
After crisscrossing Thailand, often covering multiple locations in a single day, cast and crew traveled to Pinewood Studios in the United Kingdom, where ILM had constructed a StageCraft volume as part of its virtual production toolkit. There, two major sequences were captured for the end of the film, when Joshua and Alphie board NOMAD. “It takes a lot of work to do StageCraft correctly,” notes Cooper, who used the tool for the first time on this show (as was the case for Edwards). “You have to be very careful that it’s the right fit. If I think about our goal as a movie, which was to always find real locations, there were only a couple of places where there was no equivalent location, and that is space. It made a lot of sense to use StageCraft for the NOMAD’s Biosphere environment and the Air Lock, where either the scope is so large that it would be cost-prohibitive to build a physical set, or the aesthetic goals would push you into doing a full bluescreen shot.”
A few smaller scenes were shot on an adjacent Pinewood stage equipped for traditional bluescreen or greenscreen, but as Roberts points out, the crew took the chance to innovate some distinct techniques. “We had a scale portion of the missile that Joshua climbs on,” he explains. “We created interactive lighting for that by taking portions of the real-time NOMAD model from Gareth’s virtual production scouts, and animated them to enable these mechanisms pushing missiles into place. I had this little animated sequence, which I then rendered out as a black-and-white texture that had different layers of structure moving past, which imagined that the sun was out in space and these things were casting shadows. We connected that from my laptop to a 12K projector that was mounted on the set. So when John David is hanging on the exterior of the missile, we have real light interacting with him in the close-ups. That evolved quite organically.”
Altogether, principal photography in 2022 lasted some 80 days, not including an additional round of element shoots and pick-ups led by Edwards with an even smaller crew across multiple Asian countries.
A Global Collaboration
ILM’s studios in San Francisco, London, and Sydney would each make significant contributions to The Creator, with additional support from the Vancouver studio and an array of vendors. In October of 2022, Edwards came to ILM San Francisco to screen a three-hour cut of the film. “Everyone came out recognizing that it was something different and special,” recalls London-based visual effects supervisor Charmaine Chan. “It was a lot more than we thought it was going to be. Originally, we estimated around 700 or 800 shots. Watching that cut, we knew there were more than twice as many. So the question was how to handle that and deliver on time, on budget, and at the quality we always want at ILM. We had to set guidelines with Gareth about how we’ll be able to get this film across the finish line, and he was very receptive to it.”
Cooper’s proposal was an unusual “three-strike system,” where Edwards would be given three opportunities across the life of a given shot to provide notes, allowing ILM to iterate with as much focus as possible on the key elements of that shot. “That’s the optimal structure to ensure that all the money goes into getting a clear direction for the shot,” Cooper notes. “We were probably only successful doing that about 70% of the time, but there were a healthy number of shots where, after we solved the questions about the simulants, for example, Gareth knew that if we kept to those standards, we wouldn’t be chasing really small details.”
The design evolution for the simulant head mechanics resulted in an elegant approach that felt almost human. Building on techniques first employed by ILM for The Irishman, the team was able to seamlessly blend the movements of the actor’s skin with the rigidity of the rear components. “We’re trying to empathize with these simulants and understand what they’re going through,” explains Chan. “When you first see one, it’s just another human being, then they turn to profile and you realize it’s something else. Because the performances are so good, whether it be Madeleine or Ken Watanabe [Harun], you’re focused on them and feeling their emotions and you forget about all that gear.”
Many subtleties were incorporated into the headgear to complement the performances, including character-specific details, such as the battle-worn tech of Harun’s components. The animation team were responsible for creating tiers of almost subliminal movements that reflected each simulant’s emotional state. “When Alphie stops the bomb robot, for example, that’s full pelt as the mechanics whir up, which includes wonderful sound effects,” explains Ian Comley, also a London-based visual effects supervisor. “For everything else, it’s a kind of Swiss watch, cogs and gears ticking, something always active, but in a more gentle way.”
Throughout post, the ILM crew enjoyed an extraordinary level of direct access to the director and production designer. “It can be pretty rare to feel like you’re a core member of the filmmaking team,” says Comley. “I can’t think of another film where the production designer has stayed on until the very last shot. To build every robot, simulant, vehicle, prop, and structure required a lot of design. Gareth knew what he wanted and had a great relationship with James, and we were a part of that. We took initial concepts from James, tried to riff off them, and fleshed them out into assets. We could then share back directly with James, who could go in and do paintovers. With direct access, there’s no diffusion of ideas. Instead, it’s collaborative filmmaking.”
To create the full-body A.I. characters as replacements for select live actors, ILM developed seven distinct robot designs following Edwards and Clyne’s visual methodology, which combined a 1980s technology aesthetic with organic, natural influences. Each design could be made unique with specific flairs, often informed by the individual character, such as with Amar Chadha-Patel’s performance as Satra.
“Amar is very expressive in his face,” explains Chan. “When he’s talking or thinking, his eyes and eyebrows say a lot. We captured that in the actor, but how do we present that in a robot? [Animation supervisor] Chris Potter was brilliant in suggesting all of these fine details, like in the eyes, which are very tiny on Satra. You can see a slight pupil and see eye darts when he’s thinking. The mouth was also slightly hinged, so all these little characteristics of Amar’s performance can come into this robot to show his emotions.”
Comley points out the “masterstroke” of Edwards’ decision not to decide on the robot characters while filming. “Even when it came to background characters, a typical film would decide who would be a robot and kit them up in mo-cap pajamas,” he explains. “None of that on this show. Gareth got naturalistic performances because people were just moving as people. If it was a scary scene, they acted scared with those fluid motions. No one had been told they were a robot and then acted twitchy or jittered, the kinds of things you might do.
“It also gave ILM license to switch out anyone,” Comley continues. “If he had picked someone onset to be the robot, it might turn out that that person isn’t located where your eye goes in the shot. The real person you want to be a robot is on the other side. We had the freedom to do that, which was a real challenge, but we could decide with Gareth after the fact which ones to choose. As shots changed, we could keep adjusting. The matchmove and paint teams did a fantastic job. The performances were so grounded, and we did very little to change that. The last thing Gareth wanted was for us to take a brilliant natural performance and turn it into a stereotypical robot. It was mostly heads and arms. There are instances with full-body robots, but by and large, they were additions instead of replacements.”
The London crew under Chan and Comley’s supervision spent considerable time on act three aboard the NOMAD, where the key challenge was to create fully CG assets and environments that felt akin to Edwards’ naturalistic shots on real-world locations. On a typical show, ILM often incorporates grain or lens flares to match the source photography, but for The Creator, the team also influenced creative decisions to help bridge the divide between Earth and space, including moving the NOMAD into lower orbit, where more diverse colors and atmospheric elements could be incorporated. “It helped marry the story points where people on the ground are able to see NOMAD above,” as Comley notes.
Even in a traditional CG scenario, ILM found ways to empower Edwards’ freeform shooting style. “Besides the real-time rendering and LED walls, the StageCraft suite also includes virtual cam sessions,” explains Chan. “The whole exterior of NOMAD was pure CG, so Gareth was able to hold an iPad and look around to see the different sections of the ship and frame his shots, from the wings to the central section that we called the ‘bunny teeth.’ We saved so much time with Gareth being able to do that, rather than having us propose specific framing ideas. With Gareth being a visual effects artist, he just grabbed it and started making choices.”
At times, Edwards even embraced the most ordinary of methods to convey his vision. For the sequence when Joshua attempts to climb onto one of NOMAD’s towering missile silos, the director provided reference footage by “taking a wastepaper bin with a water bottle inside for the missile and a little LEGO figure taped on,” as Comley explains. “He shot it all with his iPhone. It had the same principles of photography that he’d applied in the v-cam. You have to feel like there is an operator discovering the events as they unfold. Gareth’s philosophy was often to think that the operator was hanging out of a fast-moving plane because the NOMAD is so big, that’s the only way you could do it.”
By the spring of 2023, ILM had completed some 1,700 shots for The Creator (a handful of which came from Edwards’ original test reel). “We made some good choices in terms of how to build this whole train set,” explains Cooper. “Maybe the most important one was that James Clyne had a concept team all through post-production. In visual effects, where it can get expensive is when you don’t know what you want, and you iterate multiple times and change directions. Normally there’s a bunch of concept art and you spend your time chasing that. We had existing concepts, but once the movie was shot, James kept reinterpreting it. When we’d land on an idea, we already knew the shot, the camera work, and we could deploy our resources accordingly. Sometimes it’s a 3D asset that we build because it’s going to be in 40 shots. Other times we take the art model from James’ team, put it into the shot, they paint on top of it, put it back in the shot once more, and it’s done. Not standard procedure at all. It’s all about looking for those opportunities.”
Looking for the Next Challenge
The Creator’s unconventional production methods were successful not only in terms of the efficiency of its budget and resources, but in the ability of the artists on every level to make genuine contributions to the story. That came from Edwards’ example and leadership. “Everyone wanted to be on this project to the point where someone would roll off the show and keep asking if they could do one more thing on a shot, just to make it a little better,” says Chan. “Sometimes you can feel like a cog in a machine, just pushing buttons, but this was the opposite. Everyone on every level felt that they could be creative and suggest ideas.”
ILM was established to create solutions that respect the integrity of a filmmaker’s original vision. For an artist like Comley, the willingness of the filmmaker to include ILM in that visionary process is much more important than the actual problems to be solved. “One way or another, we can paint out that thing, track that thing, come up with a creative solution,” he notes. “Throw us anything you have. I’d rather have that and the vision and richness of photography than a clinical greenscreen and a question mark.”
It was a refreshing experience for everyone, but one critically dependent on the filmmaker. “You have to be willing and able to take this gamble, and it’s hard to find both things together,” says Cooper. “There are a lot of filmmakers that are willing but because of the studio constraints around them, they’re not able. And there are others who have the money and space to do it, but don’t necessarily have the amount of knowledge required. So if you can clone Gareth, you’re in a great place! [laughs] I think there will be opportunities to work like this again. Filmmakers will come to us and say, ‘I know what my movie is, I have so many dollars, and we don’t have to hit everything that I want, but I want to hit as many as I can – can we work together?’ As a company, we’d respond well to that.”
—
Lucas O. Seastrom is a writer and historian at Lucasfilm.
Earlier today, the Academy announced the Oscar nominations for the 96th Academy Awards, and we are excited to share that ILM teams contributed to four of the five films recognized. The films include The Creator, Mission: Impossible – Dead Reckoning Part One, Napoleon, and Guardians of the Galaxy Vol. 3.
Jay Cooper, Visual Effects Supervisor for The Creator, said, “From the very start, we knew making this film would require a leap of faith. The Creator was a passion project for all involved, and the craft and artistry that went into it from our fearless leader, Gareth Edwards, to production designer James Clyne, our DPs Greig Fraser and Oren Soffer, and of course our amazing global visual effects team really shows. I’m thrilled that our team’s work has been recognized with this nomination and I’d like to thank the visual effects branch of The Academy for this incredible honor. I couldn’t be more proud.”
ILM Visual Effects Supervisor Simone Coco, a double nominee being recognized for Mission: Impossible and Napoleon, noted, “I am overwhelmed with emotion at this incredible honor. To be nominated for best VFX at the prestigious Oscars is a dream come true and I would like to thank everyone who helped make this become a reality, to all the productions, artists, and technology crew at ILM. Congratulations to all the nominees and best of luck!”
ABC announced that “The Oscars®” will air live coast to coast on SUNDAY, MARCH 10, in a new earlier timeslot (7:00-10:30 p.m. EDT/4:00-7:30 p.m. PDT). A 30-minute pre-show will lead into the live show (6:30-7:00 p.m. EDT/3:30-4:00 p.m. PDT), and immediately following, ABC will air an original episode of the Emmy® Award-winning comedy series “Abbott Elementary.” The telecast will also be rebroadcast in the Pacific Time zone in primetime after the live presentation.
We wish the best of luck to all of our nominees.
On February 23, 2024, the Academy of Motion Picture Arts and Sciences will recognize 16 technologies for their impact on filmmaking. Two technologies that ILM played a key role in helping to develop will be among those recognized.
SciTech Awards committee chair Barbara Ford Grant said, “This year, we honor 16 technologies for their exceptional contributions to how we craft and enhance the movie experience, from the safe execution of on-set special effects to new levels of image presentation fidelity and immersive sound to open frameworks that enable artists to share their digital creations across different software and studios seamlessly.”
Former ILM engineers Christopher Horvath and Joe Ardent are being recognized alongside Lucas Miller and Steve LaVietes for the Alembic Caching and Interchange system. Alembic began as a collaborative effort between ILM and Sony Pictures Imageworks to develop algorithms for storing and retrieving baked, time-sampled data, enabling high-efficiency caching across the digital production pipeline and the sharing of scenes between facilities. The two companies open-sourced the project and its interchange library in 2011. Since then, Alembic has seen widespread adoption by major software vendors and production studios.
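To make the idea of “baked, time-sampled data” concrete, here is a minimal, hypothetical Python sketch of the concept: animated values are evaluated once, stored per sample time, and later read back by any application without re-running the original rig or simulation. The class, property names, and JSON file format below are illustrative assumptions only and do not use the real Alembic API.

```python
import json

class BakedCache:
    """Toy illustration of baked, time-sampled caching (not the Alembic API)."""

    def __init__(self):
        # property name -> sorted list of (sample_time, value)
        self.samples = {}

    def bake(self, name, times, evaluate):
        """Evaluate a property once per sample time and store the results."""
        self.samples[name] = [(t, evaluate(t)) for t in sorted(times)]

    def read(self, name, time):
        """Return the stored value at the nearest sample time (no re-evaluation)."""
        baked = self.samples[name]
        return min(baked, key=lambda s: abs(s[0] - time))[1]

    def write(self, path):
        """Persist the cache so another package or facility can load it."""
        with open(path, "w") as f:
            json.dump(self.samples, f)


# Usage: bake a translate curve at 24 fps, then read it back anywhere.
cache = BakedCache()
cache.bake("cube.translateY", times=[i / 24.0 for i in range(48)],
           evaluate=lambda t: 2.0 * t)   # stand-in for an expensive rig/sim evaluation
print(cache.read("cube.translateY", time=0.5))  # -> 1.0
cache.write("cube_bake.json")
```

The real system solves the same problem at production scale, with efficient binary storage and a stable interchange format shared across applications.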
ILM’s Dan Bailey joins Jeff Lait and Nick Avramoussis in being recognized for the continued evolution and expansion of the feature set of OpenVDB. Core engineering developments contributed by OpenVDB’s open-source community have led to its ongoing success as an enabling platform for representing and manipulating volumetric data for natural phenomena. These additions have helped solidify OpenVDB as an industry standard that drives continued innovation in visual effects.
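For readers curious what “representing and manipulating volumetric data” looks like in practice, here is a small sketch using OpenVDB’s optional Python bindings (pyopenvdb). It assumes those bindings are built and installed; the grid name, file path, and density formula are arbitrary examples.

```python
import numpy as np
import pyopenvdb as vdb  # OpenVDB's optional Python bindings

# Build a dense 64^3 density field (a crude spherical puff of smoke).
ijk = np.indices((64, 64, 64))
dist = np.sqrt(((ijk - 32.0) ** 2).sum(axis=0))
density = np.clip(1.0 - dist / 24.0, 0.0, 1.0)

# Copy it into a sparse OpenVDB grid; near-zero voxels stay inactive,
# which is what keeps large volumes memory- and disk-efficient.
grid = vdb.FloatGrid()
grid.copyFromArray(density, ijk=(0, 0, 0), tolerance=1e-4)
grid.name = "density"

# Write a .vdb file that DCC applications and renderers can read directly.
vdb.write("puff.vdb", grids=[grid])

# Read a single voxel back through an accessor.
accessor = grid.getAccessor()
print(accessor.getValue((32, 32, 32)))  # -> 1.0 at the sphere's center
```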
Unlike other Academy Awards® to be presented this year, achievements receiving Scientific and Technical Awards need not have been developed and introduced during a specified period. Instead, the achievements must demonstrate a proven record of contributing significant value to the process of making motion pictures.
Before this announcement, 34 ILM technological achievements had been recognized with Scientific and Technical Achievement Awards. This latest recognition continues a legacy of technical innovation dating back to the mid-1970s.