The recipients for Skeleton Crew included production visual effects supervisor John Knoll, production visual effects producers Abbigail Keller and Pablo Molles, animation supervisor Shawn Kelly, visual effects producer Nicole Matteson, virtual production supervisor Christopher Balog, and visual effects supervisors Jeff Capogreco, Bobo Skipper, Andy Walker, Joseph Kasparian, and Eddie Pasquarello.
ILM’s team joined fellow Skeleton Crew recipients from across Lucasfilm, with the series also winning Outstanding Young Teen Series, Outstanding Editing for a Young Teen Live Action Program, and Outstanding Sound Mixing and Sound Editing for a Live Action Program.
Visual effects supervisors Charmaine Chan, Andrew Roberts, and Simone Coco share their experiences working together on the Oscar-nominated Jurassic World Rebirth.
By Amy Richau
Bringing dinosaurs to the screen for Jurassic World Rebirth (2025) required a true team effort, with multiple ILM visual effects supervisors collaborating in teams around the world. While some films only require one visual effects supervisor to see the production through from start to finish, other films are simply bigger. Backing up production visual effects supervisor David Vickery, who recently talked about his work on Rebirth with ILM, were multiple visual effects supervisors from ILM, as well as others from partner studios like Midas VFX and ILP.
Charmaine Chan, Andrew Roberts, and Simone Coco talked with ILM.com about wrangling a herd of dinosaurs (both familiar and new to audiences), coordinating their individual teams’ work into one cohesive film, and the pressure of working on such a legendary franchise.
(Credit: ILM & Universal).
Getting the call
Working on an installment in the Jurassic film series was a full-circle moment for Chan, Roberts, and Coco, who all pointed to the original Jurassic Park (1993) as the film that kick-started their careers in visual effects.
In the 1990s, while still working in the games industry, Roberts attended a talk by Jurassic Park’s CG supervisor Stefen Fangmeier in London. Hearing Fangmeier break down the work on Jurassic helped Roberts make the connection from his current work to a potential future in visual effects. “Seeing the same techniques of modeling, animation, and compositing that we were using in the games industry was the initial spark,” Roberts tells ILM.com. “That was an inflection point for me, where I started to pursue working in TV and film. The movie, as well as understanding the work that went into it, completely changed my life and my career and was the reason that I started to pursue computer graphics.”
The scene that stood out to Coco the most from the original film involved the iconic Tyrannosaurus rex. “It was so real and scary,” he notes. “I remember the T. rex screaming in the rain and shaking the glass and everything in the car.” The realism of that scene inspired Coco to better understand how the visual effects were created in other scenes in the film, eventually leading to his work at ILM, starting with projects like Napoleon (2023) and Mission: Impossible – Dead Reckoning Part One (2023).
Chan was growing up in Hawaii, close to where Jurassic Park was filmed, when it was released. “I remember thinking, ‘Oh my god, dinosaurs could be there.’ I was a kid, and it just felt so real to me.” After joining ILM, Chan worked on Star Wars: The Last Jedi (2017), The Mandalorian (2019-23), and The Creator (2023) before joining the Rebirth crew, where she attended a special ILM screening of Jurassic Park. “It still stands up,” notes Chan, “that sense of awe and amazement and seeing the dinosaurs for the first time. And for me, it’s about wanting to recreate that feeling.”
While Rebirth was Roberts’s first Jurassic project, he had recently worked closely with director Gareth Edwards on The Creator. But even with that experience, Rebirth provided a “pinch-me” moment for him. “It was a little daunting, just seeing the quality of work and the deep history that ILM has with this franchise,” remembers Roberts. “So it was daunting, but very exciting. And I was definitely up for the challenge.”
The original Jurassic Park from 1993 (Credit: ILM & Universal).
Supervisor 101
The role of a visual effects supervisor can vary from film to film. Chan describes the role as that of both a mediator and translator, as well as the person to whom crew members come with questions. “You see the big picture of everything and have such a huge overview of what’s going on that you can basically connect the dots that are needed for each department and each person within your team,” says Chan.
Coco points to being on set as an important part of the journey to reaching this role. “You start to see how the set works and how things develop from script to bidding to how we’re going to shoot this once we get on set.”
“In some ways, we’re here to facilitate the visual direction,” notes Chan. “Whether that be from the director or from our production visual effects supervisor, we make sure everyone is on the same page of what that visual need is. A lot of it is just working with people on a daily basis, reviewing their work and seeing that everyone’s moving in the same direction.”
The large number of visual effects shots in Rebirth (over 1,200) required splitting up the work throughout production and postproduction. Pulling off that many shots required constant communication between multiple departments and the visual effects supervisors, who kept their focus on being creative problem solvers.
(Credit: ILM & Universal).
Designing the Dinosaurs
Chan was the first of the supervisors to join Rebirth in April of 2024, after dinosaur development at ILM had already begun. Figuring out how the dinosaurs would look and move on screen was a challenge they embraced through to the very last shot of the film. “We were constantly trying to make them the scariest, coolest, most fun dinosaurs we could,” says Chan. “We wanted something different from the previous worlds that we’d seen, something that honored some of the original Jurassic Park dinosaurs. But also, Gareth gave his own twist and turn to the design of them.”
Roberts, who joined Rebirth’s team last September, notes the jump from seeing dinosaur skeletons in a museum to thinking about how the creature’s joints would move in different environments. Before joining the film, he rewatched previous Jurassic films to get “familiar with the quality of work in all of them, how some of the creatures moved, and conveying the sense of weight for some of the bigger creatures.”
Gareth Edwards was heavily involved throughout the process of deciding how the dinosaurs would look in the film. “I think at one point we had a two-hour live session with Gareth trying to figure out what the Mutadon was going to look like,” remembers Chan. In that session, one of the team’s modelers tried putting different pieces of real dinosaurs onto a Mutadon sculpture to piece it together. “I think that was vital to the process of making sure that our dinosaurs, from their basic stance, without even being in a shot, could stand by themselves and look cool. Once they were at the state that both David and Gareth were happy with, we would place them into a shot.”
Finding real-world animal references for each dinosaur was a key part of making the movements of dinosaurs in Rebirth appear believable and anchored in reality. To create Dolores, the small Aquilops dinosaur that Isabella Delgado (Audrina Miranda) adopts as her pet, an ILM team, led by animation supervisor Delio Tramontozzi, used videos of themselves interacting with their own pet dogs and cats. “They would have multiple takes of the way their pets were responding to a laser light or picking them up in a way that allowed them to snuggle into the crook of an arm or drape over a shoulder,” says Roberts. The reference videos were submitted with animation of Dolores or other dinosaurs so Roberts and other team members could see how those real-life moments translated to animated shots in Rebirth.
As Vickery was usually the only effects supervisor on set, he made sure to communicate what he and Edwards were looking for as far as dinosaur movements and behavior in different scenes. For the scenes in the tunnels when the Mutadon dinosaur pursued several characters from the film, Vickery took on the role of a dinosaur squeezing into the tunnel and picking itself up after landing on the floor. “There’s a moment where it plants its hands on the floor, leans forward with real weight, and roars before charging,” remembers Roberts. “And for a lot of that, David or [animation supervisor] Steve [Aplin] would act out to really convey the emotion they wanted. I think we really benefited from that. We’re all very comfortable with each other and locked in and just really enthusiastic about getting that character into the creatures.”
For another scene near the beginning of the film where a hybrid dinosaur almost caresses a lab worker with its claw before killing him, an animator was filmed holding a water bottle, looking at it, sniffing it, giving it a quick touch, and then snatching it. Notes Roberts, “That was a wonderful, fun performance from our animators, where they were able to get a bit more emotion into the scene from their own performance, which then was applied to some of the hybrid creatures.”
(Credit: ILM & Universal).
Dividing and Conquering
Different ILM supervisors took lead roles for each major sequence in the film. Chan’s team took on many of the water-heavy sequences featuring the Mosasaurus and the Spinosaurus, and also developed the Distortus Rex. Coco worked on the Mutadon sequences in the market and the tunnels, as well as the T. rex chase sequence on the river, while Roberts tackled the beginning and ending of the film, as well as the cliff sequence featuring the Quetzalcoatlus.
Coco noted that splitting up the work into sections was helpful to their teams, so animators or compositors could go to one supervisor to ask a question, instead of having to approach multiple people to get the information they needed. Daily communication between supervisors and their teams of artists was also key throughout the production, as the team involved hundreds of people working in London, San Francisco, Vancouver, and Mumbai.
“It was very important for us all to hear what Gareth’s feedback was,” says Chan. “Because some feedback given on one dinosaur would also apply to another dinosaur in another sequence. And even though we were different teams, it was vital for us to still be sharing information about how we approach winged creatures or creatures in water — there were a lot of tips and tricks that we shared with one another.”
A library of shared assets documenting the workflow, along with an internal website, allowed everyone to understand what visual effects setups were established and ready to use and what they would need to create from scratch. This was especially helpful to Roberts and Coco, who joined the production after Chan. “A big part is sharing the tools up front to be on the same page about how we’re going to tackle things,” notes Roberts. “And then we have a number of chat groups for supervisors, as well as weekly meetings for each sequence and discipline.” Coco adds, “It was good to see what Gareth was looking for in a shot, or what was important for him in a particular environment, so I could follow that line.”
In one case, Roberts referenced the texture and amount of light in the sky in a night sequence at a gas station that the ILM team in London had worked on. That helped him to prepare a night scene his team had coming up. “We inherited that established look as a mood board of London’s work,” notes Roberts, “so that when our team came on, we could say, ‘We’re matching that. This is something that Gareth has already established. He likes this language for night.’ So we didn’t have to rediscover or explore that too much. It was, without ego, just sharing and following, taking London’s lead where they were ahead, and then we also presented some of our work when we were ahead, or when it was on us to establish a look. Very open communication made it a success and made it feel like it’s one team doing all the work together.”
Chat groups would also give supervisors an easy way to ask each other questions about how they might solve similar problems, especially in sequences where there was a bit of overlap between supervisors. To help with the time difference between London and San Francisco, Roberts and his team started their day early to increase the time the two teams were actively working.

Another vital piece of the ILM crew on Rebirth was the production team – visual effects producers and production managers – who made sure supervisor teams were properly staffed, flagged important deadlines, and blocked off time for teams who needed to develop a new technique or tool.
(Credit: ILM & Universal).
Putting it All Together
The challenges Jurassic World Rebirth presented for its visual effects supervisors were varied, ranging from dinosaurs interacting with simulated water to designing environments from multiple elements to satisfying a director well-versed in visual effects.
Coco’s team tackled the effects-heavy, intense action sequence where the Delgado family is chased by a just-awakened T. rex. While the river in the film is on a tropical island near the equator, these scenes were filmed at a British Olympic river course. “The T. rex interacting with the water, the digitally simulated water, and the family – it was a big, big moment,” notes Coco. “I don’t think a couple of years ago we would have been able to do it because of the turnaround time needed. We had amazing effects artists who turned around the simulated water effects in record time.”
The Quetzalcoatlus sequence, when Zora Bennett (Scarlett Johansson) and other members of her team climb down a cliff to retrieve a sample from an egg, had its own unique challenges – and not all of them dinosaur-related. The cliff and cave environment was put together from a mix of elements, including footage shot at the cave set at Shepperton Studios in England, footage shot at Jog Falls in India, and millions of gallons of digitally simulated water. Mixing footage shot on location, wider shots that were fully CG, and digital extensions on top of drone work became a bit of a puzzle for Roberts’s team to make into one coherent environment. Another important part of the process was striking the right balance, so the background isn’t pulling focus from the actors. “Even though it’s multiple elements and different sections, you want to create a continuous environment where the audience truly believes the actors are immersed in that backdrop,” says Roberts.
Other shots not involving dinosaurs also occasionally proved tricky to get Edwards’s sign-off on, in part because of his knowledge and appreciation of visual effects.
“Gareth has such a particular eye for blue screens that he can tell when a shot is a blue screen shot,” says Chan, “and for him, it’s successful when he can’t tell it’s a blue screen shot. So we are constantly trying to blend in, change lighting, include more atmospheric lens details, just so many little details that most people, when you think of just green screen or blue screen shots, wouldn’t even consider. Because Gareth wanted to make sure it never felt like a blue screen shot.”
Landing on the right scale for the dinosaurs was also an ongoing process for the visual effects supervisors and Edwards. “We’ve created these dinosaurs at a certain height and size,” notes Chan. “We put them in the shots the way they should naturally be at that size and height. And Gareth would look at some shots and say, ‘No, it doesn’t feel big enough.’ So we played this constant game of make it bigger, make it bigger, okay, that’s too big.
“One thing that Gareth just absolutely excelled at is scale and suspense,” Chan continues. “He knows how to compose every shot and frame to give you that sense. So to him, it’s less about the continuity and making sure things physically and scientifically look correct. It’s more about what makes the audience sit and look at something and feel that suspense. And so we worked with our animation team through many, many iterations of trying to figure out compositionally, what is the scale that works best for these shots?”
After months of hard work from teams across the world, the final product came together for the film’s release in July of 2025, giving both audiences and Rebirth’s crew an adventure to remember. “I think every person who worked on the movie, everyone that I talked to, always said it’s been a dream to work on it, because it is such an iconic movie,” says Coco. “And in many cases, they started in visual effects because of Jurassic, so they don’t do it just because of the work, but because they love it. And working on such a big and iconic movie, they put their heart into it.”
Director Gareth Edwards on location (Credit: Universal).
– Amy Richau is a freelance writer and editor with a background in film preservation. She’s the author of several pop culture reference books, including Star Wars Timelines, LEGO Marvel Visual Dictionary, and Star Wars: The Phantom Menace: A Visual Archive. She is also the founder of the 365 Star Wars Women Project, which includes more than 90 interviews with women who have worked on Star Wars productions. Find her on Bluesky or Instagram.
Teams from Andor, Sinners, and Avatar: Fire and Ash were recognized at the ceremony in Los Angeles.
The 24th annual Visual Effects Society Awards ceremony was held on February 25 at the Beverly Hilton in Southern California, where teams from Industrial Light & Magic earned three wins.
Avatar: Fire and Ash won Outstanding Environment in a Photoreal Feature for the Bridgehead Industrial City. ILM’s winners included Gianluca Pizzaia, Steve Bevins, Dziga Kaiser, and Zsolt Máté.
(Credit: ILM & 20th Century Studios).
ILM’s John O’Connell, Falk Boje, Hasan Ilhan, and Kevin George won for Outstanding Environment in an Episodic, Commercial, Game Cinematic, or Real-time Project for the Senate District in the episode “Welcome to the Rebellion” from Andor Season 2.
(Credit: ILM & Lucasfilm).
The feature film Sinners was recognized for Outstanding Supporting Visual Effects in a Photoreal Feature, and ILM’s Nick Marshall joined fellow winners Michael Ralla, James Alexander, Espen Nordahl, and Donnie Dean.
(Credit: ILM & Warner Bros.)
Congratulations to all of our ILM VES Awards winners, as well as to our Lucasfilm colleagues, who also took home the win for Outstanding Special (Practical) Effects in a Photoreal Project for their work on Andor.
ILM artists Ian Milham and Shannon Thomas take us behind the scenes in the second of a two-part story about the 2024 Summer Olympics in Paris and the 2026 Winter Olympics in Milan.
After ILM successfully created a landmark mobile deployment of its StageCraft virtual production system for the 2024 Paris Olympics coverage, it was only natural to up the ante for the next round. A continuation of ILM’s partnership with NBC, the 2026 Milan Cortina Winter Olympics in Italy presented a batch of creative challenges that introduced a new level of dynamic presentation to ILM’s imagery – everything from shooting with a real ice rink in the foreground to simulating continuous motion on the volume’s LED wall.
“It was like the band got back together with this one,” notes virtual art department (VAD) supervisor Shannon Thomas. “It was the same crew on the client side. This time the process was less educational in terms of how we interacted with the client or explained how best to use the LED volume. They now understood how it would work, so this was about executing the same kind of creative process but with all new sets. Right from the start, we were able to get down to the specifics of what they wanted to achieve.”
Milan Cortina: The Brief
As with Paris, ILM’s task was to create a series of LED volume loads that acted as backdrops for athletes from the United States Olympic team. Footage of the athletes striking poses and performing actions in each setting would be utilized for any number of promotional needs before and during the Games. NBC brought six ideas for locations to the ILM team, and together they brought new layers of complexity to the job of realizing each of them.
A majority of the settings featured natural, snowy landscapes of the Italian Dolomites, including a mountain top, a frozen pond in a small valley, a ski lodge that looks out across an alpine vista, and a “moving” chairlift up a mountainside. Additionally, two Milan locations incorporated local landmarks, the Piazza del Duomo with its namesake cathedral and lustrous Galleria, and the interior of Teatro alla Scala, Milan’s 18th century opera house, which included an ice rink on the stage.
Initial challenges included the need to accommodate a massive scale of mountain landscapes within the visible scope of an LED volume wall. There was also the active motion of the athletes, a key difference from Paris. Ice skaters and hockey players would actually be moving across the small ice rinks built as practical sets in front of the volume wall. Not only did ILM’s technology need to coexist alongside the ice, but the camera needed to track effectively with the talent’s movements while also maintaining the appropriate sense of depth with the background.
As with Paris, the stylistic brief was to create a sense of hyperrealism, with accentuated lighting and staging. But the natural settings required a new level of realism to feel believable, compared to the Paris sets which were more idealistic. ILM artist Giles Hancock joined the NBC team on a location visit to Italy in early 2025. “Giles captured a ton of photo and video reference plus LIDAR scans of the actual locations,” says ILM virtual production supervisor Ian Milham. “We could then use that as the basis for the content that we made for the wall. That involved someone staging a camera tripod in different spaces, and even riding a chairlift, as well as some data from a helicopter flight.”
Setting the Scene
Both ILM’s past experience and its established relationship with NBC empowered the team’s ability to begin planning from a very early stage. “We did quite a bit of animated previsualization,” says Milham. “The athletes were actually performing their sport, and there were concerns about space and safety, of course, but we also needed to make sure that we could follow and track them in a way that did justice to their real movements.”
Understanding the parameters for the shoot, what Milham calls “the edges of what’s possible,” ensures that the ILM team can deliver everything that could potentially be needed on the day. The client is likewise empowered to envision whatever they can within those set parameters, as well as plan the practical foreground elements. And, of course, there are times when both the client and ILM work to push the established boundaries and see how much more they can do.
For the digital backgrounds, individual settings required their own distinct finessing, in particular the Piazza del Duomo. “Lead hard surface modeler Masa Narita modeled all of the buildings for that entire city square from Giles’ scans, and texture artist Maria Cifuentes textured that entire set as it looked in real life in Milan,” Thomas explains, “and I then lit the real time set based on our digital sky select, and incorporated where the practical ice rink would have to live inside the volume. I then had the new challenge of determining where you would have to physically be, if you were in front of a camera in the real Piazza square, in order to also see the top of the Duomo’s tallest spire. We quickly realized that even with a wide focal length we needed to pivot in order to ensure we could see the whole building.”
Because Milan’s Duomo is over 600 years old and such an important national landmark, it was necessary to make sure that the building’s recognizable shape could be seen in its entirety. This resulted in a slight increase in the size of the LED volume from the one used for Paris. ILM’s R&D engineers and technical teams designed a virtual LED volume tool which allowed Thomas and his team to instantly add more rows or columns of panels, all adjustable in real time to ensure that the physical LED volume built on the day would capture the full height and beauty of the Duomo.
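The panel arithmetic behind a tool like that is simple to sketch. The half-meter panel size and the backdrop dimensions below are assumptions for illustration only, not the figures ILM actually used, and `panels_needed` is a hypothetical helper:

```python
import math

# Illustrative sketch: sizing an LED wall in whole panels. The half-meter
# panel size and the backdrop dimensions are assumptions for this example,
# not the dimensions used on the production.
PANEL_SIZE_M = 0.5  # assumed edge length of one LED panel

def panels_needed(width_m: float, height_m: float) -> tuple[int, int]:
    """Columns and rows of panels needed to cover the requested backdrop."""
    cols = math.ceil(width_m / PANEL_SIZE_M)
    rows = math.ceil(height_m / PANEL_SIZE_M)
    return cols, rows

# If framing the Duomo's tallest spire called for, say, a 12 m x 7 m wall:
print(panels_needed(12.0, 7.0))  # (24, 14)
```

Rounding up with `ceil` mirrors the physical constraint: you can add a whole row or column of panels, but not a fraction of one.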
(Credit: ILM & NBC Sports).
For the opera house interior, Hancock’s photogrammetry data provided a useful foundation. Narita and Cifuentes again created a photoreal CG asset for the space, which Thomas and lead virtual production technical director Rey Reynolds then staged in appropriately gold-tinged warm light, including the lustrous digital chandelier. On the set, practical red curtains, some 30 feet tall, augmented the background. “They could open them like real stage curtains in camera to reveal our LED content,” Thomas explains, “so that it felt like you were on a real stage, and behind it was the virtual opera house, an ice skater on real ice, fake snow, and real movie lighting. It was so cool. It looked like magic. It’s a great example of using the tool exactly how it should be used, not forcing it.”
The ice itself was roughly 20 feet long by some 40 feet wide. A refrigeration unit was specially designed into the space. A fair amount of research and planning was involved in determining the necessary space required for the athletes to get up to speed and perform. “Hollywood is crazy in terms of what they can do onsite,” says Milham. “You have to keep the room cold, of course. It’s all internally cooled like a hockey arena. It’s logistically difficult in terms of how things are built. You have to install the wall first, then build the rink, then leave the rink for a couple days so it freezes.” Practical snow was also used on multiple sets, which, as Milham notes, “adds the magic of it falling onto people or the sense of depth as they travel through it.”
The most elaborate practical foreground set was for the patio deck at the ski lodge, which included an actual fire pit, furniture, string lights, a surrounding fence line, and even a small tree that needed to match ILM’s CG counterparts. The effects team initially decided to create the background as a standard 2D matte element, but when the client planned more dynamic camera movements for the scene, ILM pivoted to an elaborate 3D environment. Senior VAD artist Nate Propp led its creation, which allowed the team to make specific changes to the mountain view as needed. And Thomas notes, “[VAD supervisor] Christy Page even added little cars driving in the town, so you could actually see little headlights moving out there. The lights flicker in the town as well.
“There’s enough of the practical elements to tie you into the background,” Thomas continues. “The way they shot it works really well with the real fire pit and cabin structure, fence, trees, etc. Though if you move the camera just a couple feet to the left or right, you see all of the structural wood boards of the practical set build, like it’s a high school play. But in-camera, the shot works – it looks like they’re really there.” Actor Scarlett Johansson and Olympian Lindsey Vonn would be among those to shoot on the lodge set.
(Credit: ILM & NBC Sports).
Adjusting the Frame Rate
A significant technical change for the Winter Olympics production was capturing footage at 48 frames per second (fps). The higher frame rate would allow the client to retime the imagery to varying speeds as needed, particularly with athletes zipping about on the ice.
“The difficulty with that is when shooting with a StageCraft LED screen, you need very exact synchronization between your camera, the content, and the wall,” Milham explains. “So with a higher frame rate, that’s orders of magnitude harder because you have to sync to 1/48th of a second instead of 1/24th of a second. Your content needs to do all of its transformation in half the time and everything else with it.”
Without the proper synchronization, a number of issues can arise, including a flicker effect visible on the LED volume, as well as on the foreground subjects and elements, because the wall acts as both a background and a light source. On the previous Summer Olympics production, the cameras ran at 48fps but the wall content projected at 24fps, which meant there were limitations to how the final footage could be adjusted after the fact. At NBC’s request, Milan Cortina became ILM’s first all-48fps volume shoot.
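The timing pressure Milham describes comes down to simple arithmetic: at 48fps, every stage of the pipeline gets roughly 20.8 milliseconds per frame instead of 41.7. The function names and the 30 ms render time below are purely illustrative, not part of ILM’s system:

```python
# Illustrative sketch only, not ILM's pipeline: the per-frame time budget
# for rendering and displaying wall content halves when moving to 48 fps.

def frame_budget_ms(fps: float) -> float:
    """Time available per frame, in milliseconds."""
    return 1000.0 / fps

def misses_deadline(render_time_ms: float, fps: float) -> bool:
    """True if a frame takes longer than its slot, risking visible flicker."""
    return render_time_ms > frame_budget_ms(fps)

print(round(frame_budget_ms(24), 1))  # 41.7 ms per frame at 24 fps
print(round(frame_budget_ms(48), 1))  # 20.8 ms per frame at 48 fps

# A renderer that comfortably holds 24 fps can fail at 48 fps: a 30 ms
# frame fits a 41.7 ms slot but overruns a 20.8 ms one, and the camera
# shutter can then catch a partially updated wall.
print(misses_deadline(30.0, 24))  # False
print(misses_deadline(30.0, 48))  # True
```

The same halved slot applies to the genlock between camera, content, and wall, which is why a missed deadline shows up as flicker both in the background and on anything the wall is lighting.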
ILM imaging supervisor J Schulte and principal engineer and architect Nick Rasmussen coordinated weeks of rigorous testing for the chosen Alexa 35 camera system. The results provided much greater flexibility, especially with the demands of the Winter Olympics settings in mind. “It’s important to have if you’re really flying the camera around, like a push-in or a big crane move,” says Milham, “and we did some gigantic crane moves on this project.”
As they had with the Paris shoot, ILM ran three separate renderers to allow for quick changes between scenes. But in this case, two projected to the LED wall at 48fps, while the third projected at 24fps for set-ups that involved audio capture. Interviews and related scenes with dialogue would not require dramatic adjustments in speed.
(Credit: ILM & NBC Sports).
Just Like the Old Days
Perhaps the most distinctive of the volume loads for the Winter Olympics involved a chairlift that appeared to be moving through the Dolomites. The scene was initially planned as a standard interview set-up, with three locked-off cameras positioned to cover two or three subjects in the chairlift. This would require a moving background on the LED wall that ran for an extended time.
“NBC didn’t want to cut, because they might intercut the footage with other material, and then cut back to it,” Thomas recalls. “They wanted to just roll and let the people talk and get comfortable. They might only use a piece of it three or four minutes in. That means you need to run a lot of footage, and it has to loop at about eight-minute intervals. When you think of a Star Wars movie or something like that, you might have around 2,000 shots that a team of hundreds executes over months or years. Most of those shots are a few seconds to maybe 20 seconds long. We had to render eight minutes of straight looping footage in a fraction of the time with one or two artists. It was a ton of work and data to maintain.”
Nate Propp again took the lead, visualizing a system that placed two opposite rows of mountain landscape moving in parallel on either side of the talent, not unlike a conveyor belt. “It didn’t matter if the mountains weren’t the exact layout of the Dolomites. We used all of Giles’ scans to piece together a long track of digital mountains that felt like the Dolomites, versus having this specific mountain peak perfectly line up to that one. Our earlier Paris Olympics set work aided us here, so as long as we captured the essence of the Dolomites, it worked,” Thomas notes. “The conversation with the talent was the real focus. So Nate laid it out and duplicated a mountain landscape like railroad tracks that could loop, basically to infinity. We could then place the ski lift in with the camera and ride this imaginary track for as long as they wanted.”
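The wrap-around logic of that conveyor-belt track can be sketched in a few lines. The tile length and lift speed here are invented numbers for illustration, not production values, and `track_offset` is a hypothetical helper:

```python
# Illustrative sketch of a "railroad track" loop: the background's travel
# distance wraps around the length of one duplicated landscape tile.
# Tile length and lift speed are invented numbers, not production values.
TILE_LENGTH_M = 500.0   # assumed length of one duplicated mountain section
LIFT_SPEED_MPS = 2.5    # assumed chairlift speed, meters per second

def track_offset(t_seconds: float) -> float:
    """Position along the repeating landscape, wrapped to a single tile."""
    return (LIFT_SPEED_MPS * t_seconds) % TILE_LENGTH_M

# Because the offset wraps, the ride can continue indefinitely: after one
# full tile (200 s at this speed) the background is back where it started.
print(track_offset(0.0))    # 0.0
print(track_offset(200.0))  # 0.0

# For a seamless eight-minute loop, the travel should cover a whole number
# of tiles so the last frame matches the first; 2.4 tiles would need the
# speed or tile length adjusted.
print(LIFT_SPEED_MPS * 8 * 60 / TILE_LENGTH_M)  # 2.4
```

The modulo wrap is the same trick as any scrolling background: as long as the tile’s two ends line up, the seam never shows, which is why the eight-minute duration constrains speed and tile length rather than total geometry.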
It required a great deal of experimenting to then determine the best means to render such a lengthy scene at 4K without overloading the machines. Additionally, there were concerns about maintaining consistent lighting. “What can break that set-up is not necessarily the background, but rather the feeling that the people aren’t actually moving through it,” Milham says, “and that’s usually because the lighting on them isn’t changing enough.” Coordination with the on-set, practical lighting team allowed them to find the appropriate balance. The volume itself can be equipped to synchronize directly with a traditional lighting grid.
The chairlift set-up became even more elaborate when it was decided to actually move the formerly stationary cameras during the scene. This required ILM to create additional pieces of the landscape on the fly to fill in previously unseen gaps. “It’s all our system and our artists,” Milham notes, “so we’re able to do that at the last minute.” That included associate virtual production supervisor Brad Watkins, who partnered with Milham in directing the team on the stage.
The final results, as Thomas points out with a smile, “are the same as Alfred Hitchcock’s background of Mount Rushmore in North by Northwest. It’s the same magic trick they pulled off in 1959. But now you can move the camera and get parallax.”
The Art of Collaboration
During pre-production, the ILM crew demonstrated their various plans for the set-ups to NBC. They showed the chairlift scene last. “We had the initial presentation that showed what you’d actually see from each camera,” Thomas explains, “and then we showed what the footage was actually doing on the site, this incline into infinity. Ian is pitching it and explaining what can be accomplished. And one of the folks from NBC turns to his colleagues and says, ‘Guys, this is it! This is going to work!’ We were so happy that they liked it, in particular, because we knew how important this specific set was for their vision. I’ve learned through the years that, with any client, you want to really listen to what is important to them, and then hit that specific note to reinforce that you are a team working together to achieve a shared goal. That is what builds confidence with your client.”
Throughout these Olympics collaborations, the key for ILM has been a mix of flexibility and adaptability in meeting the client’s needs. The continuous, energetic shooting style further demonstrated the versatility of ILM StageCraft, and likewise ILM’s ability to serve any client.
“There are ‘unlocks’ here in terms of what is possible with last-minute scenarios, or in-demand talent, in terms of pulling off an ambition that otherwise would not be possible,” Milham concludes. “It doesn’t have to be Star Wars. You can use this technology to make sets appear very, very fast, and to take advantage of a small window of time with talent, all without limitations, and we can do it anywhere.”
—
Lucas O. Seastrom is the editor of ILM.com and Skysound.com, as well as a contributing writer and historian for Lucasfilm.
ILM artists Ian Milham and Shannon Thomas take us behind the scenes of a breakthrough virtual production shoot in the first of a two-part story about the 2024 Summer Olympics and 2026 Winter Olympics.
By Lucas O. Seastrom
American viewers of the Milan Cortina Winter Olympics on NBC and Peacock have experienced a number of striking visual effects created by Industrial Light & Magic, whether they realize it or not.
Dynamic promotional footage of American athletes in snowy Italian landscapes and on the ground in Milan is in fact all shot in front of an ILM StageCraft virtual production volume. The achievement is part of a continuing story of ILM’s work to broaden the applicability of its virtual production toolset. These latest Olympic Games are the second to be showcased in such a way. ILM also partnered with NBC for the earlier 2024 Summer Olympics in Paris, an undertaking that won multiple Emmy Awards.
(Credit: ILM & NBC Sports).
Paris: The Brief
The story of ILM and NBC’s collaboration for the Olympics actually begins in 2021 with a distinctly American sport: football. “As people were hearing about our virtual production work on The Mandalorian, we talked with lots of different groups and did some work with them, including with NBC Sports for Sunday Night Football,” recalls ILM virtual production supervisor Ian Milham.
With the need to capture singer Carrie Underwood performing in multiple environments and in quick succession, ILM deployed a version of its StageCraft volume, which provided greater flexibility than a traditional blue screen. It proved a meaningful exercise in developing a different kind of story for a client with different needs than a feature filmmaker. “The following year, NBC was exploring ways to level up their work,” Milham explains, “and they reached out and asked if these tools could be put to further use.”
What NBC proposed for the Summer Olympics in Paris was far more ambitious than the Sunday Night Football production. Dozens of athletes – from swimmers to gymnasts to javelin throwers – would be captured in multiple Paris locales at twilight: a street, a riverside, a rooftop, a fashion show-esque runway at the foot of the Eiffel Tower, and a virtual trip down the Seine River during the event’s opening ceremonies. The resulting footage would be adapted into short-form clips used for promotional spots before and during the Games.
“NBC’s goal was to get a lot of footage in different contexts of all these athletes looking amazing in a world that is aesthetically heightened,” says Milham. “Along with that, the DP/director [Scott Duncan] wanted to continuously run the camera in order to keep things improvisational with the athletes. You have to shoot all the time and capture lots of different things. It’s not like a feature where you’re going to board and previs everything in advance. That was our biggest challenge to deliver on. The director wanted no rules in terms of flexibility with shooting and NBC wanted a large amount of usable footage.”
(Credit: ILM & NBC Sports).
A Different Kind of Volume
The Summer Olympics production was the debut of a new variation of ILM StageCraft. “We had invented this really cool thing that people wanted to use, but Star Wars was always using it,” says Milham. “So we needed to make another version that wasn’t limited to one place. It would be a huge advantage to bring StageCraft to the client. So we created a mobile system, which was deployed for the first time with the Paris Olympics.”
The volume itself can be adapted to any size, its “tiles” – LED panels – adjusted to the needed shape of a given set. Created by ILM’s virtual production team based in San Francisco, the entire infrastructure is built to move, “like a set-up for a rock concert,” as Milham puts it. For the Paris Olympics, a roughly 180-degree curved wall was constructed to a height of approximately 25 feet. This specific production involved the extensive use of foreground set pieces that needed to blend seamlessly with the virtual background, as did the elaborate practical lighting set-ups.
“StageCraft isn’t just one thing,” Milham adds. “It’s a lot of different tools and techniques. Sometimes we use a little of it or a lot of it, whatever is needed.”
(Credit: ILM & NBC Sports).
Putting the City of Light on the Screen
Virtual art department (VAD) supervisor Shannon Thomas and a team of artists responsible for creating the settings visible on the volume’s tiles began work in November of 2023, some nine months ahead of the actual shoot. A four-year ILM veteran with 20 years across the industry, Thomas brings experience from a number of different effects houses, including Rhythm & Hues and Weta FX.
He “came here for Star Wars,” as he puts it, reflecting on recent projects he contributed to like Star Wars: Skeleton Crew (2024-25) and The Mandalorian and Grogu (2026). “I came here to work on the volume and be involved in real-time virtual production, future-tech projects, and to get back into film work.” But Thomas admits with a smile that he is also a big Olympics fan and was happily surprised to join the team for Paris, it being his favorite city to boot.
NBC’s brief for the City of Light was different from a usual feature film in that realistic accuracy was not essential. The ILM artists would not be required to match the layout or appearance of a specific location in Paris, but rather capture “the idea of Paris,” as Thomas notes. “That goes all the way down to what kind of chair we need to have in front of a café. If it feels like Paris, then we have it. It allowed us to work faster as well.”
“We’re not going with photo-realism,” Milham adds. “It’s in a style that’s more like a glamorous photo shoot, a larger-than-life situation.”
The team spent considerable time determining the best digital skies, ultimately landing on the right blend of pinks and blues during magic hour for each set. For the street setting, former senior VAD artist David Flamburis took ownership and, rather than evoking a specific neighborhood, created a fictitious location full of Parisian charm, with fantastical views of the Eiffel Tower. The iconic structure itself was also a subject of considerable study, in particular how best to light it. Existing Eiffel Tower assets from earlier ILM productions were useful for reference. Along the Seine River, ILM changed the water’s width and depth as their artistic needs demanded.
Initially developed with commercial real-time software, the environments were then ported to ILM’s proprietary Helios renderer for volume projection. New advancements allowed for enhanced rippling, refraction, and reflection in the river water, which was sometimes augmented on the live action set by practical techniques, including a small tub of water with shards of glass. It was all in service of what the team dubbed “hyper-realism.”
According to Thomas, the opening ceremony load was probably the most challenging to create. “Everyone knew this would involve various boats for each country’s team going down the Seine, which was a very cool idea. The big challenge was the crowds, which is always a tricky thing in real time. We had to figure out the logistics of how many boats, how many people, and those types of things. We had tight resources throughout the project so we had to work very closely together to determine how things needed to be depicted.
“Senior VAD artist Nate Propp came up with a very clever solution that allowed us to color-coordinate the crowds, per country color in sections, as if they were fans from each country peppered around the set,” Thomas continues. “The digital crowd also had controls for how much they would cheer, including waving flags, holding signs, etc. For the distance we’d see them from camera, we knew the trick would work.”
ILM created a roughly half-mile stretch of river that was necessarily fictitious in layout. To determine the best speed, Thomas actually contacted a Parisian Bateaux Mouches boat tour company to gather research. “I told them that my parents were planning this trip to Paris, and they wanted to go for a ride on the Seine, but they get really seasick,” he notes with a laugh. “How fast do they go? Is there a lot of motion? And the company wrote me back! About nine to 12 knots was the average speed. Then we could design the movement of the boats that way and it worked really well, as it’s always best to work from reality and adjust from there.”
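The boat-speed research Thomas describes reduces to a simple unit conversion. As an illustrative sketch with my own numbers (not ILM’s pipeline), the tour company’s nine-to-12-knot figure can be turned into meters per second to drive the CG boats at a believable pace:

```python
# Illustrative arithmetic: convert the reported boat speed from knots
# to meters per second for animating the digital Seine cruise.

KNOT_IN_MPS = 0.514444  # 1 knot = 0.514444 m/s (1,852 m per hour)

def knots_to_mps(knots: float) -> float:
    return knots * KNOT_IN_MPS

avg_knots = (9 + 12) / 2  # midpoint of the reported 9-12 knot range
print(f"{avg_knots} kn is about {knots_to_mps(avg_knots):.2f} m/s")
```

Starting from a real-world figure like this and adjusting to taste mirrors the team’s stated approach of working from reality first.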
(Credit: ILM & NBC Sports).
How to Shoot with Lots of People in Many Places Very Quickly
Compared to a typical day on the StageCraft volume set of The Mandalorian, ILM’s crew for the Summer Olympics had to capture roughly four times the amount of live footage. During a massive production that involved dozens of athletes moving between six different locations on the Universal Studios lot in Hollywood, ILM’s volume stage welcomed 58 individuals over a six-day shoot. Some athletes were only on hand for a matter of minutes, requiring an unprecedented level of flexibility to make quick changes. The ILM crew executed over 120 scene changes on the volume’s wall without any waiting time required.
“The on-set grips were the real heroes with all of those changes,” Milham notes. “They had to move physical sets in and out 120 times. The practical art department worked with us throughout that process.”
The key to ILM’s flexibility was dividing its rendering power into subgroups. Whereas a cinematic-scale production like Star Wars would devote all of its rendering capacity to one volume load used for hours at a time, the rapid pace of the Olympics shoot led the crew to devise a new solution. Three separate renderers, each capable of powering the entire LED wall on its own, were loaded with distinct settings. When the client requested a scene change, all the ILM team had to do was switch the feed over instantaneously.
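The subgroup architecture can be sketched abstractly. This is a hypothetical illustration (my own class and method names, not ILM’s software): each renderer preloads one environment, so a scene change is just re-pointing the wall’s input feed rather than reloading content in front of the client.

```python
# Illustrative sketch of the multi-renderer set-up: three preloaded
# renderers, any of which can drive the whole LED wall; switching the
# active feed is instantaneous because nothing needs to load.

class Renderer:
    def __init__(self, environment: str):
        self.environment = environment  # loaded ahead of the shoot

class LEDWall:
    def __init__(self):
        self.active = None

    def switch_feed(self, renderer: Renderer) -> str:
        self.active = renderer  # no load time: content is already resident
        return self.active.environment

wall = LEDWall()
feeds = [Renderer(env) for env in ("street", "riverside", "rooftop")]
print(wall.switch_feed(feeds[1]))  # -> riverside
```

The design trade-off is deliberate: less rendering horsepower per load, in exchange for zero wait between the 120-plus scene changes of the shoot.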
“Scott Duncan is an amazing and inspiring person to work with, who films shows like Survivor,” Thomas says about the production’s combined director and cinematographer. “He’d make changes live on the set, and would just pick up the camera and want to shoot something. The stage team would have to keep up. It was a quick, iterative process, very freestyle, like indie filmmaking, which I love. They’d shoot and just keep the camera rolling. Where in a feature film you’re focusing on getting a whole take or a specific scene, in this case they’re looking for just a few seconds of something amazing that they can use to then stitch into their longer marketing narrative.”
(Credit: ILM & NBC Sports).
Real Time Revisions
Not only could ILM make rapid changes between entire set-ups, but they could even make live alterations with the CG background itself. When the lighting team incorporated the tub of water with shards of glass in the Seine River locale, the stylized, caustic light initially felt jarring, more like a swimming pool. So to help balance the effect, the ILM artists plussed the scene with additional lighting along the riverside.
“We added some bright white lights along the river in the background, just like the lights along the side of a pool,” says Milham. “It helped to fit the swimmer in the scene because you could imagine that one of those lights was right next to her. Do we really care that the Seine doesn’t have those lights? Not in this situation. We’re just trying to make it look awesome. That’s the artifice of it. It’s okay if it looks like a dream.”
Incorporating these details within the Helios environment took a matter of minutes, all while the scene was still loaded on the volume wall. “The moment is over if it takes hours, so we have to do it right there,” Milham adds. “I’ll be there on the radio calling in the changes and adjustments as the shoot is taking place.”
(Credit: ILM & NBC Sports).
A Special Guest Introduction
“I had gone up to the stage during the shoot, and everyone seemed really happy with how things were going,” remembers Thomas. “Then our producer, Shivani Jhaveri, just mentioned, ‘Oh, Steven was here yesterday and he loved it.’ I said, ‘Steven?’ and she’s like, ‘Spielberg!’ What?! I laughed, ‘How?’ It was an unexpected surprise to hear he was pleased with the work, what a blessing to have him involved.”
Not long before the shoot was set to begin, NBC arranged for director Steven Spielberg to film a special introduction for the Olympics on ILM’s volume set. The special moment required yet another new way of presenting a scene on the LED wall. Spielberg would walk on from the side, with the blank wall and its surroundings visible behind him, and then as he came to center stage, the Parisian riverside location would load.
“We had just shot The Fabelmans with him, and he understood the process, so I think he trusted that it would work well,” says Milham. “And because this clip involves Steven Spielberg, the ‘filmmaking’ of it all can be part of the story. So Steven began walking outside the volume, as if he were on a movie set, because he was, of course, and then we turned the environment on. It was a relatively unique use of the technology.”
Milham describes the required process as “relatively easy,” an extension of their existing multiload capacity. They simply closed the video feed for the riverside scene to make the wall appear blank, and then turned it on again at the right moment.
Spielberg himself likened the grand show that is the Olympic Games to a great story, something that felt close to home for the ILM team, as Shivani Jhaveri notes. “The theme that Spielberg talks about in his opening is so relevant to StageCraft,” she explains. “There’s a connection in that StageCraft is all about telling a story. It was all about telling the athletes’ stories, where they’ve come from, where they are now, and it was really special to see all that.”
“If something new is needed, we’ll invent it.”
The success of ILM’s work on the Paris Olympics project was thanks to a relatively small team, especially compared to a feature film or episodic series. Along with Milham, Thomas, and Jhaveri, some of the other leading crew members included lead virtual production technical director Rey Reynolds, CG supervisor Sam Wirch, capture supervisor Ted O’Brien, and lead operator Kelly Fan.
“One of the reasons that ILM has been around for 50 years is that we’re not married to the way things are,” says Milham. “If something new is needed, we’ll invent it. If something we’ve been doing forever needs to change, we’ll change it. We will adapt. Even in the short amount of time that this method of shooting has existed, we’ve completely transformed it. One of my favorite things has been working with all sorts of different filmmakers, storytellers, and clients who tell us, ‘That’s great, but it needs to do this…’ or ‘Have you ever thought about trying this?’ And we try it. That happens on every show.”
ILM’s Olympics story continues with the 2026 Winter Games in Milan, Italy. Watch ILM.com for a behind-the-scenes look at this production, which included brand-new innovations.
—
Lucas O. Seastrom is the editor of ILM.com and Skysound.com, as well as a contributing writer and historian for Lucasfilm.
ILM.com is showcasing artwork specially chosen by members of the ILM Art Department. In this installment of a continuing series, three artists from the San Francisco studio share insights about their work on the 2025 mixed reality playset from ILM and Lucasfilm, Star Wars: Beyond Victory.
Art Director Stephen Zavala
(Credit: ILM & Lucasfilm).
Beyond Victory required a hub where our characters would live and roam. That’s how the garage was born as an idea. It served as an HQ where the player could come back after starting and finishing every quest. Like all concept work, one needs to find their footing, and this is done by providing several ideas for a particular story beat or design need. Challenges arise once we see the space in VR, since spaces have a tendency to look smaller or larger than we originally imagined. Once we see it in a virtual space, it’s all about adjusting it to a scale we’re comfortable with.
The director, Jose Perez III, really wanted a place in the middle of nowhere. I tried to capture that, but it was also important to make sure it didn’t look abandoned. It’s isolated, but lots of activity happens inside the garage as well as the surrounding areas. I always liked the idea that it was a hub where all kinds of visitors would come and go, either to fuel up or repair their speeders, and bringing with them all kinds of cool stories.
I wanted to design a place with a sense that it’s been lived in for quite some time. It wasn’t meant to be in disrepair, but instead have that sense of daily life and how it can be messier than we’d like to admit when it comes to managing our spaces. It certainly was satisfying how the garage slowly grew into that exact idea.
All art pieces come with challenges. When in doubt, you reference, step back, or subject your designs to peer review. A brief pause often provides time for introspection on how to adjust the course.
Senior Concept Artist Casey Straka
Volo is our main/player character, and there were a few different physical traits they had to have, storywise; mainly, they had to be on the smaller side for podracing prowess, and have four arms for some specific game mechanics they wanted to incorporate. Besides that, it was a very open brief.
We considered multiple different species for Volo at first, lots of mulling over, lots of options that didn’t feel quite right. I proposed a Nikto, since they are such a varied species in the galaxy and have different subspecies and evolutionary traits depending on where they’re from. Maybe one subspecies evolved an extra pair of arms, which was a trait we needed. We landed there, but I don’t think we stayed there in the end, so I think Volo is something new altogether. I mostly took inspiration from previous Star Wars heroes! I wanted Volo to be very appealing, like you’d want to be their friend after you spend enough time following their story. I did a lot of additional drawings of Volo to find their mannerisms and expressions, the little things that make them, them.
Volo’s outfit was fun to do; I love a good Star Wars jacket. They also have flexible spines on their head, and how those move according to emotions ended up being inspired by cats; they flatten to their head when scared, flare out when angry, droop a little when sad.
In a very technical sense, a goal I set for myself was to hit a new benchmark in terms of skill; I learned a lot about my own process on this project. But one goal I always set for myself is to make a character people can get attached to. That’s always the most important part to me. I think all designs have their struggle points, some more than others. If something isn’t budging I try to take a walk. Concept art is a lot of problem solving, and getting distance from it to work something out can help.
Senior Concept Artist Evan Whitefield
(Credit: ILM & Lucasfilm).
The design draws inspiration from components of several different TIE fighters. Both engines are based on the TIE Bomber’s twin ion engine thrusters (ordnance pods), with TIE Interceptor wings mounted on each engine to give the vehicle a more aggressive silhouette.
The cockpit functions as the control pod and was cut down and reworked to feel more dangerous, almost chariot-like in form. Additional elements, including the energy binder plates, rear thrusters, steel control cables, and air intakes, were carefully integrated to create a seamless fusion of podracer and TIE fighter design language.
This vehicle wasn’t originally planned. It emerged naturally as a concept I thought would be fun to play in-game. Early on, I imagined the original owner as a former Imperial who went rogue and turned to podracing, scavenging parts from Imperial fighters to construct what became known as the TIE Bolt. As the concept evolved, the final story became that the TIE Bolt was a custom podracer created as a gift for Imperial Admiral Rellen by Grakkus Jahibaki Tingi.
ILM’s innovative approach leads the way for more than 1,700 visual effects shots, helping bring Wright’s dystopian action thriller to life.
By Clayton Sandell
(Credit: ILM & Paramount).
When director Edgar Wright was gearing up to make The Running Man (2025) and considered the extensive visual effects the story would require, he turned to a fellow filmmaker for advice.
“I’m friends with Gareth Edwards, and I was really taken with the work on The Creator,” Wright says in an interview with ILM.com. “Especially the idea of shooting on location and then designing the environments after the fact. I was really impressed by how the visual effects work was put into more naturalistic, grounded camerawork. I wanted to pick his brain about how exactly it was done.”
In The Running Man, a science-fiction thriller set in a near-future dystopian America, blue-collar worker Ben Richards (Glen Powell) desperately needs money to buy medicine for his baby daughter. He signs up with a TV network, hoping to compete on one of their game shows. He is picked for the most dangerous one, in which contestants try to evade capture for 30 days in exchange for $1 billion.
After chatting with Edwards, Wright decided The Running Man should utilize the same unconventional approach that ILM brought to The Creator, winner of multiple awards for best visual effects, including from the Visual Effects Society.
Shooting on The Running Man began in early November 2024. With a release date rapidly approaching just a year later, the pressure to meet deadlines was on every department, including visual effects. Wright says he was happy the project reunited him with Academy Award-winning production visual effects supervisor Andrew Whitehurst. The two worked together on Wright’s 2010 film Scott Pilgrim vs. the World. “I remembered very fondly working with Andrew, so that was just an amazing, fortuitous bit of kismet,” Wright says.
The filmmaker credits Whitehurst and visual effects producer Sona Pak for preproduction planning, which kept everything on track. “Andrew and Sona were very clear on how to make this work, and how it would even be possible to turn around something this quickly with so many visual effects shots,” Wright recalls. “They had a very clear idea of what we were trying to achieve before we started shooting. What was really good was making decisions early on and sticking to them. I think where things can go awry – especially on a compressed schedule – is if you’re still working out what you want to do after you’ve finished filming.
“I’m frankly really amazed that we managed to do everything we did in time,” Wright adds.
(Credit: ILM & Paramount).
Ben Richards’ 30-day fight to survive begins in Co-Op City, with the journey taking him to New York and Boston. Exterior scenes were shot mostly in real locations in London, England, and Glasgow, Scotland, as well as on practical and backlot locations in Sofia, Bulgaria.
“Most of the initial meetings and discussions were centered around the places we were thinking of shooting, and the things we were thinking of building,” Whitehurst explains. “When that started to solidify, it became much clearer who was actually going to do what, and what was physically buildable, and what wasn’t.”
Fans of Andor (2022-25) may notice that the Canary Wharf section of London makes an appearance in The Running Man, disguised by a number of digital enhancements. “We did have nicely filmed places where the majority was real, which is always a great starting point,” says ILM visual effects supervisor Dave Zaretti. “Then you’re extending upwards into the distance. You can change the sky a little bit, but it was based on truth and reality, and nicely chosen locations. The team had a blast.”
Another shot set outside the fictional network headquarters begins at the real entrance steps leading to Wembley Stadium, but then transitions to a completely CG skyscraper as the camera tilts up. “It’s very funny to me to take one of London’s most famous landmarks and digitally erase it from the movie,” Wright laughs. “That’s an example where we’re starting with a real shot of Glen Powell and all the extras walking up the steps, and then the camera just keeps going and going. That was really the methodology throughout. It was about keeping it grounded, because the perspective of the story is that you’re very much seeing it from Ben Richards’s viewpoint.”
Visual effects contributed significant digital building extensions, crowds, street signs, lampposts, traffic lights, and even flying mailboxes. Cars from the 1980s era were digitally augmented with designs that more closely fit the story’s futuristic aesthetic. “James Mohan and Ashley Pay deserve huge credit for taking on the lion’s share of world-building, from city extensions to augmented traffic lights, road markings, and uptown car augmentation,” Zaretti says.
For a rooftop sequence where Richards tries to escape from a Boston hotel, full CG city recreations were combined with live-action footage shot on a partial set against a green screen. Another scene that appears to be a single take is actually three, joined with invisible digital seams. In Boston, Richards runs out of an apartment and down a hallway – dodging gunfire and heroically sliding into an elevator – before reappearing to smash the lens of a pursuing rover camera.
“That was three separate takes that we had to marry together,” says Whitehurst, revealing that Powell appears in the first and last parts of the shot, while a stunt performer completed the floor slide in the middle. “That stuff is pure invisible effects. You need to get them all into position and use CG where you need to. We had a CG digi-double take over between the different poses that weren’t quite matching across the takes. It was a fun shot.”
(Credit: ILM & Paramount).
Digital doubles and extensive face replacements were used during chases and a pivotal moment where Richards narrowly escapes an explosive head-on collision, plunging from a bridge into the river below. The film’s finale features a completely CG V-Wing airplane, digital explosions, and spherical roving cameras capturing the action.
“Most of the time they’re hanging around like vultures,” Wright says, “and in some sequences there are three of them buzzing around. And in those cases, we had to constantly work out the choreography of where they were.”
The roving cameras provide live coverage for the audience watching the show on TV. But they also presented a visual effects challenge whenever a rover-eye view was simultaneously displayed on an in-world monitor. To maintain continuity, artists had to make sure that the angle seen in the rover’s video feed properly matched its constantly changing position.
“Steve Hardy deserves a medal,” notes Dave Zaretti, “for not only looking after the big exterior shots of the V-Wing, but also the hundreds of shots inside of it – keeping track of which rover cams should be seen where, not only in the main plates but also in the TV inserts.”
All of it adds up to a film jam-packed with action and more than 1,700 visual effects shots.
“The effects work is huge, and subtle at the same time,” says Wright. “There’s a shot where Glen is in a New York hotel and gets into an elevator, and in the background is Times Square. But because the focus is on Glen the entire time, this amazing futuristic Times Square vista, with all of the screens, is completely out of focus. It’s a show where I feel there’s an enormous amount of work in the background, but out of focus. I think it’s really cool.”
A short sequence depicting Richards saving the life of a fellow oil-rig worker is only four shots, but is described by network executive Dan Killian (Josh Brolin) as “the most thrilling 10 seconds of video I’ve seen all year.” “We were very much beholden to deliver the most exciting 10 seconds of footage,” Whitehurst quips. “No half measures.”
The oil rig, crashing waves, lightning, and rain were fully digital elements. The actors were shot against a green screen. “We had two very dry actors dangling from a string,” adds Zaretti. “So we had to try and integrate them into the scene. But I think those shots worked really well.”
Work on The Running Man was hubbed out of ILM’s London studio, with further contributions from ILM artists based in Mumbai. Rodeo FX and Untold Studios completed additional shots. The key to a great end result, Wright believes, is all departments working together in sync.
“There’s incredible work by Andrew and ILM in the movie,” Wright says. “But it’s always in conjunction with something else – whether it’s the camera, an amazing location, what production design has done, what physical effects are doing. And the thing I’m really proud of in the movie is that all of this is people working together out of mutual respect.
“There are very few entirely green screen shots,” continues Wright. “And I think what people misunderstand about great visual effects is, they say, ‘It’s all CG.’ But of course, the best work is where it’s actually a collaboration.”
Whitehurst and Zaretti believe Wright’s style and approach to directing help bring the best ideas to life. “There was creative wiggle room,” Zaretti says. “And that’s nice, because you don’t always have that creative breathing space. So enjoy it and let the artists shine.”
“Absolutely,” concurs Whitehurst. “Edgar is definitely somebody who is very open to being shown something he was not expecting. It’s great seeing his enthusiasm when we show him stuff for the first time, and seeing him relax and go, ‘Oh, it’s going to be okay.’”
Wright says he’s most impressed by the world-building in the film, full of details that may only appear for a few seconds but make a lasting impression on the audience. “I wonder whether we set a dangerous precedent for ourselves by actually delivering in under a year,” the director laughs. “I’m really, really proud of the work, and I think some of the shots are just exceptionally beautiful and rich and detailed. What I also like about it is, it doesn’t feel like a lot of the effects are grandstanding.”
At the end of the day, Whitehurst says he is continually impressed by the innovative spirit of the ILM team that brought The Running Man over the finish line.
“ILM is a very refreshing place to work because there is so much experience, but it’s always in the service of making beautiful pictures that help tell the story,” he says. “I’m agnostic about what technology we use. I just want to use the right pencil for the job. But ILM has all of the pencils, and more importantly, the people who know how to use all of those pencils.”
Clayton Sandell is a Star Wars author and enthusiast, Celebration stage host, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell), Bluesky (@claytonsandell.com), or X (@Clayton_Sandell).
One of the biggest days in American sports is equally renowned for its iconic commercial spots.
Artists from Industrial Light & Magic contributed visual effects to two original commercials airing during the broadcast of Super Bowl LX, the celebrated championship game of America’s National Football League.
Lucasfilm’s newest feature film from the Star Wars galaxy, The Mandalorian and Grogu, is set to premiere on May 22, 2026, and the original spot directed by Jon Favreau, “A New Journey Begins,” provided audiences with a touching moment between the production’s namesakes. ILM’s contributions include bringing the icy world of Hoth, along with a group of tauntaun creatures, to the screen alongside the beloved characters.
For another spot, ILM returned to one of its most iconic visual effects achievements with Steven Spielberg’s Jurassic Park (1993). For the new commercial directed by Taika Waititi in partnership with Xfinity, ILM created a Tyrannosaurus rex, Dilophosaurus, and a herd of Gallimimus, all inspired by the company’s work on the classic film.