
The ILM visual effects supervisor discusses the studio’s contributions to the blockbuster film that brought Marvel’s First Family into the Marvel Cinematic Universe.

By Jay Stobie

(Credit: ILM & Marvel).

Marvel Studios’ The Fantastic Four: First Steps (2025) transports audiences to the Marvel Cinematic Universe’s Earth-828, where Reed Richards (Pedro Pascal), Sue Storm (Vanessa Kirby), Johnny Storm (Joseph Quinn), and Ben Grimm (Ebon Moss-Bachrach) must prevent Galactus (Ralph Ineson) and his herald Shalla-Bal (Julia Garner) from destroying their entire planet. Directed by Matt Shakman, whose acclaimed credits include helming episodes of the long-running comedy series It’s Always Sunny in Philadelphia (2005-Present) and the mystical Disney+ hit WandaVision (2021), The Fantastic Four leans into a retro-futuristic aesthetic that blends 1960s-inspired designs with out-of-this-world technologies.

With this innovative endeavor in mind, the filmmakers called upon Industrial Light & Magic and its accompanying half-century of visual effects expertise to help execute Shakman’s vision, with a particular focus on The Thing, Galactus, the climactic third act battle in New York City, and more. Daniele Bigi (Ready Player One [2018], Star Wars: The Rise of Skywalker [2019], Eternals [2021]), who served as the ILM visual effects supervisor on The Fantastic Four, sat down with ILM.com to discuss the company’s numerous contributions to the project, from devising a fresh approach for portraying The Thing’s rocky features to constructing Earth-828’s distinctive New York City skyline.

An ILM Overview

As the ILM visual effects supervisor on The Fantastic Four, Bigi spearheaded ILM’s involvement on the project from the company’s London studio, working closely with invaluable colleagues like ILM animation supervisor Kiel Figgins and ILM senior visual effects producer Claudia Lecaros. “In this case, ILM didn’t split the work between multiple ILM facilities, so my team ended up keeping all the asset and shot work in London. We were assigned the major task of handling the third act of the movie, which centered on the final battle between the Fantastic Four and Galactus,” Bigi tells ILM.com. “Although it’s divided into multiple sequences, the third act is a continuous narrative from Galactus’s arrival on Earth through the end of the film. It was a fascinating and important piece of work to deal with.”

ILM’s assignment included devising an innovative look for Ben Grimm’s iconic alter ego, The Thing. “We did all of the initial development with [production visual effects supervisor] Scott Stokdyk and [visual effects producer] Lisa Marra from Marvel, in collaboration with [head of visual development] Ryan Meinerding. Ryan provided us with the concept for The Thing, which is what we based our work on,” Bigi relays. As the leading vendor for The Thing, ILM developed the entire character and then distributed the asset to the film’s other visual effects vendors for their own sequences.

(Credit: ILM & Marvel).

“After the initial development of The Thing, we were assigned another prominent character to build. Since ILM had several shots in which Mister Fantastic stretched his body and used his ability in an extreme way during the final battle, ILM ended up leading the look development of Reed Richards, too,” Bigi explains. In January 2025, ILM’s success with these character creations prompted Matt Shakman to task Bigi’s team with crafting the Fantastic Four’s immense nemesis, Galactus.

“Another big component to ILM’s work was the development of New York City, which was an imaginary version of the city based on Marvel concept art,” Bigi continues. “Roughly 90% of the New York City shots were done in computer graphics by ILM. It’s a 1960s futuristic New York, and while certain aspects appear exactly like our New York, there are many buildings and stylistic elements that reflect both 1960s and futuristic designs. A large section of the city, including Times Square, was ingested from Sony Pictures Imageworks, with whom ILM collaborated closely to combine different city blocks into a unified layout with a matching style, color palette, and overall look.” Most of the city set-up was handled by environment supervisor Stacie Hawdon and CG supervisor Tobias Keip at ILM’s London studio. In total, Bigi estimates that ILM contributed between 350 and 380 shots to The Fantastic Four.

Thinking the Thing Through

“At ILM, we aimed to deliver on Matt Shakman’s vision by dramatically changing what had been done with The Thing in the past. We sought to create the most believable, realistic performance that would respect Jack Kirby’s original design, from the size of the rocks to the very specific rock formation of The Thing’s brow,” Bigi shares. Animating facial expressions for a character whose face is composed of rock proved to be a considerable challenge. “We explored different options, but I always wanted to keep the rocks as rigid as possible. If we started to squash and stretch them, The Thing would resemble what was done in the past with plastic material and foam prosthetics.”

(Credit: ILM & Marvel).

Leaning into The Thing’s bouldery frame, Bigi’s team created small, undefined gaps between the rocks. “Depending on the expression, we could move the rocks in these minuscule spaces. Additionally, we allowed the rocks to gently stretch in areas that were invisible to the camera, giving us larger gaps that let us keep the rest of the rocks completely rigid.” ILM employed another sophisticated technique for The Thing’s face and body, running an effects simulation on the rocks rather than dealing with geometric skinning. Bigi praises FX and creature technical director Maybrit Bulla, who used Houdini to create a custom setup to control the collision between the rocks. “We used our blend shape technology to move the underlying surface, but there are rocks on top of it that are actually colliding. They push each other and land in a natural position. In some shots, we had to guide the simulation in an artistic manner to avoid having rocks go into unwanted territory and seem weird or strange. The process is something new that we developed for this movie.”
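The general idea can be sketched in a few lines of Python. The snippet below is a hypothetical, simplified illustration using NumPy, not ILM’s Houdini setup: an underlying surface is driven by blend shapes, while rigid rock pieces riding on it are pushed apart by a simple collision-relaxation pass, so they settle into natural positions instead of squashing or stretching.

```python
# Illustrative sketch only (not ILM's production setup): blend shapes deform an
# underlying surface, while rigid "rock" pieces resting on it resolve overlaps via
# a simple iterative relaxation, so rocks translate rather than deform.
import numpy as np

def blend_shape(neutral, targets, weights):
    """Interpolate the underlying surface: neutral + sum(w_i * (target_i - neutral))."""
    return neutral + sum(w * (t - neutral) for w, t in zip(weights, targets))

def relax_rocks(centers, radii, iterations=10):
    """Push overlapping rigid rocks apart until they settle into non-colliding positions."""
    centers = centers.copy()
    for _ in range(iterations):
        for i in range(len(centers)):
            for j in range(i + 1, len(centers)):
                delta = centers[j] - centers[i]
                dist = np.linalg.norm(delta)
                overlap = radii[i] + radii[j] - dist
                if overlap > 0 and dist > 1e-6:
                    push = 0.5 * overlap * delta / dist
                    centers[i] -= push   # each rock moves rigidly; no stretching
                    centers[j] += push
    return centers

# Toy example: three rock centers driven by a blend-shaped surface, then relaxed.
neutral = np.zeros((3, 3))
smile = np.array([[0.1, 0.0, 0.0], [0.0, 0.05, 0.0], [-0.1, 0.0, 0.0]])
surface = blend_shape(neutral, [smile], [0.8])
rocks = relax_rocks(surface, radii=np.array([0.06, 0.06, 0.06]))
print(rocks)
```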

Ebon Moss-Bachrach as The Thing (Credit: ILM & Marvel).

When it came to actor Ebon Moss-Bachrach’s performance capture for The Thing, ILM referenced the work-in-progress geometry data from Digital Domain (another effects vendor on the film). “The data was useful for the initial stages and the blocking animation, but when we started to go into the minutiae with Scott Stokdyk and Matt Shakman, we ultimately worked on our own system and reanimated the character for our final animation,” Bigi details, crediting CG supervisor Marco Carboni for developing a workflow to quickly ingest data from Digital Domain and transfer it to ILM’s proprietary facial rig.

Rules for Reed Richards

Alongside Shakman, ILM outlined clear guidelines for Reed Richards’s capabilities as Mister Fantastic. “Matt was keen to avoid creating what we called a ‘noodles’ or ‘spaghetti’ feeling. How we controlled the stretch was unique and based on Matt’s vision,” Bigi recalls. “Instead of developing the character for months and then realizing that it didn’t behave in the right way, I proposed exploring various 3D action poses with extreme body stretch from several angles. Matt was incredibly receptive to the notion of rendering these static frames before having a functional rig or muscle simulation for the animator to use.”

Setting rules for Mister Fantastic became essential to ILM’s process. “What can Reed do? Do we want to stretch the neck, or don’t we? We decided not to, so there’s not a single shot where you see the neck stretching a lot,” Bigi notes. “We established a rule that only Reed’s limbs would stretch, meaning his upper torso and shoulders would remain the same width as the actor’s. Another rule dealt with his bone structure. While stretching, his elbows and knees would be more defined, the idea being that the skin was getting thin and wrapping around the bone. This was all discussed with Matt and Scott and developed in the initial stage where we did our 3D maquette action poses.”

(Credit: ILM & Marvel).

Bigi took inspiration directly from Marvel’s comic books, as well. “Many comic book artists before us, in particular Alex Ross, maintained a very strong V-shape when portraying Reed’s upper body. So, in the ILM shots where Reed is stretching, we kept the lat muscles on his body fairly large, like an athlete or swimmer,” Bigi declares. “We also decided Reed would snap his limbs back to a natural pose relatively quickly. The thought was that it wasn’t easy for Reed to stretch, so he would only do so on important occasions. He doesn’t do it for fun, at least in this movie.”

While Reed’s arms and legs stretch extensively, Bigi points to another key decision ILM made when generating the look and feel of Mister Fantastic. “The stretch of his fingers is minimal, and the gloves you see are usually the normal size as established by the practical costume designer. The concept being that, unlike the fabric close to his body, the actual fabric of the gloves didn’t need to stretch at all.”

Seeing Sue Storm

As was the case with The Thing, ILM pursued a unique path to conveying Sue Storm’s abilities in the final battle. “Rather than relying on particle simulation, all of ILM’s Sue effects were based on optical elements,” Bigi reflects. “The Sue effects were meant to be analog, in a way. There are no effects simulations of any kind. Most of those shots were crafted by our compositing team, so it’s a 2D-based approach using references of how lenses naturally create refraction and color variation. You see that we enhanced and exaggerated the prismatic fringes that occur with specific types of lenses.

(Credit: ILM & Marvel).

“Although this route was simple in a technological sense, it was nevertheless quite effective visually, and blended well with the atmosphere of the movie,” Bigi concludes. “Going with the latest, state-of-the-art technology is not always the answer. In this case, it was the opposite. We wanted it to feel simple and analog, so we stayed with the real optical effects. It’s all about what the director wants and the feeling you wish to convey.”
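As a rough, hypothetical illustration of that lens-based approach (a NumPy sketch, not ILM’s compositing pipeline), prismatic fringing can be exaggerated in 2D by resampling the red and blue channels at slightly different scales around the image center, mimicking how a lens refracts different wavelengths by different amounts.

```python
# Illustrative sketch only (assumes NumPy): exaggerate prismatic fringing by scaling
# the red and blue channels slightly differently about the image center.
import numpy as np

def prismatic_fringe(image, red_scale=1.004, blue_scale=0.996):
    """Radially rescale R and B channels around the center to create color fringing."""
    h, w, _ = image.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0

    def resample(channel, scale):
        # Nearest-neighbor sample of the channel at coordinates scaled about the center.
        sy = np.clip((ys - cy) / scale + cy, 0, h - 1).astype(int)
        sx = np.clip((xs - cx) / scale + cx, 0, w - 1).astype(int)
        return channel[sy, sx]

    out = image.copy()
    out[..., 0] = resample(image[..., 0], red_scale)   # red spreads outward
    out[..., 2] = resample(image[..., 2], blue_scale)  # blue pulls inward
    return out

# Toy usage on a synthetic gradient plate.
plate = np.tile(np.linspace(0.0, 1.0, 256), (256, 1))[..., None].repeat(3, axis=2)
fringed = prismatic_fringe(plate)
```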

Grappling with Galactus

Unlike the challenges that ILM tackled with The Thing’s rocky features, the surface of Galactus’s face resembled the actor to a much greater extent. “We were able to use Ralph Ineson’s performance through a normal blend shape technique for Galactus’s face. Matt wanted to infuse Galactus with a god-like aspect, so he had us downplay the realistic human aspect and micromovements of the actor’s face. We reduced the range of motion and kept the face a bit firmer,” Bigi states. “For the body, we received a scan of the beautifully constructed costume, but at the end of the day, ILM replaced it with CG in all of our shots because of its need to appear metallic.”

(Credit: ILM & Marvel).

Representing Galactus’s true scale also came into play. “We determined a specific height for Galactus, so the camera had to conform to that size. There are several shots with plate photography, but the majority was done digitally, especially due to the interaction between Galactus and the city,” Bigi reports. “Galactus’s body had to be covered with thousands of tiny lights, which couldn’t be done realistically with prosthetics, and he’s so large that the amount of detail necessary to set the scale was tremendous. We scattered literally millions of tiny pipes, greeblies, and geometric objects to increase the sense of scale. At a distance, our Galactus was the same as the costume, yet it was much more elaborate in the extreme close-ups.”

(Credit: ILM & Marvel).

ILM held conversations with Matt Shakman and Scott Stokdyk about the bridge devices that serve as a centerpiece for the climactic conflict with Galactus. “We developed an effect that we called ‘bridge effects,’” Bigi notes. “The bridge is an amazing device that – spoiler alert – Reed conceived to transport Galactus to another location in space. Because of the 1960s style of the movie, we avoided a digital quality for the portal. We found references and simulated optical effects rather than calling upon inspiration from the digital world. It was a real brainstorm with Matt and Scott. All sorts of ideas, such as having Galactus’s body stream with particles inside the bridge effects, came up in our conversations with Matt.”

A “New” New York

In preparation for depicting Earth-828’s New York City, Bigi traveled to New York for a 10-day shoot with The Fantastic Four’s second unit. “It was an amazing experience,” Bigi beams. “Based on the previs, there were certain shots we knew would be CG, but we tried to film as much as possible. Before going to New York, I used a combination of Google Earth and other digital resources to virtually scout Manhattan and propose methods to capture it from specific locations in a thorough fashion. I spent days capturing 360 HDRI panoramic views, mostly along 42nd Street, that construct a library of texture and material references. At the same time, a small team from Clear Angle Studios scanned the entire road using a LiDAR [Light Detection and Ranging] scan.”
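For readers curious about the building block behind those panoramas, the sketch below shows, in generic terms, how bracketed exposures can be merged into a single HDR image with OpenCV. It is a hypothetical illustration of the HDRI-capture workflow, not ILM’s actual tooling, and the file names and shutter times are placeholders.

```python
# Generic illustration (assumes OpenCV and NumPy; not ILM's capture pipeline):
# merge bracketed exposures of the same view into one high-dynamic-range image.
import cv2
import numpy as np

def merge_brackets(paths, shutter_times):
    """Merge a list of bracketed exposures (file paths) into one HDR image."""
    images = [cv2.imread(p) for p in paths]            # 8-bit exposures of the same view
    times = np.array(shutter_times, dtype=np.float32)  # shutter times in seconds
    calibrate = cv2.createCalibrateDebevec()           # recover the camera response curve
    response = calibrate.process(images, times)
    merge = cv2.createMergeDebevec()
    return merge.process(images, times, response)      # linear HDR result

# Hypothetical usage with three brackets (file names are placeholders):
# hdr = merge_brackets(["ev_minus2.jpg", "ev_0.jpg", "ev_plus2.jpg"], [1/250, 1/60, 1/15])
# cv2.imwrite("panorama_tile.hdr", hdr)
```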

The work continued upon Bigi’s return to London. “Initially, we took the images of New York and removed all the buildings constructed after the 1960s. It was essentially a filter that permitted us to show this version of the city to Matt and Scott,” Bigi remembers. “Then, in collaboration with [production designer] Kasra Farahani and Scott, we drew inspiration from futuristic-looking buildings elsewhere in America, such as Chicago. We selected preexisting real-world buildings that had rounded shapes and concrete bases. Another selection was done by concept artists at Marvel who had come up with original designs.

(Credit: ILM & Marvel).

“My team at ILM modeled those buildings, and we set their number and location along the street. We built several layouts and versions, gradually shaping the features of the street. That aesthetic relied on the props, as well,” Bigi asserts. “The cars and billboards resemble those from the 1960s, and we scattered spherical water tanks around the city. The phone booths aren’t based on their 1960s counterparts, as they were designed specifically for the movie. From the skyscrapers down to minute details like the color of the phone booths, everything is either a combination of real 1960s references or the artistically driven futuristic elements that are now synonymous with the film.”

The time and talent that ILM invested in The Fantastic Four have paid off for both the artists involved in the project and audiences around the globe. Upon seeing the final cut, Bigi gravitated towards one of ILM’s shots when ranking his top stand-out moments from the project, declaring, “There are several moments that I love, but for me, Galactus emerging from the water and entering Battery Park from the river is my favorite. The water simulation and the composition combine to create a wonderful shot to begin that sequence.” Applauding the work of compositing supervisor Juan Espigares Enríquez and his compositing team, Bigi concludes, “I think it’s one of The Fantastic Four’s most exciting and spectacular moments.”

(Credit: ILM & Marvel).

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

Celebrating Ten Years of Immersive Entertainment at ILM

By Amy Richau

“ILM Evolutions” is an ILM.com exclusive series exploring a range of visual effects disciplines and highlights from Industrial Light & Magic’s 50 years of innovative storytelling.

In immersive stories, the fan is the hero.

Industrial Light & Magic (ILM) has always been at the forefront of innovation, drawing audiences into new worlds by pushing technological boundaries. As part of our special series, “ILM Evolutions,” ILM.com talked with Vicki Dobbs Beck (vice president, immersive content innovation), Julie Peng (director of production), Tim Alexander (visual design director), Ben Snow (senior visual effects supervisor), and Shereif Fattouh (executive producer) about the past, present, and future of ILM’s immersive storytelling.

ILM’s LiveCGX team, including visual effects supervisor Mohen Leo (bottom).

A New Way to Tell Stories

“Let’s invite our fans to step inside our stories in ways that had never before been possible.”

 – Vicki Dobbs Beck

While ILMxLAB was formally established in 2015 to explore the possibilities of immersive storytelling, the seeds of this endeavor actually began much earlier for Vicki Dobbs Beck. In the 1990s, Beck worked at Lucasfilm Learning, where a certain prototype caught her attention. “It was called Paul Parkranger and the Mystery of the Disappearing Ducks,” Beck tells ILM.com. “And what was really cool about all of the projects we were doing at that time is they really did sit at the intersection of storytelling, interactivity, and high-fidelity media – such that it was back then – all through an educational lens.”

Fast-forward to Beck’s time at ILM as head of strategic planning, when she started bringing together talent from ILM and LucasArts (now Lucasfilm Games). “It was kind of this little rebel unit that was doing some pioneering R&D [research and development] in high-fidelity, real-time graphics,” says Beck. “Their success gave us confidence that the foundation was in place to build an immersive storytelling studio, expanding on the R&D work done by teams like the Lucasfilm Advanced Development Group (ADG).”

In 2015, ILM and Lucasfilm announced the formation of ILM’s Experience Lab (ILMxLAB) – a new division that would combine the talents of Lucasfilm, ILM, and Skywalker Sound. Lynwen Brennan, then Lucasfilm executive vice president and ILM president, announced that the combination of ILM, Skywalker Sound, and Lucasfilm’s story group was unique and that the creative collaboration would lead to captivating immersive experiences in the Star Wars universe and beyond, adding, “ILMxLAB brings an incredible group of creatives and technologists together to push the boundaries and explore new ways to tell stories. We have a long history of collaborating with the most visionary filmmakers and storytellers, and we look forward to continuing these partnerships in this exciting space.”

The Holocinema team.

From its inception, ILMxLAB’s mission of telling immersive stories on emerging technology platforms made it an appealing destination. Julie Peng, who worked in Lucasfilm Animation as a production manager for projects like Star Wars: The Clone Wars (2008-13) and Strange Magic (2015), was looking to break into the emerging interactive storytelling space when she received a call about a new ILM division that would focus on technology like augmented and virtual reality. “When we started, we were five people,” remembers Peng. “I did my best to take care of any need that arose, big and small, from developing the production infrastructure to writing job descriptions, to ordering pizza and running to the store for batteries. It was about doing whatever was needed to build a team and start exploring what we could bring to the immersive entertainment space.”

In 2016, the studio debuted its first VR experience, Star Wars: Trials on Tatooine, where the Millennium Falcon lands in front of players, and they help R2-D2 and Han Solo with repairs. This was an important step in the studio’s goal of creating a living world. “Everybody was just so blown away by the scale,” says Beck, “because that’s something that VR is so good at – delivering scope and scale.”

Looking to the future was also always a part of the plan. “Because we were so early in the whole immersive storytelling space, we really wanted to help drive the industry,” says Beck. “So we actually very consciously shared our prototypes in public. We spoke about them. We made them available to people because we wanted to actively inspire others to create in this space alongside us.”

Over time, the team evolved into a mix of creatives from the film industry and people with backgrounds in games and interactive development. While bringing in developers with both backgrounds was essential, it also brought challenges for Peng in her role as production manager. “Early on, I realized that they spoke two different dialects,” says Peng. “They used similar terminology, but their approaches to making a creative product were quite different in terms of process and priorities. I found myself becoming a bridge, translating concepts and driving the development of a common language so we could all communicate effectively.”

The team also had to be comfortable with fluidity as the technology they were working with was constantly evolving. Peng noted that staying abreast of what was going on in the industry was key, as was the leadership team’s willingness to take some risks. “I always call it ‘holding hands and jumping off the cliff together’.”

The First Big Leaps

“It [VR] really is like stepping into a different world, and it feels totally natural once you’re there.”

– Julie Peng

Visual effects supervisor Tim Alexander became involved with ILMxLAB after a history in traditional visual effects, including the 2015 blockbuster Jurassic World. He was also a lifelong gamer intrigued by the work ADG was pioneering at the time: bringing real-time, game engine-type techniques into visual effects. When director Alejandro G. Iñárritu approached ILM about a collaboration on a virtual reality project, Alexander came aboard as visual effects supervisor. The result was CARNE y ARENA, which debuted in 2017.

Still early in ILMxLAB’s history, CARNE was an ambitious project involving a short VR piece bookended by physical experiential rooms that put the audience into a story of immigrants being detained while crossing the border from Mexico to the United States. At the beginning of the experience, participants are brought into a physical holding cell (kept deliberately cold) where they have to remove their shoes and items like backpacks. “There are ambient noises and real artifacts like abandoned shoes that have been found in the desert, from people crossing,” notes Alexander.

The key art for CARNE y ARENA.

Participants are fitted with a VR headset and led barefoot into a 50-foot-by-50-foot room full of sand. In the VR portion of the experience, they assume the role of a group of immigrants attempting to cross the U.S.-Mexico border at night when they are stopped by U.S. Border Patrol agents. After the VR story, participants exit and are led down a hallway where video monitors play interviews of the real people CARNE is based on. “He [Iñárritu] cast people that had crossed the border as the people within this experience and wove a story around that, so you actually see the real people and hear their experiences,” says Alexander.

CARNE was a challenge from both an artistic and an engineering standpoint. What Iñárritu and Alexander wanted to do was sometimes hindered by the technology of the time. Wanting the images to appear as photoreal as possible, the team realized the immersive film’s computing requirements outweighed what was possible in headsets at the time, so Lutz Latta, ADG graphics engineer, designed a supercomputer with four high-end GPUs (graphics processing units) to handle work such as calculating shadows in the film.

Other challenges included allowing participants to traverse and turn around in a 50-foot-by-50-foot room. “At the time, there was no way to really run a VR headset over more than 100 feet. You were lucky if you could get five feet away because of the HDMI cables and all kinds of things,” remembers Alexander. VR tracking abilities at the time were also far below where ILM’s engineers wanted them to be. “So then we started mixing in stuff that we know from visual effects of how to track cameras in large spaces. A motion capture stage was built to track the headset instead of what we would usually track the camera in. So it started becoming a mixture of different things that we knew how to do for different reasons, and kind of applying it to this situation.”

A final frame from CARNE y ARENA.

The newness of the technology and the goals the team wanted to achieve with CARNE meant everyone had to adapt and be ready for anything. “It was the first project in my career that I was actually concerned that I would not be able to deliver,” says Peng. “Because in all of my past projects, I would have a production plan, A, B and C, D and E in my back pocket. We were working with new technology and making something that had never been created before. There was no model of how to do that, which made me feel like I was operating without a parachute. That can be very nerve-wracking but also exhilarating when you actually finish the project. That sense of completion and accomplishment was huge.”

Audience reactions to the very visceral experience were all over the map. “We had people that really wanted to get into the middle of it, and they would look at every character and perhaps even jump behind a virtual bush to hide, while others might hang back to observe the scene, whether due to fear or other emotions that came up,” says Alexander. “The overall sense that I got was that people really understood what Alejandro was trying to say. They heard it, and they understood what he was trying to express through that story.”

The studio’s debut with The VOID, Star Wars: Secrets of the Empire, also took place in 2017. At The VOID, up to four fans would suit up with their gear: a VR headset connected to a backpack laptop and haptic vest. From there, teams of fans were immersed in ILM’s digital world; in this case, Secrets connected to Rogue One: A Star Wars Story (2016), giving fans an adventure of a lifetime on Mustafar near Vader’s castle. While infiltrating an Imperial base, they would traverse the facility together and try to recover a key artifact.

Dropping Into the Story

“If we’re creating an experience, we want people to feel like they’re genuinely in a Star Wars project.”

– Ben Snow

Vader Immortal: A Star Wars VR Series

While working on Secrets of the Empire, Ben Snow (visual effects supervisor for Star Wars: Attack of the Clones [2002] and Iron Man [2008]) was recruited to work on a connected story that was in development for home use, Vader Immortal. In the project’s early days, a prototype was put together to see what it was like to be in the same (virtual) space as Darth Vader – spoiler: it’s terrifying. At the same time, Oculus was quietly working on the first Quest headset, revolutionary with its tetherless execution, which ultimately influenced the amount of lightsaber play in the story. The stars and companies aligned, and Oculus Quest became the platform partner for release.

In Vader Immortal, fans take the role of an unnamed pilot who finds themself inside Darth Vader’s castle on Mustafar. The fan’s interactions with Vader were, of course, key to the success of the project. “Mustafar should be scary,” notes Snow. “The confrontation of meeting Vader should be scary. Because that’s what he is.” The Immortal team used scans of Vader’s costume from Rogue One (itself based on the original from 1977’s Star Wars: A New Hope) and built on them with new scans to push the realism even further.

Concept art from Vader Immortal: A Star Wars VR Series by Russell Story.

The team worked internally with Lucasfilm to develop story ideas for three episodes. David S. Goyer, screenwriter of The Dark Knight Rises (2012), wrote scripts around them, injecting his own characters. These were similar to traditional film scripts, which then had to be made more interactive by adding lines of dialogue prompting the fan to perform certain tasks. The production brought together the film and games worlds as they put it all together. “In film visual effects, you get a script, you break it down. These are the assets you have to build,” explains Snow. “Interactive entertainment is much more free form and evolutionary. It was an interesting blend between those two mediums.”

The goal with Immortal was always the same: create an experience unique to virtual reality that you couldn’t experience by watching a movie. “One of the things that excited us,” says Snow, “was this was a chance to eavesdrop on Vader a little bit. We had the moment where Vader takes off his helmet, and he’s looking at a memory, almost, of Padmé. You’ve been climbing around, find yourself in Vader’s chamber, and you’re peering through these walls at him. We felt that moment of actually being an interloper, and seeing a side of the character you hadn’t seen before was something that was unique to what we could do in VR.”

Another element distinct to Immortal was making Vader the fan’s own teacher during the experience. The Sith Lord’s introduction is fittingly terrifying for such an iconic character. Initially, when Vader stepped up to the fan, he had a few lines of dialogue. But those lines were eventually cut after some internal tests of the experience. “Vader’s in the distance, and he comes toward you, and you hear the heavy breathing and footfalls,” says Beck, “and he keeps walking toward you, and it becomes more and more intimidating. Almost no one heard the dialogue because you’re so overwhelmed by his presence that it’s all that you can absorb.” Adding to the power of that moment was the eye-tracking in the experience, so no matter the height of the fan, Vader was looking right at you. “And the fact that you’re being acknowledged by a character like Vader is just mind-blowing,” adds Beck.

Actor Maya Rudolph (left) performs the voice of ZOE-3 in Vader Immortal as director Ben Snow (middle) and writer and executive producer David S. Goyer look on.

What If…? – An Immersive Story

Shereif Fattouh came to ILM from an AAA games (high-budget, high-profile games from large studios) background, working on titles like Battlefield and Dead Space at Electronic Arts. Interested in story-driven projects, Fattouh worked on The VOID projects Ralph Breaks VR (2018) and Avengers: Damage Control (2019). The development of a new headset, the Apple Vision Pro, led to Fattouh’s involvement in What If…? – An Immersive Story (2024), an experience that uses both mixed reality and virtual reality in addition to hand and eye-tracking through Apple’s innovative headset technology.

Marvel Studios’ What If…? series gave the developers a great amount of freedom in one of the most popular story worlds on the planet – the Marvel Cinematic Universe. “What If…? is such a great vehicle from the comic books and then to the animated show, where you get to just play in a sandbox,” says Fattouh. “What if this happened, and it’s a completely different version of it, and that kind of creative freedom just allowed us to tell the story that we wanted.”

Similar to previous ILM projects, the What If…? team was working on a project without the tech they would need to bring the experience to fans, as the Apple Vision Pro was being created in parallel. “We started development really early on,” says Fattouh. “It was a great collaboration with Marvel Studios, Disney+, and Apple, but we were definitely doing early, early testing and kind of figuring it out as we went.”

A final frame from What If…? – An Immersive Story.

What If…? – An Immersive Story took about 18 months from the conception of this particular idea as an experience to arriving in fans’ hands. Getting there involved finding the balance between the fans watching the story unfold and directly engaging with the characters and environments. “It’s really subjective,” says Fattouh. “There’s no right answer. How much do we want the audience to really observe this amazing story that’s being told and being kind of talked at versus going in and doing things and impacting the narrative? So that was really one of the biggest challenges throughout the whole life cycle. Playtesting it and figuring out, ‘Okay, is it feeling right? Is this beat too long? Is it too short? Do we want to have people jump in and get into the action a little bit faster?’”

During What If…? – An Immersive Story, the Watcher enters the room where a fan is situated. Throughout the story, fans see and interact with versions of some of their favorite Marvel heroes and villains, including Wong, Thanos, Hela, and Wanda. Fans are active participants in the story and get to use iconic items from the Marvel universe, like the Time Stone, to move the story forward.

Fattouh also notes how What If…? gives fans a unique way to experience a familiar Marvel moment near the beginning of the experience. “You don’t really know what’s going on because it starts with a disembodied voice, and you’re in space,” says Fattouh, “and then we kind of kick off in a very Marvel way, where it has that iconic Marvel logo flip book entry. But we did a very 3D spatialized version, where it’s coming into your living room. Just getting to see the smile on people’s faces when they saw something they’ve seen a lot in the films, but to see it really coming out in your living room … it set the right tone of, ‘Oh, this is something different.’”

Marvel director Dave Bushore (center) confers with Immersive crew members during production of What If…?, including Maya Ramsey, Patrick Conran, Marissa Martinez-Hoadley, Indira Guerrieri, and Joe Ching.

What the Future Holds

After ten years, the team remains small, retaining its nimbleness on a quest for innovative excellence. Working with multiple partner studios and collaborators, the immersive team staggers projects, typically with two in production at a time and a production timeline of 12 to 24 months. “I think over time, our goal will be to expand that capacity and capability,” says Beck. “It might mean expanding it in other studio locations – maybe in London or in Vancouver. The size of the team we have is really nice because everybody knows each other. We can iterate together, and that’s a really important part of interactive, immersive experience development.”

The immersive team has high hopes looking to the future as the technology reaches a wider group of people. “Venues like Star Wars Celebration are always amazing,” says Peng, “because the technology is still growing, and it gives us a chance to share our stories directly with fans. It’s also rewarding to see the accessibility of our experiences making it feel entirely organic and inclusive for everyone.”

Beck looks forward to hands-free AR glasses that can deliver a high-fidelity image with a wide field of view. “We are very excited about this idea of storyliving at city or world scale,” says Beck. “Geo-located content where you could be out in the world in your glasses and little story moments would unfold in the real world.” Beck also sees more people who don’t consider themselves gamers gravitate towards immersive stories. “And I think that’s really great for us because we’re interested in that intersection of story and interactivity and putting you at the center of that experience.”

ILM’s team on What If…? won an Emmy for Outstanding Innovation In Emerging Media Programming. From left: Elizabeth Walker, Ian Bowie, Lutz Latta, Marvel’s Dave Bushore, Vicki Dobbs Beck, Mark Miller, My-Linh Le, Julie Peng, Pat Conran.

Looking ahead, the future of immersive stories is limited only by the imaginations of writers, designers, and engineers devoted to bringing these experiences to audiences. “I think that there’s a huge opportunity for ILM in immersive entertainment broadly defined,” notes Beck. “When we first started, the word ‘immersive’ almost always meant virtual reality, then it included augmented reality, and eventually mixed reality. But now, it’s being used to include linear content or pre-rendered content, but that’s very immersive through screen technology, like ABBA Voyage (2022), as an example. The opportunity is to take our talents across the global studios, which include the highest quality visuals and sound, and couple that with the real-time understanding and capability, bringing those things together. I think that we’re going to start to see an increasing desire for interaction, where you are actually in an experience, doing something meaningful that makes the overall experience even more personal. And beginning to understand what that is and taking steps toward a storyliving future. I think that’s the big opportunity for ILM.”

Currently in development in partnership with Meta Quest is Star Wars: Beyond Victory – A Mixed Reality Playset, which takes players into the fast-paced, high-stakes life of a podracer.

Read more “ILM Evolutions” stories here on ILM.com.


Amy Richau is a freelance writer and editor with a background in film preservation. She’s the author of several pop culture reference books, including Star Wars Timelines, LEGO Marvel Visual Dictionary, and Star Wars: The Phantom Menace: A Visual Archive. She is also the founder of the 365 Star Wars Women Project, which includes over 90 interviews with women who have worked on Star Wars productions. Find her on Bluesky or Instagram.

The visual effects supervisor from ILM’s Vancouver studio shares insights about helping create new characters and bringing the streets of New York City to life.

By Mark Newbold

(Credit: ILM & Marvel).

Proudly displaying the most famous typographical symbol since George Lucas placed an acute accent over the “e” in Padmé Amidala, Thunderbolts* arrived in cinemas on May 2, 2025, to a fanfare of critical praise, bringing together a gaggle of questionably motivated heroes, including Florence Pugh as Yelena Belova, Sebastian Stan as Bucky Barnes, David Harbour as Alexei Shostakov, Wyatt Russell as John Walker, Hannah John-Kamen as Ava Starr, Lewis Pullman as Bob Reynolds, Olga Kurylenko as Antonia Dreykov, and Julia Louis-Dreyfus as Valentina Allegra de Fontaine. Director Jake Schreier (who also helmed Star Wars: Skeleton Crew’s fifth episode) led the effort to create an adventure that thrills, engages, delights, and amuses in equal measures.

Thunderbolts* is a story that details not only the rise of a motley crew of rogues into the heroes of Manhattan but also the war Bob Reynolds fights internally as he battles to free himself from his dark alter ego, Void, with the help of his newfound friends. Industrial Light & Magic’s visual effects supervisor Chad Wiebe (Captain America: Brave New World [2025], Obi-Wan Kenobi [2022], Thor: Ragnarok [2017]) joins ILM.com to discuss the challenges of not only bringing a tentpole release to the big screen but also creating striking new effects for fans of the Marvel Cinematic Universe (MCU).

“ILM’s work started back in May 2023,” Wiebe tells ILM.com, “when development work began with Jay Cooper (visual effects supervisor on Eternals [2021] and The Creator [2023]) and a small team of artists, primarily to develop the look of Void. Then the WGA [Writers Guild of America] strike happened, and production went on hiatus for a while. ILM’s involvement picked up again in February of 2024. That’s when I got involved.”

(Credit: ILM & Marvel).

There can be any number of elements that bring an experienced supervisor onto a show. Wiebe explains how appropriate skill sets, personal interests, and timing align when taking on a new show.

“The production visual effects supervisor Jake Morrison and I have worked together a number of times and it’s always been a very collaborative experience, so I jumped at the opportunity to work together again,” Wiebe says. “On top of that, Sentry is a powerful new character in the MCU with a strong comic book legacy, and I really enjoy developing ideas for new characters.

“This was a very different film to the ones you typically see within the MCU,” Wiebe continues. “Jake Schreier’s vision was that it had to be grounded and based on physicality, not magic and energy and all the things you typically associate with superhero films. He didn’t want it to feel like any movie that we’ve seen before, so that instantly attracted me.”

As with any Marvel project set within the five New York City boroughs of Manhattan, Brooklyn, Queens, The Bronx, and Staten Island, the city is essentially a character in its own right. Whether it’s Spider-Man in Queens, Captain America in Brooklyn, or the Baxter Building in Manhattan, each location must feel authentic. In Thunderbolts*, we return to the former Avengers Tower, which ILM had to place within real-world Manhattan.

“There are two ways to look at it,” explains Wiebe. “One aspect is the kind of data acquisition you need in order to make these very tangible environments look realistic as if you’re standing there yourself, and the other is augmenting it with some very iconic structures such as the Watchtower, which needs to sit seamlessly within that environment. It’s a tricky thing to do when it’s a city like New York that people are very familiar with. When you’re building locations and areas that have a real-world counterpart, you need to do your homework. You need to make sure you get all the reference material to make sure you’re depicting it in the most accurate way because people will instantly spot things if you’re trying to cheat or fudge the facts, and New York holds a very special place in people’s hearts, so doing it justice was very important.

“The fact that they based Avengers Tower around the MetLife Building in New York was a great starting point,” Wiebe continues. “The Avengers Tower we’ve seen in previous films retains the base of the MetLife Building, but the departure that we took on Thunderbolts* was that we redesigned the base of the tower so it no longer utilized any of the MetLife Building. We use the same city block and footprint, but we replaced it from the ground up. Beyond Avengers Tower, we also had to build vast sections of the surrounding area. The key is in the details and making sure you collect enough reference material such as digital photography, aerial plates, LiDAR scans – the whole nine yards to get as much data as possible so we can build out this environment to be a true depiction of New York City.”


One of the most striking elements of Thunderbolts* is a new character in the MCU, Bob, and his dark alter ego, Void. Both thematically and visually, his soul-sucking powers are a powerful addition to the film, taking inspiration from both the comics and the film’s director, Jake Schreier. The task fell to the artists at ILM to bring these concepts to life.

“It was a unique challenge to visualize Void’s powers without leaning into anything too typical or too magical – the way Void turned people into shadowed silhouettes being a prime example. It needed to feel like a subtle but impactful event,” says Wiebe. “There were a surprising number of iterations that we went through to figure out that look. We spanned the full spectrum of ideas, going from something that felt like a single frame flash, to longer, drawn-out versions showing detailed shadows projected onto surfaces in a variety of different ways. We tried different aesthetics before we arrived at a quick but impactful effect that had a complexity to its simplicity, which also relied on the audio design to sell it as this somber but impactful moment.”

The process from concept to completion required numerous iterations and refinements.

“We started shadow dev with Jay Cooper all the way back in May of 2023, and that wasn’t too dissimilar from what we continued to do all the way up to the final months of the show,” Wiebe explains. “With a pivotal character such as Void, getting it to a 90% or 95% point of completion is the easy part, relatively speaking. It’s dialing those nuances in the last 5% or 10% that’s a very iterative and collaborative process.

“There were some key shots that went through dozens of iterations,” Wiebe continues. “How much of Lewis’s performance are we preserving? How much are we shrouding him in shadows? How much specularity do we want to retain from his costume? It’s a fine line. You want to ensure you’re staying true to the actor’s performance because it’s so well done, but you have this character that you also need to convey as a mysterious, shadowy void, so you want to add that mystery and aura surrounding him without going too far. There was a lot of back and forth to determine what that balance should be. Once you crack the code, then you’re good to start propagating that through your other shots, and then the dominoes fall a lot quicker. It’s an important part of the process that we need to go through to land on that final look.”

With Lewis Pullman’s performance at the heart of the sequence, Void required a mix of disciplines to bring the character to life.

“A lot of what you see of Void relied heavily on a 2D composite treatment, mixed with our CG asset when we needed to add specific details to certain areas, so it’s a hybrid approach,” notes Wiebe. “We utilized as much plate material of Lewis as we could. We also augmented it to get some of the details that you may not have had in the plate. If there are areas that we want to expose, say a little bit of costume detail or parts of his cheek that we want to expose a bit of lighting information on, we would utilize our digital asset to help with that. For some of the wider shots where he’s further away or doing things that you couldn’t necessarily do while filming, those would be our digital versions.”


Work on projects like Thunderbolts*, with bespoke visual effects crafted for specific characters and powers, can lead to processes that are useful in future projects, something Wiebe is grateful for.

“Every project adds new tools to your tool belt that you can take from show to show. That’s what you build on, and that’s what you can offer up as things that you’ve already tried and have experience with. I’ve done a number of Marvel films, and there’s always a carryover of techniques, setups, and lessons that you learn from doing things a certain way that you try to improve the next time.

“In regard to Sentry (before he turns into Void), we really don’t know everything that he’s capable of yet, and I don’t think he does either,” Wiebe continues, “so a big part of Thunderbolts* was him figuring out what he was able to do and learning the extent of his powers. One of the key moments in the film was when he said to Valentina that he doesn’t need to take orders from her; why would a god take orders from a human? There were a lot of conversations about Sentry’s level of confidence and his attitude when he started realizing that he’s got these incredible powers. There was some exploration about how confident he should feel. Jake didn’t want him to necessarily come across as overly confident in his powers because he’s learning them from scratch, but he also wanted to play into Bob’s character, too, where he wasn’t a very assertive person. He obviously fell on hard times before becoming Sentry, so navigating through that was a bit of a challenge for him.”

Thunderbolts* doesn’t just feature Sentry. There’s also a burgeoning team of would-be superheroes to contend with. “Obviously, here you’re putting him up against a number of characters that have their own unique powers, and there are other superheroes that he shares attributes with,” Wiebe explains. “That was a consideration in this film, making sure we don’t mimic what’s already been seen with other characters. Sentry has unlimited powers; he can do a bit of what most of the other superheroes can do, so making sure that we didn’t share too much space with other distinct effects was key.”

Creating visual effects requires intense attention to detail and the necessity of watching a scene again and again and again, being as granular as possible to get everything exactly where it needs to be. Given that, Wiebe notes that because he already knows the story, “it can make it a bit more difficult to sit back and enjoy a project that you worked on at the movie theater,” as he puts it. “But the beauty of Thunderbolts* is that everything was so seamless; I was able to let it visually play out without any moments of scrutiny or second-guessing the decisions we made. It was very rewarding to finally see it on the big screen in all its glory with the final audio in a theatre full of people who were very excited to see it. There were people cheering and applauding at the end of the screening, which was super, super rewarding.”


The film is planned, shot, edited, the visual effects completed, the sound layered on, and the music scored, but looking back on Thunderbolts*, there’s a key scene that stands out: the fight in the former Avengers Tower between Sentry and the Thunderbolts, which plays out as one continuous shot.

“We shot the Penthouse fight in three sections and spent months doing previs to map out where the cameras needed to be and determine our capacity to shoot within a confined set environment,” Wiebe explains. “When Sentry ‘Force pushes’ Red Guardian through the window and back, it’s one continuous 45-second shot up until the moment he throws both Ghost and Walker out of frame. That was months and months of pre-production followed by months and months of post-production work. It really was a labor of love between a number of different departments within ILM and everyone who was on set making it happen. In terms of things that we’re the proudest of, that oner is definitely right up there and something we’re promoting in order to help pull back the curtain and let people see all the work that went into it.”

(Credit: ILM & Marvel).

Mark Newbold has contributed to Star Wars Insider magazine since 2006, is a 4-time Star Wars Celebration stage host, avid podcaster, and the Editor-in-Chief of FanthaTracks.com. Online since 1996. You can find this Hoopy frood online @Prefect_Timing.

The visual effects supervisor talks Cassandra Nova, Gambit, and more.

(Credit: Disney)

In a surprise twist midway through Deadpool & Wolverine, our titular protagonists are marooned in the Void: a Mad Max-like wasteland of desert and forgotten heroes. Their time in this multiverse purgatory takes up a significant chunk of the movie and features many of its greatest moments, from surprise character appearances to action set pieces, and Industrial Light & Magic was charged with bringing it all to the screen. Just from reading the script, visual effects supervisor Vincent Papaix knew that this section of Deadpool & Wolverine would be key to the movie’s overall success.

“Everybody was super motivated,” he tells ILM.com, “and we all knew that this movie was going to be special.”

Deadpool & Wolverine went on to become the biggest R-rated movie of all time following its theatrical release this summer, a true cinematic event during a challenging time for the film industry. To mark its arrival on Disney+, ILM.com caught up with Papaix to discuss how ILM realized some of the blockbuster’s impressive visual effects in the Void sequences. Grab a chimichanga and join us.

(Credit: Disney)

A sunny day in the Void

As fantastical as the Void may sound, director Shawn Levy and star/producer Ryan Reynolds (Deadpool) aimed to make it a believable setting and something that fans could relate to. “They really wanted it to feel as grounded and real as possible,” Papaix says. To achieve this, the filmmakers started the old-fashioned way, more or less.

“One thing for them was to shoot in natural conditions. That’s why most of the shoot was outdoors,” Papaix says. “So they shot in a landfill in the UK and in various locations in the UK to get that kind of natural light feel.”

This did create certain challenges, however. “You don’t control the elements,” Papaix says simply.

When Deadpool and Wolverine (Hugh Jackman) first arrive in the Void, they quickly proceed to beat the tar out of each other. This fight was shot outdoors in summer 2023 and then, due to the writers’ and actors’ strikes, finished in winter 2024. As a result of the pause and change in seasons, the color of the sky appeared slightly different. Though Levy and Reynolds initially hoped to digitally correct any inconsistencies in the look of the sky, ILM encouraged the filmmakers to keep this to a minimum. “One thing that was great about working with Shawn and Ryan and [Marvel visual effects supervisor] Swen Gillberg was that they are very collaborative,” Papaix says. “We did a few shots and a few tests and we realized the best outcome was to embrace the plate. So, based on the plate, if it’s sunny, let’s try to augment that. If it’s stormy, let’s try to be more stormy and then we’ll look at how it plays in the cut. And every shot was kind of hard to direct in that way. It’s making sure that it plays nicely, but if you look at the sequence, there’s some variation as you would have in natural daylight. You can be in an area and within 10 minutes it can be from sunny to cloudy to stormy, depending on what is happening. So we focused on making it look as real as you can.”

Finally, to increase the grandeur of the Void, from scale to background elements, ILM came in to digitally augment what was captured in-camera.

“Our work was focusing on creating a seamless transition from the foreground set to a CG extension of the Void,” Papaix says. “Overseeing adding everything that was needed to the Void, including the detritus. There are all those different objects scattered throughout the Void. So obviously [we were] making sure they integrated, but the Void needed to feel real, and not feel like the foreground was on a stage in bluescreen extended into a CG world.”


A new villain emerges

The evil twin of Charles Xavier, Cassandra Nova (Emma Corrin) debuts in Deadpool & Wolverine as the ultimate authority in the Void. Unpredictable and hugely powerful, she’s a frightening villain that Wolverine and Deadpool must overcome. With guidance from Levy and Reynolds, ILM set out to illustrate her abilities in a subdued but unnerving way.

“She can control a lot of things with her mind,” Papaix says. “They wanted something fairly subtle to not overpower what was the power. It was important to show what it was doing to the people and not too much to [show] the power itself, not too much magic or anything. So it was more a subtle distortion to explain that there’s something happening.”

And what Cassandra does is indeed creepy: She seems to have a predilection for passing her fingers through the skulls of her enemies, including the Merc with a Mouth.

“We went through different aspects, from being creepy and caressing his face with almost spider-like fingers. All that was digital and a very complex simulation to kind of deform the masks in CG. What gets tricky is that it’s easy to do a collision, but we had to do a half collision and half penetration going through. So that’s actually a very complex simulation to control. And it was fully art-directed, meaning we had to control every aspect of the effect. We started with the performance of the fingers, and once we had the right emotion, then we worked on the simulation of how the mask should deform and, at the same time, kind of breaks open to let the finger go through.” In the end, Papaix was more than happy with the result.

“I read a lot of great reactions. People felt an itch, a little bit. It feels creepy but in a good way, because that’s definitely what the filmmakers were after.”

(Credit: Disney)

Johnny Storm’s quick exit

Cassandra sends Johnny Storm (Chris Evans reprising his original Marvel role from the Fantastic Four films) to a truly unfortunate demise, ripping off his skin and driving him into the ground. It’s a shocking moment—gruesome with a dark sense of humor—in a movie full of them.

“This was part of the script from day one,” Papaix says. “That was a moment that was very important for the filmmakers.” But where to begin for an effect so unlike anything previously seen in a Marvel movie? “Ripping out the skin was very graphic, so we had to study images.”

ILM turned to Real Bodies: The Exhibition, a long-running museum showcase that features actual human specimens, for reference. It made for a decidedly unique creative process. “The real [body in the exhibit] is very dry and has been preserved. We wanted to make it look fresh, so we had to add a lot of blood and liquids to make sure we felt that this just happened. So we are dripping blood, dripping fat. That was very gross. The daily session with the artists was always interesting.”

Once ILM knew how the effect should look, they began building a digital Johnny.

“The way we proceeded with this was creating an asset,” Papaix explains. “So a skeleton asset, we called it, with all the flesh and all the organs in there. We based everything, all the proportions, on Chris Evans. We have his scan. We created a digital version of Chris for Johnny Storm, even for the Human Torch version when he was on fire.”

Then it was time to get down to the de-skinning business.

“So we started from that and then we ripped off his skin. It’s pretty much what you can imagine, but in CG,” Papaix says with a laugh. “The shell of the clothes and skin was removed, revealing the skeleton with all the flesh. We tried to create some strings of blood coming out of him.” In an effort to maintain the series’ comedic tone, ILM added elements intended to make the scene a little more Looney Tunes and a little less Hellraiser.

“It was kind of a cartoony moment, but in a good way — he has that moment blinking his eyes, and it’s like, ‘What just happened to me?’ And then he drops.”


Gambit gets his day

The Void segment culminates with Deadpool, Wolvie, and a band of fan-favorite heroes launching a siege against Cassandra and her forces. While fans delighted in seeing each of them back in action, one hero in particular required visual effects that are essential to the character.

“A lot of attention was put on Gambit [Channing Tatum],” Papaix says. “We studied a lot of the comic books to see what was happening with his cards and [mutant power].” In the comics and iconic X-Men cartoon, Gambit charges playing cards, resulting in a purple glow; when he tosses them, they leave a trail and explode on impact. “We went through a range of ways of showcasing the power, to the point that I remember a version where we probably went too far — too glowy and too flamey-looking. And that’s something that was not pleasing to Shawn, for good reason. He wanted it to be grounded, again, in reality. So the cards — it’s the X-Men and all, but it’s important to have the cards telling the story.”

As a solve, ILM illustrated a slower buildup of Gambit’s mutant power. “We were focusing mostly on the card and the energy within the card. There was a closeup in the cavern, when you see the card activating, and it’s within the pattern of the card. For the battle, we made some trails to be able to see it, because a card is very small. True to the comic.”


A lasting collaboration

Deadpool & Wolverine is a success for Papaix on several levels, from the film’s commercial and critical reception to reasons that are more personal.

“I had the chance to work on the first Deadpool in 2016. Time flies. So this one already was quite special in my career, and having the opportunity to supervise the third one was also quite special. Knowing that Hugh Jackman was attached as Wolverine, there were so many good things.”

But looking back at the film, what he seems to value most is the time spent with Levy, Reynolds, and Marvel. “They were great collaborators. Obviously, [Shawn is] the director and he makes his call, but he was very keen on hearing people’s suggestions. The collaboration for me is one of the highlights of the show, with Swen and with Marvel, and pitching those ideas to Shawn and Ryan. They also thanked us. We know that’s something that not every filmmaker does, but at the end of the project we got a thank-you video from Ryan and Shawn to share with our team at ILM, and it’s always fun to see that they appreciate the work. Obviously, they see the people on set, but when you do post-production, they receive the image. So they don’t really realize that we were 275 people making this happen. We did about 30 minutes of the movie, 614 shots, and it was a global team. It was mostly Vancouver and San Francisco, but also other ILM sites working with us. But it was 275 people. That’s quite a big group of people making it [possible] to show those crazy visual effects on screen.”

(Credit: Disney)

Dan Brooks is a writer who loves movies, comics, video games, and sports. A member of the Lucasfilm Online team for over a decade, Dan served as senior editor of both StarWars.com and Lucasfilm.com, and is a co-author of DK Publishing’s Star Wars Encyclopedia. Follow him on Instagram at @therealdanbrooks and X at @dan_brooks.

Step inside the film — with Sprite, an Eternal, as your guide. Go on an epic Augmented Reality adventure through time and space to discover the truth about humanity. Enter the world, learn the backstory, and meet the characters in Marvel Studios’ first Immersive Story Experience. This mini prequel lets you explore the story like never before and become a part of the action.

Marvel Studios, Industrial Light & Magic, and the Technology Innovation Group at Disney Studios Content have teamed up to bring the Eternals to your living room through an exciting augmented reality experience.

Industrial Light & Magic is thrilled to announce an exciting partnership with Disney’s Technology Innovation Group on Marvel Studios’ Eternals: AR Story Experience for the iPhone® and iPad®. In this augmented-reality app, the characters, world, and stories of the Eternals film have been brought to life like never before.

“I was so excited to get the call to come work on the Eternals: AR Story Experience,” said Danielle Legovich, Visual Effects Producer at ILM’s London studio. “What I found wonderful throughout the process was the incredible collaboration with the team at Disney, along with all of the content creators on the project. They were all such lovely people, and we were able to combine talent in such a profound way. With Disney coming from that background of games and apps, and ILM coming from the visual effects point of view—and having worked on a large portion of the Eternals film—it made for a really wonderful partnership. At ILM, we’re used to creating images that people view in a darkened cinema, so to be able to work on images that people would then bring into their home through AR was so much fun. The experience became so immersive for me personally during the creative process that I would imagine this Deviant just exploding up from my kitchen floor. I really loved that.”

Film-quality VFX assets used in the app. Image courtesy of Disney Studios Content / Marvel Studios.

As you might expect, what makes this experience so unique is the augmented reality aspect. You’re able to step into that world and meet these characters without having to leave your home. Achieving that level of detail brought a host of exciting challenges. Edmund Kolloen, Computer Graphics Supervisor on the project, explains, “What was thrilling for me was trying to get the same quality that we would push out of a final render, and get that to look and act the same in a real-time application. The challenge, of course, was getting that data from our render package and pushing it to the pipeline at Disney. There was a lot of really great cross-collaboration on both sides, building it as we went along. The results were amazing. You can walk right up and have a look at these characters, because they’re the exact same digi-doubles from the film. We worked diligently to ensure that every facial expression and every emotion comes through.”

Recording actress Lia McHugh using over 100 cameras. Image courtesy of Disney Studios Content / Marvel Studios.

On the Disney front, the Technology Innovation Group developed a host of cutting-edge tools and techniques during the iterative process of translating that data from ILM. Evan Goldberg, Manager, Technology Innovation Research at Disney Studios Content, recounts, “We had a small but mighty team here at Disney to put this project together. Daniel Baker was the Producer, and was the beating heart and metronome of the project. I’ve been here at Disney for sixteen years, with a history of feature film production, animation, and VFX experience. My role on this project straddled the line between Tech Supervisor and VFX Supervisor. Both my team and Industrial Light & Magic really wanted to come from a place of authenticity for the experience, and to be as faithful as we could to the source material. When you see a still from the AR experience, it should feel like a still from the film, and we were able to do that by working directly with ILM. They were very open to adapting their pipeline to conform to what we needed on our end. That allowed us to collaborate more quickly, and make something that had a visual fidelity on par with the film, but rendered in a fraction of a second. It was so incredible to see new technologies born out of that process.”

Pre-visualization of in-app scene. Image courtesy of Disney Studios Content / Marvel Studios.

Even with all of that innovation, the teams still faced a daunting undertaking: creating a cinema-quality AR experience that would also carry the Marvel Studios name. “We knew that we had to match the quality of what people see in the cinema,” explains Daniel Baker, Senior Producer and Manager, Technology Innovation at Disney Studios Content. “The iterative design process with ILM was so helpful, because it ensured that we were always working with the latest assets. The pre-visualization work, along with that review and iteration process, was really exciting. Since we were working from home, and across multiple time zones, we had to really make the most of the time we had. So to get everyone to go out into their backyard to play with the experience, and really give it that high level of scrutiny and pixel-by-pixel accuracy, was a lot of fun.”

Kolloen sums up the Eternals: AR Story Experience perfectly: “One of the exciting things for me was to see the Deviants in that augmented reality environment. Nothing prepares you for the moment you walk outside with your iPad® and see this creature that’s the size of your house.”