ILM Evolutions: Pushing the Boundaries of Interactive Experiences

Aug 11, 2025

Celebrating Ten Years of Immersive Entertainment at ILM

By Amy Richau

“ILM Evolutions” is an ILM.com exclusive series exploring a range of visual effects disciplines and highlights from Industrial Light & Magic’s 50 years of innovative storytelling.

In immersive stories, the fan is the hero.

Industrial Light & Magic (ILM) has always been at the forefront of innovation, drawing audiences into new worlds by pushing technological boundaries. As part of our special series, “ILM Evolutions,” ILM.com talked with Vicki Dobbs Beck (vice president, immersive content innovation), Julie Peng (director of production), Tim Alexander (visual design director), Ben Snow (senior visual effects supervisor), and Shereif Fattouh (executive producer) about the past, present, and future of ILM’s immersive storytelling.

ILM’s LiveCGX team, including visual effects supervisor Mohen Leo (bottom).

A New Way to Tell Stories

“Let’s invite our fans to step inside our stories in ways that had never before been possible.”

 – Vicki Dobbs Beck

While ILMxLAB was formally established in 2015 to explore the possibilities of immersive storytelling, the seeds of this endeavor actually began much earlier for Vicki Dobbs Beck. In the 1990s, Beck worked at Lucasfilm Learning, where a certain prototype caught her attention. “It was called Paul Parkranger and the Mystery of the Disappearing Ducks,” Beck tells ILM.com. “And what was really cool about all of the projects we were doing at that time is they really did sit at the intersection of storytelling, interactivity, and high-fidelity media – such that it was back then – all through an educational lens.”

Fast-forward to Beck’s time at ILM as head of strategic planning, when she began bringing together talent from ILM and LucasArts (now Lucasfilm Games). “It was kind of this little rebel unit that was doing some pioneering R&D [research and development] in high-fidelity, real-time graphics,” says Beck. “Their success gave us confidence that the foundation was in place to build an immersive storytelling studio, expanding on the R&D work done by teams like the Lucasfilm Advanced Development Group (ADG).”

In 2015, ILM and Lucasfilm announced the formation of ILM’s Experience Lab (ILMxLAB) – a new division that would combine the talents of Lucasfilm, ILM, and Skywalker Sound. Lynwen Brennan, then Lucasfilm executive vice president and ILM president, said at the time, “The combination of ILM, Skywalker Sound, and Lucasfilm’s story group is unique, and that creative collaboration will lead to captivating immersive experiences in the Star Wars universe and beyond. ILMxLAB brings together an incredible group of creatives and technologists to push the boundaries and explore new ways to tell stories. We have a long history of collaborating with the most visionary filmmakers and storytellers, and we look forward to continuing these partnerships in this exciting space.”

The Holocinema team.

From its inception, ILMxLAB’s mission of telling immersive stories on emerging technology platforms made it an appealing destination. Julie Peng, who had worked in Lucasfilm Animation as a production manager on projects like Star Wars: The Clone Wars (2008-13) and Strange Magic (2015), was looking to break into the emerging interactive storytelling space when she received a call about a new ILM division that would focus on technologies like augmented and virtual reality. “When we started, we were five people,” remembers Peng. “I did my best to take care of any need that arose, big and small, from developing the production infrastructure to writing job descriptions, to ordering pizza and running to the store for batteries. It was about doing whatever was needed to build a team and start exploring what we could bring to the immersive entertainment space.”

In 2016, the studio debuted its first VR experience, Star Wars: Trials on Tatooine, where the Millennium Falcon lands in front of players, and they help R2-D2 and Han Solo with repairs. This was an important step in the studio’s goal of creating a living world. “Everybody was just so blown away by the scale,” says Beck, “because that’s something that VR is so good at – delivering scope and scale.”

Looking to the future was also always a part of the plan. “Because we were so early in the whole immersive storytelling space, we really wanted to help drive the industry,” says Beck. “So we actually very consciously shared our prototypes in public. We spoke about them. We made them available to people because we wanted to actively inspire others to create in this space alongside us.”

Over time, the team evolved into a mix of creatives from the film industry and people with backgrounds in games and interactive development. While bringing in developers with both backgrounds was essential, it also brought challenges for Peng in her role as production manager. “Early on, I realized that they spoke two different dialects,” says Peng. “They used similar terminology, but their approaches to making a creative product were quite different in terms of process and priorities. I found myself becoming a bridge, translating concepts and driving the development of a common language so we could all communicate effectively.”

The team also had to be comfortable with fluidity, as the technology they were working with was constantly evolving. Peng noted that staying abreast of what was going on in the industry was key, as was a leadership team willing to take some risks. “I always call it ‘holding hands and jumping off the cliff together’.”

The First Big Leaps

“It [VR] really is like stepping into a different world, and it feels totally natural once you’re there.”

– Julie Peng

Visual effects supervisor Tim Alexander became involved with ILMxLAB after a history in traditional visual effects, including the 2015 blockbuster Jurassic World. He was also a lifelong gamer intrigued by the work ADG was pioneering at the time: bringing real-time, game engine-type techniques into visual effects. When director Alejandro G. Iñárritu approached ILM about a collaboration on a virtual reality project, Alexander came aboard as visual effects supervisor. The result was CARNE y ARENA, which debuted in 2017.

Still early in ILMxLAB’s history, CARNE was an ambitious project: a short VR piece bookended by physical experiential rooms that put the audience into a story of immigrants being detained while crossing the border from Mexico to the United States. At the beginning of the experience, participants are brought into a cold physical holding cell, where they have to remove their shoes and items like backpacks. “There are ambient noises and real artifacts like abandoned shoes that have been found in the desert, from people crossing,” notes Alexander.

The key art for CARNE y ARENA.

Participants are fitted with a VR headset and led barefoot into a 50-foot-by-50-foot room full of sand. In the VR portion of the experience, they find themselves among a group of immigrants attempting to cross the U.S.-Mexico border at night when they are stopped by U.S. Border Patrol agents. After the VR story, participants exit and are led down a hallway where video monitors play interviews with the real people CARNE is based on. “He [Iñárritu] cast people that had crossed the border as the people within this experience and wove a story around that, so you actually see the real people and hear their experiences,” says Alexander.

CARNE was a challenge from both an artistic and an engineering standpoint. What Iñárritu and Alexander wanted to do was sometimes hindered by the technology of the day. Wanting the images to appear as photoreal as possible, the team realized the immersive film’s computing requirements exceeded what headsets of the time could handle, so Lutz Latta, an ADG graphics engineer, designed a supercomputer with four high-end GPUs (graphics processing units) to handle work such as calculating shadows in the film.

Other challenges included allowing participants to traverse and turn around in a 50-foot-by-50-foot room. “At the time, there was no way to really run a VR headset over more than 100 feet. You were lucky if you could get five feet away because of the HDMI cables and all kinds of things,” remembers Alexander. VR tracking abilities at the time were also far below where ILM’s engineers wanted them to be. “So then we started mixing in stuff that we know from visual effects of how to track cameras in large spaces. A motion capture stage was built to track the headset instead of the camera we would usually track. So it started becoming a mixture of different things that we knew how to do for different reasons, and kind of applying it to this situation.”

A final frame from CARNE y ARENA.

The newness of the technology and the goals the team wanted to achieve with CARNE meant everyone had to adapt and be ready for anything. “It was the first project in my career that I was actually concerned that I would not be able to deliver,” says Peng. “Because in all of my past projects, I would have production plans A, B, C, D, and E in my back pocket. We were working with new technology and making something that had never been created before. There was no model of how to do that, which made me feel like I was operating without a parachute. That can be very nerve-wracking but also exhilarating when you actually finish the project. That sense of completion and accomplishment was huge.”

Audience reactions to the very visceral experience were all over the map. “We had people that really wanted to get into the middle of it, and they would look at every character and perhaps even jump behind a virtual bush to hide, while others might hang back to observe the scene, whether due to fear or other emotions that came up,” says Alexander. “The overall sense that I got was that people really understood what Alejandro was trying to say,” he adds. “They heard it, and they understood what he was trying to express through that story.”

The studio’s debut with The VOID, Star Wars: Secrets of the Empire, also took place in 2017. At The VOID, up to four fans would suit up with their gear: a VR headset connected to a backpack laptop and a haptic vest. From there, teams of fans were immersed in ILM’s digital world – in this case, one connected to Rogue One: A Star Wars Story (2016), giving fans the adventure of a lifetime on Mustafar near Vader’s castle. While infiltrating an Imperial base, they would traverse the facility together and try to recover a key artifact.

Dropping Into the Story

“If we’re creating an experience, we want people to feel like they’re genuinely in a Star Wars project.”

– Ben Snow

Vader Immortal: A Star Wars VR Series

While working on Secrets of the Empire, Ben Snow (visual effects supervisor for Star Wars: Attack of the Clones [2002] and Iron Man [2008]) was recruited to work on a connected story in development for home use, Vader Immortal. In the project’s early days, a prototype was put together to see what it was like to be in the same (virtual) space as Darth Vader – spoiler: it’s terrifying. At the same time, Oculus was quietly working on the first Quest headset, revolutionary for its tetherless design, which ultimately influenced the amount of lightsaber play in the story. The stars and companies aligned, and the Oculus Quest became the platform for release.

In Vader Immortal, fans take the role of an unnamed pilot who finds themself inside Darth Vader’s castle on Mustafar. The fan’s interactions with Vader were, of course, key to the success of the project. “Mustafar should be scary,” notes Snow. “The confrontation of meeting Vader should be scary. Because that’s what he is.” The Immortal team used scans of Vader’s costume from Rogue One (itself based on the original 1977 film, Star Wars: A New Hope) and built on them with new scans to push the realism even further.

Concept art from Vader Immortal: A Star Wars VR Series by Russell Story.

The team worked internally with Lucasfilm to develop story ideas for three episodes. David S. Goyer, screenwriter of The Dark Knight Rises (2012), wrote scripts around them, injecting his own characters. These were similar to traditional film scripts, which then had to be made more interactive by adding lines of dialogue prompting the fan to perform certain tasks. The production brought together the film and games worlds as the team put it all together. “In film visual effects, you get a script, you break it down. These are the assets you have to build,” explains Snow. “Interactive entertainment is much more free form and evolutionary. It was an interesting blend between those two mediums.”

The goal with Immortal was always the same: create an experience unique to virtual reality that you couldn’t experience by watching a movie. “One of the things that excited us,” says Snow, “was this was a chance to eavesdrop on Vader a little bit. We had the moment where Vader takes off his helmet, and he’s looking at a memory, almost, of Padmé. You’ve been climbing around, find yourself in Vader’s chamber, and you’re peering through these walls at him. We felt that moment of actually being an interloper, and seeing a side of the character you hadn’t seen before was something that was unique to what we could do in VR.”

Another element distinct to Immortal was making Vader the fan’s own teacher during the experience. The Sith Lord’s introduction is fittingly terrifying for such an iconic character. Initially, when Vader stepped up to the fan, he had a few lines of dialogue. But those lines were eventually cut after some internal tests of the experience. “Vader’s in the distance, and he comes toward you, and you hear the heavy breathing and footfalls,” says Beck, “and he keeps walking toward you, and it becomes more and more intimidating. Almost no one heard the dialogue because you’re so overwhelmed by his presence that it’s all that you can absorb.” Adding to the power of that moment was the eye-tracking in the experience, so no matter the height of the fan, Vader was looking right at you. “And the fact that you’re being acknowledged by a character like Vader is just mind-blowing,” adds Beck.

Actor Maya Rudolph (left) performs the voice of ZOE-3 in Vader Immortal as director Ben Snow (middle) and writer and executive producer David S. Goyer look on.

What If…? – An Immersive Story

Shereif Fattouh came to ILM from an AAA games (high-budget, high-profile games from large studios) background, working on titles like Battlefield and Dead Space at Electronic Arts. Interested in story-driven projects, Fattouh worked on The VOID projects Ralph Breaks VR (2018) and Avengers: Damage Control (2019). The development of a new headset, the Apple Vision Pro, led to Fattouh’s involvement in What If…? – An Immersive Story (2024), an experience that uses both mixed reality and virtual reality, along with hand and eye tracking, through Apple’s innovative headset technology.

Marvel Studios’ What If…? series gave the developers a great amount of freedom in one of the most popular story worlds on the planet – the Marvel Cinematic Universe. “What If…? is such a great vehicle from the comic books and then to the animated show, where you get to just play in a sandbox,” says Fattouh. “What if this happened, and it’s a completely different version of it, and that kind of creative freedom just allowed us to tell the story that we wanted.”

Similar to previous ILM projects, the What If…? team was working on a project without the tech it would need to bring the experience to fans, as the Apple Vision Pro was being created in parallel. “We started development really early on,” says Fattouh. “It was a great collaboration with Marvel Studios, Disney+, and Apple, but we were definitely doing early, early testing and kind of figuring it out as we went.”

A final frame from What If…? – An Immersive Story.

What If…? – An Immersive Story took about 18 months from the initial concept for the experience to arriving in fans’ hands. Getting there involved finding the balance between fans watching the story unfold and directly engaging with the characters and environments. “It’s really subjective,” says Fattouh. “There’s no right answer. How much do we want the audience to really observe this amazing story that’s being told and being kind of talked at versus going in and doing things and impacting the narrative? So that was really one of the biggest challenges throughout the whole life cycle. Playtesting it and figuring out, ‘Okay, is it feeling right? Is this beat too long? Is it too short? Do we want to have people jump in and get into the action a little bit faster?’”

During What If…? – An Immersive Story, the Watcher enters the room where a fan is situated. Throughout the story, fans see and interact with versions of some of their favorite Marvel heroes and villains, including Wong, Thanos, Hela, and Wanda. Fans are active participants in the story and get to use iconic items from the Marvel universe, like the Time Stone, to move the story forward.

Fattouh also notes how What If…? gives fans a unique way to experience a familiar Marvel moment near the beginning of the experience. “You don’t really know what’s going on because it starts with a disembodied voice, and you’re in space,” says Fattouh, “and then we kind of kick off in a very Marvel way, where it has that iconic Marvel logo flip book entry. But we did a very 3D spatialized version, where it’s coming into your living room. Just getting to see the smile on people’s faces when they saw something they’ve seen a lot in the films, but to see it really coming out in your living room … it set the right tone of, ‘Oh, this is something different.’”

Marvel director Dave Bushore (center) confers with Immersive crew members during production of What If…?, including Maya Ramsey, Patrick Conran, Marissa Martinez-Hoadley, TBD, and Joe Ching.

What the Future Holds

After ten years, the team remains small, retaining its nimbleness on a quest for innovative excellence. Working with multiple partner studios and collaborators, the immersive team staggers projects, typically with two in production at a time and production timelines of 12 to 24 months. “I think over time, our goal will be to expand that capacity and capability,” says Beck. “It might mean expanding it in other studio locations – maybe in London or in Vancouver. The size of the team we have is really nice because everybody knows each other. We can iterate together, and that’s a really important part of interactive, immersive experience development.”

The immersive team has high hopes looking to the future as the technology reaches a wider group of people. “Venues like Star Wars Celebration are always amazing,” says Peng, “because the technology is still growing, and it gives us a chance to share our stories directly with fans. It’s also rewarding to see the accessibility of our experiences making it feel entirely organic and inclusive for everyone.”

Beck looks forward to hands-free AR glasses that can deliver a high-fidelity image with a wide field of view. “We are very excited about this idea of storyliving at city or world scale,” says Beck. “Geo-located content where you could be out in the world in your glasses and little story moments would unfold in the real world.” Beck also sees more people who don’t consider themselves gamers gravitate towards immersive stories. “And I think that’s really great for us because we’re interested in that intersection of story and interactivity and putting you at the center of that experience.”

ILM’s team on What If…? won an Emmy for Outstanding Innovation in Emerging Media Programming. From left: Elizabeth Walker, Ian Bowie, Lutz Latta, Marvel’s Dave Bushore, Vicki Dobbs Beck, Mark Miller, My-Linh Le, Julie Peng, Pat Conran.

Looking ahead, the future of immersive stories is limited only by the imaginations of writers, designers, and engineers devoted to bringing these experiences to audiences. “I think that there’s a huge opportunity for ILM in immersive entertainment broadly defined,” notes Beck. “When we first started, the word ‘immersive’ almost always meant virtual reality, then it included augmented reality, and eventually mixed reality. But now it’s also being used for linear or pre-rendered content that’s very immersive through screen technology, like ABBA Voyage (2022), as an example. The opportunity is to take our talents across the global studios, which include the highest quality visuals and sound, and couple that with the real-time understanding and capability, bringing those things together. I think that we’re going to start to see an increasing desire for interaction, where you are actually in an experience, doing something meaningful that makes the overall experience even more personal. And beginning to understand what that is and taking steps toward a storyliving future – I think that’s the big opportunity for ILM.”

Currently in development in partnership with Meta Quest is Star Wars: Beyond Victory – A Mixed Reality Playset, which takes players into the fast-paced, high-stakes life of a podracer.

Read more “ILM Evolutions” stories here on ILM.com.


Amy Richau is a freelance writer and editor with a background in film preservation. She’s the author of several pop culture reference books, including Star Wars Timelines, LEGO Marvel Visual Dictionary, and Star Wars: The Phantom Menace: A Visual Archive. She is also the founder of the 365 Star Wars Women Project, which includes over 90 interviews with women who have worked on Star Wars productions. Find her on Bluesky or Instagram.