
The layered shading system has been the standard at ILM for years and continues to influence the wider visual effects and animation industries.

By Lucas O. Seastrom

(Credit: Academy of Motion Picture Arts and Sciences).

2026 marks Industrial Light & Magic’s 39th Scientific and Technical Award from the Academy of Motion Picture Arts and Sciences. The recipient innovation “Lama” – its name derived from the first two letters of each word in the term layered materials – is the first modular, production-ready, commercially available layered shading system of its kind in the visual effects and animation industries. Recognized on the award are Lama’s lead originators: former ILM lookdev supervisor Jonathan Moulin and former ILM rendering engineers Vincent Dedun and Emmanuel Turquin.

The concept for Lama first emerged ten years ago as a means to solve what had become a common problem with shading and rendering computer graphics imagery. A typical layered material network defines how light interacts with a digital surface like metal, wood, or skin. Light can reflect off a surface, but it can also refract between multiple, differing layers. Until 2016, material systems were commonly built for the specific types of imagery in a given production. They were rigidly designed and often difficult to share between productions. This inflexibility made it challenging for artists to adjust their work quickly while preserving the physical realism of their images.

“In the early days of rendering – i.e. writing shaders to make objects look like real objects that are in fact CG objects – we had purpose-built shaders,” explains principal R&D engineer André Mazzone, who has been involved with Lama since its inception and currently manages the product. “There were shaders for glass, skin, metal and everything else. It was insular and isolated. Then there was a period when we developed general purpose shaders that would combine multiple properties. In certain cases, some parts of an asset might be clear but others might be opaque. For example with an eyeball, there’s a white, cloudy area but then there’s a transition into a transparent region where the lens is focusing light onto the retina. This blending needs to be smooth, so we require an expressive shader that comprises all of these behaviors. General purpose shaders were fixed in their designs as templates. If we wanted additional behavior, we had to jump in and code it. On Rango, they needed more dirt controls, so we had to splice in new pieces of code to make upgrades. That’s how it used to work.”
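The eyeball transition Mazzone describes – a smooth spatial blend between an opaque response and a transparent one – can be illustrated with a toy mask function (purely illustrative, not ILM shader code; all names here are invented):

```python
def smoothstep(edge0, edge1, x):
    """Standard smooth Hermite ramp, widely used for soft shader transitions."""
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def eyeball_opacity(dist_from_center, lens_radius=0.4, blend_width=0.1):
    """Toy spatial mask: fully transparent over the lens region, fully
    opaque in the sclera, with a smooth transition band between them.
    (Radii and widths are made-up illustration values.)"""
    return smoothstep(lens_radius, lens_radius + blend_width, dist_from_center)

# Opacity ramps smoothly from 0 (over the lens) to 1 (in the sclera).
vals = [eyeball_opacity(d) for d in (0.0, 0.45, 0.6)]
```

A general-purpose or layered shader would use a mask like this as the weight when mixing two material responses, so the blend between behaviors stays smooth across the transition band.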

To eliminate this often cumbersome process, Lama was envisioned as a modular system where materials are layered and combined without the need for customized code. It’s a simple, lean, and artist-friendly method that ensures both physical accuracy and creative flexibility.

“The way Lama decomposes material responses is akin to the historical bespoke shader solutions for different materials, but the glue is now something that an artist can apply instead of an engineer,” explains Mazzone. “The engineering job is to provide all of the building blocks that might be needed, and the artists can make new additions themselves. This is Lama’s true strength. It employs an infrastructure that conserves energy across material layers. We had experimented with this in the past, but not in a way that allowed general arbitrary layering. This commitment to automatic physically-inspired energy conservation while rearranging components is what has made this tool so flexible and useful.”
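The energy conservation Mazzone mentions can be sketched in miniature (a toy model, not Lama's implementation; names are illustrative): when one layer is stacked over another, only the energy the top layer does not reflect reaches the base, so the combined response can never exceed the incoming light, however the layers are rearranged.

```python
def layer(coat_reflectance, base_reflectance):
    """Toy energy-conserving vertical layering of two material responses.

    The coat reflects a fraction of the incoming light; only the
    remainder (1 - coat_reflectance) reaches the base, and the base's
    reflection is attenuated again on the way back out. Illustrative
    only: a real shading system works per wavelength and per direction,
    and accounts for absorption and multiple inter-reflections.
    """
    transmitted = 1.0 - coat_reflectance  # energy passing through the coat
    return coat_reflectance + transmitted**2 * base_reflectance

# A 4% glossy coat over an 80% diffuse base: the combined response
# always stays at or below 1.0, i.e. energy out never exceeds energy in.
combined = layer(0.04, 0.80)
```

The invariant is the point: because each layer only redistributes the energy it receives, artists can stack and reorder building blocks freely without ever producing a physically impossible, brighter-than-the-light result.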

Lama began as an incubator project at ILM’s London studio in 2016, and by mid-2017 it was already being used in production. Disney’s Aladdin (2019) was the first film to receive a full Lama deployment, to great success, and Terminator: Dark Fate (2019) brought the tool to the wider network of ILM’s studios. “Any film that includes CG elements from our main-line pipeline – hero creatures, crowds and environments – has been 100% powered by Lama since 2019,” Mazzone notes. “That includes episodic series like The Mandalorian, Skeleton Crew, Andor, and many of the Marvel shows. All main-line assets at ILM now go through Lama.”

2019’s Aladdin was ILM’s first production to fully integrate Lama (Credit: Disney).

However, this was only the beginning of Lama’s impact. At the same time that ILM fully integrated the system, it began sharing Lama’s possibilities with its sister companies Pixar and Walt Disney Animation Studios. Pixar was so taken with it that they chose to adapt the tool into their iconic RenderMan product. Lama premiered in RenderMan 24 in 2021, and since then, studios across the industry have benefited from this ILM-grown innovation, including Laika, DNEG, and MPC, among others. Pixar’s newest feature Hoppers is just one example, wherein Lama’s workflow for hair, fur, and feathers was utilized to great success.

“Most importantly, Lama shifts the artist’s mindset,” says Mazzone. “Materials are now no longer abstract parameter blends, but substrates and layers, much closer to their real-world counterparts. They can be developed independently and combined later, improving efficiency and giving artists and engineers a clear, shared language. This balance – simplicity at the surface, complexity through composition – makes Lama both approachable for artists and robust in production, enabling faster iteration and higher quality outcomes.”

Congratulations to Jonathan Moulin, Vincent Dedun, and Emmanuel Turquin on their Scientific and Technical Award, and to everyone at ILM who has supported Lama’s continued development, including engineering lead and current product owner André Mazzone, former rendering engineer Henrik Dahlberg, rendering engineers Sam Cordingley, Alain Hostettler, Chong Deng, and Khang Ngo, and lookdev supervisors Hugo Debat-Burkarth and Joseph Szokoli.

See the full list of Scientific and Technical Award Winners for 2026.

To learn more about Lama, visit RenderMan’s website.

Lucas O. Seastrom is the editor of ILM.com and Skysound.com, as well as a contributing writer and historian for Lucasfilm.

Behind every complex shot is a network of people supporting, teaching, coordinating, and looking out for one another. Drawing on perspectives from animation, production, training, and talent management, this article looks beyond the work on-screen to explore how everyday behavior, collaboration, and care shape life inside ILM’s Vancouver studio.

By Jamie Benning

(Credit: David Dovell & ILM).

When you arrive at Industrial Light & Magic’s Vancouver office, located in a striking skyscraper known as “The Stack,” the lobby greets you with creatures, props, and costumes tied to the company’s history, while the view beyond the windows reveals one of the most distinctive environments in the ILM network. Glass, steel, ocean light, and mountain silhouettes frame a workspace where some of the most technically complex and creative imagery in modern filmmaking is created. The Vancouver studio is shaped by its artists, influenced by its location, and sustained by a culture built on collaboration and shared purpose.

This portrait of ILM’s Vancouver studio emerges from conversations with people working across very different roles inside the studio: senior visual effects trainer Matt Leonard; lead animator Wesley Chandler; senior talent management coordinator Riya Ramani; and visual effects production coordinator William Wu. Their perspectives are reinforced by insights from Toban Taplin, executive in charge at the Vancouver studio, whose role bridges creative leadership, operations, and long-term studio strategy. Across these conversations, a consistent theme emerges. The Vancouver studio is a place defined by people who support each other, a city that inspires them, and a culture that reflects the best of ILM’s past and present.

Leonard’s role as senior visual effects trainer places him at the center of artist support, sharing knowledge across the studio as tools, workflows, and expectations continue to evolve. As a lead animator, Chandler works directly on performance and motion, guiding teams through some of the most creatively demanding sequences on ILM’s projects. Ramani, as senior talent management coordinator, sits at the intersection of people, logistics, and wellbeing, helping ensure that crews are supported not just creatively but sustainably. From the production side, Wu’s role as visual effects production coordinator focuses on communication and continuity, tracking work as it moves between departments and making sure artists have what they need to do their jobs effectively.

People


The Vancouver team consistently describes an environment shaped by openness, humility, and care. Matt Leonard, who works across ILM’s global studios, sees this as one of the company’s defining characteristics.

“That was one of the things that really drew me to ILM. From the outset, it felt like a very humble group of people. Having been here nine years, it still feels like there are no egos at all, which is staggering when you think about the calibre of people who work here.”

That absence of ego shows up not as a slogan but in everyday interactions. Production staff move between desks, checking in on shot progress. Artists gather for dailies, where work is reviewed openly, with feedback offered constructively by all present. Trainers circulate through departments answering highly specific technical questions. Talent managers quietly track crew wellbeing alongside schedules and contracts. The studio functions as an interlocking system, where each role supports the others.

That sense of care is reflected not only in how people are supported during difficult moments but also in how their time and energy are respected between projects. Wesley Chandler recalls how that approach stood out to him early on.

“I really loved how artist-focused ILM tries to be. That stood out to me quite a bit. I was finishing a project, and my talent manager at the time asked me, ‘Do you want to take some time off after this?’ Then I asked, ‘What do you mean?’ Usually, in visual effects, you go from one very busy project straight to the next. The idea that people could take time off if they wanted to really stood out to me. It felt like they genuinely wanted to make sure artists were well taken care of.”

For Toban Taplin, that environment is not accidental. His own background as an effects artist continues to shape how he thinks about leadership and studio culture.

“When I look back at my time as an artist, the places where I did my best work were the ones where the environment was good, and the people around you were all pulling in the same direction. The challenges on a show don’t feel quite so daunting when you’re sitting next to people you get on with, and feel supported by. A big part of my job is helping to create that environment so people can do their best work.”

For many, that sense of support extends far beyond project deadlines and delivery schedules. Chandler joined the Vancouver team when the industry itself was undergoing significant change, and he experienced that culture at a deeply personal level. “I’m incredibly grateful for how ILM supported my family and me, including giving us time to process a loss in the family. It really felt like they cared about my well-being as a person, not just what I could produce at work.”

That feeling of being valued as a person, not just as a contributor to a shot or a sequence, echoes across departments. Riya Ramani experienced that sense of belonging so strongly that she returned to ILM after a period working abroad. “My journey through different studios eventually led me back to ILM in Vancouver, which I now consider my ohana. What brought me back wasn’t just the work, but the people and the genuine sense of community that makes this place so valuable.”

Even those at earlier stages in their ILM careers feel actively encouraged to participate, learn, and grow. Staff describe an environment where questions are welcomed and curiosity is rewarded, creating a studio culture that supports learning alongside delivery.

Across every role – production, artists, training, and talent management – the language is consistent. People feel supported, listened to, and encouraged to ask questions. It is a culture built as much on kindness as it is on craft, where emotional intelligence is valued alongside technical mastery.

While the work on-screen often draws the public spotlight, the Vancouver studio is sustained by a much wider network of expertise. Production, talent management, training, facilities, IT, and operations all work in parallel with the artists. Schedules are shaped, careers are guided, systems are maintained, and problems are solved quietly in the background.

Taplin recalls a message forwarded to him by a manager, written by an artist after an ordinary day at work. “They talked about coming into the studio, having breakfast that morning, then later picking up their production gift, and finding hot chocolate and donuts waiting upstairs. They were working on a Star Wars project, surrounded by memorabilia, and they said it felt like they were living their best life that day. Being able to share that feedback with the teams who created that experience is really important. It helps people see that what they’re doing matters.”


Place

Vancouver’s geography is central to the experience of working here. The proximity of mountains, forest trails, and the Pacific Ocean offers people across the studio a balance that many describe as both grounding and energizing. It is a city where an intense day at the workstation can be followed by a swim, a hike, or an evening on the beach. The natural world sits unusually close to the digital one.

Matt Leonard explains the appeal of the surrounding environment. “Within 10 or 20 minutes, you can cross a bridge into the North Shore and suddenly be in the mountains, or head the other way and be on the beach.” For Chandler and his family, that access to the outdoors is part of daily life. “My wife, daughter, and I love the outdoors! There are so many trails around here. We love to do a lot of hiking and camping!”

For William Wu, the character of the city runs deeper than its landscape. Vancouver’s multicultural identity shaped his upbringing and continues to shape his experience at ILM. “For me, Vancouver is home. Growing up in an Asian household, I was never tied to just one culture or one community. I was always surrounded by different cultures, and that became normal. People here are curious about what you appreciate in your culture, what you do for holidays, what your day-to-day life looks like. There’s a real willingness to learn and be open, and people are very kind and respectful. Vancouver is incredibly rich and diverse, and it doesn’t feel like anywhere else in the world.”

Taplin’s own relationship with the city began as a short-term experiment that became something more permanent. “We moved here on a whim, thinking we’d try it for a year. What made us stay was how accessible everything is. I live on the North Shore now, and within 15 or 20 minutes, you can be on a mountain trail, skiing in the evening, or hiking above the clouds. Even on the many grey, rainy days Vancouver has, you can drive up into the mountains, and suddenly you’re above it all, in the sunshine, with snow all around you. That ability to escape so quickly is pretty amazing. You’re immersed in nature all the time, and that’s incredibly inspiring.”

Vancouver has fully embraced its identity as a production city, with everything from major studio features to independent films and television series shooting across the region. Ramani notices that industry presence almost daily. “Working full-time at the office has its perks – our window overlooks Melville Street, where my colleagues and I have had a blast watching camera crews filming outside The Stack.”

That proximity to live production and nature feeds directly into the studio’s creative energy. Forests become reference, shifting Pacific light influences how people observe color and atmosphere, and rain, mist, rock, and water subtly inform the textures seen on-screen. Vancouver is not just a place where ILM happens; it actively shapes how people here see and imagine.

Author Jamie Benning (left) chats with Matt Leonard (Credit: David Dovell & ILM).

Culture

ILM’s global culture is rooted in a long tradition of collaboration, problem-solving, and shared creative ownership. The Vancouver office reflects that tradition, while adding its own local energy and character.

Training plays a central role in how ILM maintains that culture. Matt Leonard introduces new artists not only to the studio’s tools and workflows, but also to its history. “We run sessions on the history of ILM where we show images from the early days and talk about the people who built the studio. It helps new artists feel part of a much bigger story.”

Access to senior artists and long-time ILM innovators is another constant. Knowledge is not hoarded. It circulates. “You can talk to almost anyone in the company and say you’re struggling or ask how something works,” Leonard says. “People genuinely want to help.”

That openness is visible every day in Vancouver. Wu recalls moments when simple questions lead to unexpected insight, even on landmark films. “I remember someone sending out a question about Jurassic Park, and people who actually worked on the film replied with real details about how those shots were done. It really shows how open the culture is.”

The studio’s hybrid work pattern provides flexibility, but in-person collaboration remains important for many. The ability to sit alongside someone, sketch an idea, or solve a problem together still carries enormous creative value.

“Working from home has brought flexibility that people really value,” Wu explains. “But what being in the studio brings to the collective is different. When senior artists sit next to someone who hasn’t been in the industry for 20 years, that exchange is invaluable. On challenging projects, there’s a real sense of camaraderie that comes from being together.”

Chandler echoes that sentiment from a personal perspective. “For my mental health, I really value being around people. Working fully remote would be difficult for me.”

Ramani sees the impact in small, everyday moments. “I love the spontaneous hallway encounters; sometimes just bumping into a colleague leads to a quick conversation that resolves a challenge on the spot.”

The social culture reinforces those connections. Staff join art clubs, volleyball groups, foodie communities, Inktober challenges, and a wide range of employee resource groups. As Ramani puts it, “The clubs at ILM are definitely a highlight for me. We have a book club, a Pride ERG, a fashion club – there’s something for everyone, and it’s a joy to watch that community expand. It’s wonderful to see our diverse interests celebrated and getting to know my teammates through the things we love outside of our day jobs.”

For Wu, those communities also create everyday moments of creative exchange. “It’s really fun seeing colleagues share their drawings every day during projects like Inktober.”

Culture at ILM Vancouver does not live in policy documents. It lives in behaviour.

Benning chats with Riya Ramani (Credit: David Dovell & ILM).

Work and Innovation

Vancouver contributes to some of ILM’s most complex and ambitious projects. Artists describe an environment where technical advancement grows directly out of collaboration between departments and disciplines.

The studio is one of ILM’s five global studios, with work frequently moving between sites as projects evolve. That kind of collaboration demands clarity, trust, and a shared technical language. Vancouver’s location on the Pacific coast places it in close alignment with West Coast production while remaining deeply connected to each of the other ILM studios.

Matt Leonard offers a concise summary of the studio’s approach to problem-solving. “When a client has an impossible problem to solve, they often come to us. And I’ve never heard anyone here say, ‘We can’t do that.’”

Taplin points to a recent example where that mindset became tangible. “On Percy Jackson and the Olympians, we were being asked to move fast,” he says. “That meant building things locally, including building an ILM StageCraft LED volume and virtual production team, so the creative work could keep evolving. We were able to tap into the expertise from across ILM as a whole and create something new for our team here.”

He sees that approach as both an ILM hallmark and something the Vancouver studio has fully embraced: drawing on the wider global company while remaining agile enough to respond quickly as new challenges emerge. That mindset plays out through repeated cycles of iteration. Shots evolve through multiple versions. Tools are reshaped and rewritten in response to real production demands. Chandler recently saw how that same approach shaped the work on Avatar: Fire and Ash (2025). “We developed several new tools that allowed us to work much faster and saved animators from having to do things manually.”

From the production side, Wu sees innovation supported by communication and trust. “My job is to make sure people feel supported and that when work moves between departments, communication is clear.”

Innovation at ILM is rarely about sudden breakthroughs. It is a steady accumulation: small improvements layered over time, and systems shaped by people solving real creative problems at scale.

Benning and William Wu (Credit: David Dovell & ILM).

Belonging to a Larger Story

Artists and production staff in Vancouver describe a strong sense of belonging to something bigger than any single show. They recognize both their individual contributions and their place within ILM’s wider history.

Ramani appreciates that the studio formally recognizes the work of every department. “It’s so rewarding to see ILM include the studio support teams in the credits. It reinforces the idea that no project is the result of just one department; it takes an entire community to reach the finish line.”

Leonard notes how quickly new employees begin to feel connected to that legacy. “Very quickly you start to feel like you’re part of something bigger, something that has a real legacy behind it.”

For Taplin, that sense of continuity is essential. “When you look at all the industry pioneers that are at ILM, all of these people that everyone looks up to started as juniors. They were given opportunities, allowed to try things, allowed to fail, and to build over time. It’s important that people here know they can follow that same trajectory. That this can be a place where you build a career, not just move from project to project.”

Wu became aware of the ILM way almost immediately. “Everyone I spoke to before joining said ILM was the best place to be. And once you’re here, you really understand why.”

Careers at ILM often unfold over many years, sometimes with people leaving and returning, carrying new skills back into the studio. That flow of experience continually refreshes the culture while preserving its core identity.

Wesley Chandler gestures to a familiar Star Wars character as Benning listens (Credit: David Dovell & ILM).

Looking to the Future

The Vancouver studio is shaped by its people, influenced by its environment, and grounded in a culture of shared learning and collaboration. Artists and staff describe a studio where support is real, questions are encouraged, failure is a part of reaching success, innovation grows from teamwork, and ILM’s long history remains a living part of everyday work.

Taplin sees Vancouver playing an increasingly important role in the studio’s future. “There’s so much change happening in the industry. We need to be at the front of that. The question for us is always what Vancouver can bring to the table that serves the wider studio, while also pushing something new forward.”

He is also clear about the importance of acknowledging every department. “I want to recognize all of the teams that contribute to what we do in Vancouver. People come in every day trying to make things a little bit better, to try something new, and to put ideas forward with the wider team in mind. It’s a huge lift that everyone does, and it’s what makes this a special place to be.”

The values that shaped ILM in its earliest years are clearly still present here. Today, those values are expressed through hybrid workflows, global collaboration, and evolving technology. Looking forward, they will be carried by the next generation of artists, coordinators, trainers, and managers who will shape whatever ILM becomes next.

In a city known for its natural beauty, diverse communities, and deep connection to filmmaking, ILM’s Vancouver studio continues to expand the studio’s legacy across film, television, and emerging formats. It remains a place where people can build careers, push technology forward, and contribute to stories told around the world.

ILM’s Vancouver studio is located on the traditional, ancestral, and unceded territories of the Coast Salish Peoples, including the xʷməθkwəy̓əm (Musqueam), Skwxwú7mesh (Squamish), and Səl̓ílwətaʔ/Selilwitulh (Tsleil-Waututh) Nations. We thank all First Nations who have lived and worked on these territories from time immemorial.


Jamie Benning is a filmmaker, author, and podcaster with a lifelong passion for sci-fi and fantasy cinema. He hosts The Filmumentaries Podcast, featuring twice-monthly interviews with behind-the-scenes artists. Visit Filmumentaries.com or find him on X (@jamieswb) and @filmumentaries on Threads, Instagram, Facebook, and YouTube.

ILM visual effects supervisor Vincent Papaix and Nerfstudio creator Matt Tancik discuss their innovative approach to visual effects shot design.

By Lucas O. Seastrom

At this year’s HPA (Hollywood Professional Association) Awards for Technology & Innovation, Industrial Light & Magic and collaborator Nerfstudio took home a win in the Innovation in VFX, Virtual Production & Animation category. Embracing a new kind of open source toolset allowed ILM to recreate visual effects shots for Marvel’s 2025 series Ironheart at a degree of efficiency that greatly outpaced established techniques. The key was “NeRF,” or neural radiance fields, a method that allows 3D photorealistic environments to be created from a sampling of real-world 2D photography.

(Credit: ILM & Marvel).

A Catalyst from Marvel’s Ironheart

ILM visual effects supervisor Vincent Papaix faced an interesting challenge with a handful of drone-based shots from Ironheart, wherein the series’ namesake flies over Chicago’s lakeside waterfront and river district. The fast-flying CG character had to be integrated with the live action plates shot on location. “They decided to film with the drones in a very slow way, thinking we could retime the footage,” Papaix explains to ILM.com. “Typically, you might retime at 200% or 300%, but in this case it was over 1,000%. The character’s movement needed to be very, very fast. Traffic would have to be replaced. When you’re filming at normal speed with the drone, you don’t get the sense of the micro-movement, but at high speed, you could see the high-frequency movements of the camera.”

The visual effects team needed to recreate the desired camera moves while maintaining the appropriate view of the live action background plate. Normally, they might attempt a 2D stabilization of the image, but in a case like this, the sense of depth, or parallax, made for difficulties in trying to stabilize both the foreground and the background at the same time. They considered recreating the entire world in CG, traditionally modeling, texturing, and shading every detailed aspect of the Chicago setting. But with an episodic production schedule, the necessary resources and time required would be prohibitive. 

Papaix decided to begin what he describes as a “pet project,” researching how NeRF models could be applied to visual effects work. At first there was no guarantee that his inquiries would yield results, but then he discovered Nerfstudio, an open source program that provided an end-to-end workflow for developing 3D environments from 2D photography. 

Nerfstudio creator Matt Tancik began his research in developing neural radiance fields as a PhD student at the University of California, Berkeley. “People wanted to experiment and see how much they could push this technology,” Tancik says. “It became obvious that there was a desire for this research to make it into the industry field. But there wasn’t an easy way to do it because it was kind of obtuse research code at the time. The Nerfstudio project was about trying to see how we could wrap it up into something that looked more like a product, and fully open source, so that other people could start playing with it. 

“And most notably,” Tancik adds, “people could help build upon it. A lot of the research projects that we saw coming out of NeRF acted like modules attached to NeRF to make it better along one axis or another. It made sense to try to collaborate as much as possible. The Nerfstudio project was a step towards doing that, and that’s when Vincent and ILM started playing around with it.”

(Credit: ILM & Marvel).

The Function of “NeRFs”

But how exactly do neural radiance fields help empower artists like Papaix and his colleagues to work more efficiently? As Tancik explains, it’s a process that seeks to forgo the traditional CG methods that involve the complex, often laborious craft of representing photorealistic imagery as meshes and triangles with applied textures and lighting. “All of that takes a significant amount of effort to make it photoreal, and in some cases, it’s almost impossible,” says Tancik. “That’s not for the lack of people trying to make these methods easier and easier. The goal of NeRF was to essentially see if we could use machine learning to accomplish the same thing. Instead of manually placing these triangles, can we have an algorithm construct these things from photos? So then the work becomes capturing many photos of a scene and converting them into a 3D representation.”

The result is a new approach to storing the corresponding data. Instead of triangles mapped within the CG model, NeRF uses individual points in space, each assigned a specific color. “When you look out into space, you’re shooting out into the scene and seeing what points you hit, and you’re noting which direction you’re hitting that point of space,” Tancik notes. “A single point in space, whether I’m looking at it one way or another, might look a little different. By describing the scene like this, it fits really nicely into optimization techniques that we can use to fit that to an image.”
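Tancik’s description – each point in space carrying a view-dependent color – is conventionally written as a function mapping (position, direction) to (color, density), which is then integrated along every camera ray. A minimal sketch of that compositing step, with a hard-coded stand-in scene in place of a trained network (all names illustrative):

```python
import numpy as np

def scene(points, view_dir):
    """Stand-in for a trained NeRF: returns (rgb, density) per sample.
    Here, a foggy red sphere of radius 1 at the origin. A real NeRF
    would also vary color with view_dir; this toy does not."""
    inside = np.linalg.norm(points, axis=-1) < 1.0
    density = np.where(inside, 5.0, 0.0)              # semi-opaque interior
    rgb = np.tile([1.0, 0.2, 0.2], (len(points), 1))  # constant red
    return rgb, density

def render_ray(origin, direction, near=0.0, far=4.0, n_samples=128):
    """Composite color along one ray with the standard NeRF quadrature:
    w_i = T_i * (1 - exp(-sigma_i * delta_i)), where T_i is the
    transmittance accumulated over all earlier samples."""
    t = np.linspace(near, far, n_samples)
    delta = np.append(np.diff(t), 1e10)               # spacing between samples
    points = origin + t[:, None] * direction
    rgb, sigma = scene(points, direction)
    alpha = 1.0 - np.exp(-sigma * delta)              # per-sample opacity
    trans = np.cumprod(np.append(1.0, 1.0 - alpha[:-1]))  # transmittance T_i
    weights = trans * alpha
    return (weights[:, None] * rgb).sum(axis=0)       # expected color of the ray

# A ray starting in front of the sphere and shooting through it
# composites to (approximately) the sphere's red color.
color = render_ray(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]))
```

Fitting a NeRF is then an optimization over this rendering: because every step above is differentiable, the densities and colors can be adjusted until rays rendered this way reproduce the captured photographs.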

ILM’s Practical Application

Working with former ILM research engineer Sirak Ghebremusse and former ILM pipeline technical director Kevin Rakes, Papaix oversaw the effort to adapt Nerfstudio’s functionality for visual effects. A new encoder and decoder were required to translate information between Nerfstudio and ILM’s other tools, ensuring the team could maintain precision in color and image range.

Similarly, the team needed to ground the reconstructed environments in real-world units measured in feet, so Tancik himself created a new file format to aid the transition. That also required the development of new “gizmos” – reusable groups of nodes – within the compositing software Nuke, which allowed the artists to move seamlessly back and forth between the Nerfstudio render and the final effects work.
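ILM’s actual file format and gizmos aren’t public, but the underlying unit problem is easy to picture: a NeRF reconstruction lives in arbitrary units, so camera poses must be rescaled into feet before they can round-trip with layout and animation. A toy sketch, assuming the two spaces differ only by a uniform scale about the origin (function and variable names are hypothetical):

```python
import numpy as np

def rescale_camera(cam_to_nerf, ft_per_unit):
    """Re-express a 4x4 camera pose from a NeRF reconstruction's
    arbitrary units into world units (feet). Assumes the spaces differ
    only by a uniform scale about the origin; a real alignment would
    also solve for rotation and translation from reference points."""
    cam_to_world = cam_to_nerf.copy()
    cam_to_world[:3, 3] *= ft_per_unit  # scale position; orientation unchanged
    return cam_to_world

cam = np.eye(4)
cam[:3, 3] = [0.0, 0.0, 2.0]            # camera 2 NeRF-units along +z
world_cam = rescale_camera(cam, 30.0)   # if 1 NeRF unit corresponds to 30 ft
```

Applying the inverse scale on the way back in is what lets a camera move authored in feet in Maya be rendered through the NeRF scene and returned without drifting out of alignment.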

“We can work with standard layout and animation in feet, then go into NeRF, import any camera we want, render it through Nerfstudio, and bring that camera move back with us into the Maya or Zeno file,” Papaix notes. “It was key to have that ability.”

(Credit: ILM & Marvel).

As the process evolved, ILM was able to apply these new capabilities in multiple ways. They could stitch a seamless transition between two separate camera views over the water, one captured by a drone and another from a boat, all without the need to create a new CG environment. Entire objects, such as street traffic on a bridge, could be removed. And because they were able to maintain parity between their visual effects environment and the Nerfstudio-rendered space, they could develop entirely new camera paths at the request of the filmmakers.

“We could create a new smooth camera move, basically art direct the exact move that we wanted, and then show that to the director,” Papaix says. “They were very happy. They didn’t think it was possible to change a camera move using an original plate, but we did. They said it was like magic to them. People were curious. Did we project the plate onto geometry? Did we model the whole city? No, there’s no modeling or anything.”

Now with Greater Accessibility

Papaix is keen to note that at the time ILM collaborated with Nerfstudio – in 2022 and ‘23 – these methods were still considered experimental. “Very few people were putting this kind of stuff into production. There was a lot of research taking place, but Matt showed how this could be useful, and ILM took it and made it production-ready.”

Tancik adds, “I’ve always been interested in the visual side of things, and hoped to get to that point, but didn’t know if the concept would ever actually make it there. It was not an easy thing to run. You needed a lot of computing power and GPUs. It didn’t feel like it was there yet to be useful in industry or productions. So watching Vincent and ILM put it into practice was really fun.”

Today, the use of neural radiance fields, as well as a related outgrowth known as Gaussian splatting, continues to rise alongside gains in computing power. “This was a science paper a few years ago, and now it’s making its way into all of the software that we use,” Papaix says.

“With the move to Gaussian splat, if I had to do those shots today, I could probably do it from start to finish in only a few days, compared to the months that it took before,” Papaix concludes. “At the time, it took about six months because it was more of a research project off and on, a side project. Now that we understand the tech, we can optimize things, and we can do things much faster. The tech improves so fast. We’re still in the early days of learning how these techniques will be applied.”

Watch the full demonstration reel:

Click here to read more about Nerfstudio.

See the full list of winners from the HPA Awards for Innovation & Technology.

Lucas O. Seastrom is the editor of ILM.com and Skysound.com, as well as a contributing writer and historian for Lucasfilm.

Visual effects artists from across Industrial Light & Magic are recognized for their work.

Members of the visual effects team after receiving their Emmy Award.

At the 4th Annual Children’s & Family Emmy Awards on March 2, 2026, Industrial Light & Magic’s team from Star Wars: Skeleton Crew won Outstanding Visual Effects for a Live Action Program. Additionally, ILM senior animator James Saunders was recognized for Outstanding Individual Achievement in Animation for Ultraman: Rising.

The recipients for Skeleton Crew included production visual effects supervisor John Knoll, production visual effects producers Abbigail Keller and Pablo Molles, animation supervisor Shawn Kelly, visual effects producer Nicole Matteson, virtual production supervisor Christopher Balog, and visual effects supervisors Jeff Capogreco, Bobo Skipper, Andy Walker, Joseph Kasparian, and Eddie Pasquarello.

ILM’s team joined fellow Skeleton Crew recipients from across Lucasfilm, including wins for Outstanding Young Teen Series, Outstanding Editing for a Young Teen Live Action Program, and Outstanding Sound Mixing and Sound Editing for a Live Action Program.

Congratulations to all of our Emmy winners! Read the full list here.

Senior animator James Saunders wins for Ultraman: Rising.

Read more about Star Wars: Skeleton Crew and Ultraman: Rising here on ILM.com:

‘Star Wars: Skeleton Crew’: ILM’s Visual Effects Treasure Chest, From At Attin to Starport Borgo

‘Star Wars: Skeleton Crew’: ILM’s Visual Effects Adventure from the Observatory Moon to Lanupa and back to At Attin

Real-Time Visual Effects: Behind-the-Scenes of ILM’s Cutting-Edge Contributions to ‘Star Wars: Skeleton Crew’

Netflix’s ‘Ultraman: Rising’: Building new worlds for the ultimate icon

ILM Evolutions: Animation, ‘Ultraman: Rising’ and ‘Transformers One’

Visual effects supervisors Charmaine Chan, Andrew Roberts, and Simone Coco share their experiences working together on the Oscar-nominated Jurassic World Rebirth.

By Amy Richau

Bringing dinosaurs to the screen for Jurassic World Rebirth (2025) required a true team effort with multiple ILM visual effects supervisors collaborating in teams around the world. While some films only require one visual effects supervisor to see the production through from start to finish, other films are just – bigger. Backing up production visual effects supervisor David Vickery, who recently talked about his work on Rebirth with ILM, were multiple visual effects supervisors from ILM, as well as others from partner studios like Midas VFX and ILP.

Charmaine Chan, Andrew Roberts, and Simone Coco talked with ILM.com about wrangling a herd of dinosaurs (both familiar and new to audiences), coordinating their individual teams’ work into one cohesive film, and the pressure of working on such a legendary franchise.

(Credit: ILM & Universal).

Getting the call

Working on an installment in the Jurassic film series was a full-circle moment for Chan, Roberts, and Coco, who all pointed to the original Jurassic Park (1993) as a moment that kick-started their careers in visual effects.

In the 1990s, while still working in the games industry, Roberts attended a talk from Jurassic Park’s CG supervisor Stefen Fangmeier in London. Hearing Fangmeier break down the work on Jurassic helped Roberts make the connection from his current work to a potential future in visual effects. “Seeing the same techniques of modeling, animation, and compositing that we were using in the games industry was the initial spark,” Roberts tells ILM.com. “That was an inflection point for me, where I started to pursue working in TV and film. The movie, as well as understanding the work that went into it, completely changed my life and my career and was the reason that I started to pursue computer graphics.”

The scene that stood out to Coco the most from the original film involved the iconic Tyrannosaurus rex. “It was so real and scary,” he notes. “I remember the T. rex screaming in the rain and shaking the glass and everything in the car.” The realism of that scene inspired Coco to better understand how the visual effects were created in other scenes in the film, eventually leading to his work at ILM, starting with projects like Napoleon (2023) and Mission: Impossible Dead Reckoning Part One (2023).

Chan was growing up in Hawaii, close to where Jurassic Park was filmed, when it was released. “I remember thinking, ‘Oh my god, dinosaurs could be there.’ I was a kid, and it just felt so real to me.” After joining ILM, Chan worked on Star Wars: The Last Jedi (2017), The Mandalorian (2019-23), and The Creator (2023) before joining the Rebirth crew, where she attended a special ILM screening of Jurassic Park. “It still stands up,” notes Chan, “that sense of awe and amazement and seeing the dinosaurs for the first time. And for me, it’s about wanting to recreate that feeling.”

While Rebirth was Roberts’s first Jurassic project, he had recently worked closely with director Gareth Edwards on The Creator. But even with that experience, Rebirth provided a “pinch-me” moment for him. “It was a little daunting, just seeing the quality of work and the deep history that ILM has with this franchise,” remembers Roberts. “So it was daunting, but very exciting. And I was definitely up for the challenge.”

The original Jurassic Park from 1993 (Credit: ILM & Universal).

Supervisor 101

The role of a visual effects supervisor can vary from film to film. Chan describes the role as that of both a mediator and translator, as well as the person to whom crew members come with questions. “You see the big picture of everything and have such a huge overview of what’s going on that you can basically connect the dots that are needed for each department and each person within your team,” says Chan.

Coco points to being on set as an important part of the journey to reaching this role. “You start to see how the set works and how things develop from script to bidding to how we’re going to shoot this once getting on set.”

“In some ways, we’re here to facilitate the visual direction,” notes Chan. “Whether that be from the director or from our production visual effects supervisor, we make sure everyone is on the same page of what that visual need is. A lot of it is just working with people on a daily basis, reviewing their work and seeing that everyone’s moving in the same direction.”

The large number of visual effects shots in Rebirth (over 1,200) required splitting up the work throughout production and postproduction. Pulling off that many shots required constant communication between multiple departments and the visual effects supervisors, the latter of whom kept their focus on being creative problem solvers.

(Credit: ILM & Universal).

Designing the Dinosaurs

Chan was the first of the supervisors to join Rebirth in April of 2024, after dinosaur development at ILM had already begun. Figuring out how the dinosaurs would look and move on screen was a challenge they embraced through to the very last shot of the film. “We were constantly trying to make them the scariest, coolest, most fun dinosaurs we could,” says Chan. “We wanted something different from the previous worlds that we’d seen, something that honored some of the original Jurassic Park dinosaurs. But also, Gareth gave his own twist and turn to the design of them.”

Roberts, who joined Rebirth’s team last September, notes the jump between seeing skeletons of a dinosaur in a museum to thinking about how the creature’s joints would move in different environments. Before joining the film, he rewatched previous Jurassic films to get “familiar with the quality of work in all of them, how some of the creatures moved, and conveying the sense of weight for some of the bigger creatures.”

Gareth Edwards was heavily involved throughout the process of deciding how the dinosaurs would look in the film. “I think at one point we had a two-hour live session with Gareth trying to figure out what the Mutadon was going to look like,” remembers Chan, where one of the team’s modelers would try putting different pieces of real dinosaurs onto a Mutadon sculpture to piece it together. “I think that was vital to the process of making sure that our dinosaurs, from their basic stance, without even being in a shot, could stand by themselves and look cool. Once they were at the state that both David and Gareth were happy with, we would place them into a shot.”

Finding real-world animal references for each dinosaur was a key part of making the movements of dinosaurs in Rebirth appear believable and anchored in reality. To create Dolores, the small Aquilops dinosaur that Isabella Delgado (Audrina Miranda) adopts as her pet, an ILM team, led by animation supervisor Delio Tramontozzi, used videos of themselves interacting with their own pet dogs and cats. “They would have multiple takes of the way their pets were responding to a laser light or picking them up in a way that allowed them to snuggle into the crook of an arm or drape over a shoulder,” says Roberts. The reference videos were submitted with animation of Dolores or other dinosaurs so Roberts and other team members could see how those real-life moments translated to animated shots in Rebirth.

As Vickery was usually the only effects supervisor on set, he made sure to communicate what he and Edwards were looking for in terms of dinosaur movement and behavior in different scenes. For the scenes in the tunnels when the Mutadon dinosaur pursued several characters from the film, Vickery took on the role of a dinosaur squeezing into the tunnel and picking itself up after landing on the floor. “There’s a moment where it plants its hands on the floor, leans forward with real weight, and roars before charging,” remembers Roberts. “And for a lot of that, David or [animation supervisor] Steve [Aplin] would act out to really convey the emotion they wanted. I think we really benefited from that. We’re all very comfortable with each other and locked in and just really enthusiastic about getting that character into the creatures.”

For another scene near the beginning of the film where a hybrid dinosaur almost caresses a lab worker with its claw before killing him, an animator was filmed holding a water bottle, looking at it, sniffing it, giving it a quick touch, and then snatching it. Notes Roberts, “that was a wonderful, fun performance from our animators, where they were able to get a bit more emotion into the scene from their own performance, which then was applied to some of the hybrid creatures.”

(Credit: ILM & Universal).

Dividing and Conquering

Different ILM supervisors took lead roles for each major sequence in the film. Chan’s team took on many of the water-heavy sequences featuring the Mosasaurus and the Spinosaurus, and also developed the Distortus Rex. Coco worked on the Mutadon sequences in the market and the tunnels as well as the T. rex chase sequence on the river, while Roberts tackled the beginning and ending of the film, as well as the cliff sequence featuring the Quetzalcoatlus.

Coco noted that splitting up the work into sections was helpful to their teams, so animators or compositors could go to one supervisor to ask a question, instead of having to approach multiple people to get the information they needed. Daily communication between supervisors and their teams of artists was also key throughout the production, as the team involved hundreds of people working in London, San Francisco, Vancouver, and Mumbai.

“It was very important for us all to hear what Gareth’s feedback was,” says Chan. “Because some feedback given on one dinosaur would also apply to another dinosaur in another sequence. And even though we were different teams, it was vital for us to still be sharing information about how we approach winged creatures or creatures in water — there were a lot of tips and tricks that we shared with one another.”

A library of shared assets documenting the workflow, along with an internal website, allowed everyone to understand what visual effects setups were established and ready to use and what they would need to create from scratch. This was especially helpful to Roberts and Coco, who joined the production after Chan. “A big part is sharing the tools up front to be on the same page about how we’re going to tackle things,” notes Roberts. “And then we have a number of chat groups for supervisors, as well as weekly meetings for each sequence and discipline.” Coco adds, “It was good to see what Gareth was looking for in a shot, or what was important for him in a particular environment, so I could follow that line.”

In one case, Roberts referenced the texture and amount of light in the sky in a night sequence at a gas station that the ILM team in London had worked on. That helped him to prepare a night scene his team had coming up. “We inherited that established look as a mood board of London’s work, allowing our team to match it seamlessly from the start,” notes Roberts, “so that when our team came on, we could say ‘we’re matching that.’ This is something that Gareth has already established. He likes this language for night, so we didn’t have to rediscover or explore that too much. So, without ego, just sharing and following, taking London’s lead where they were ahead, and then we also presented some of our work when we were ahead, or when it was on us to sort of establish a look. Very open communication made it a success and made it feel like it’s one team doing all the work together.”

Chat groups would also give supervisors an easy way to ask each other questions about how they might solve similar problems, especially in sequences where there was a bit of overlap between supervisors. To help with the time difference between London and San Francisco, Roberts and his team started their day early to increase the overlap when both teams were actively working.

Another vital piece of the ILM crew on Rebirth was the production team – visual effects producers and production managers – who made sure supervisor teams were properly staffed, flagged important deadlines, and blocked off time for teams who needed to develop a new technique or tool.

(Credit: ILM & Universal).

Putting it All Together

The challenges Jurassic World Rebirth presented for its visual effects supervisors were varied, ranging from dinosaurs interacting with simulated water to designing environments from multiple elements to satisfying a director well-versed in visual effects.

Coco’s team tackled the effects-heavy, intense action sequence where the Delgado family is chased by a just-awakened T. rex. While the river in the film is on a tropical island near the equator, these scenes were filmed at a British Olympic river course. “The T. rex interacting with the water, the digitally simulated water, and the family. It was a big, big moment,” notes Coco. “I don’t think a couple of years ago we would have been able to do it because of the turnaround time needed. We had amazing effects artists who turned around the simulated water effects in record time.”

The Quetzalcoatlus sequence, when Zora Bennett (Scarlett Johansson) and other members of her team climb down a cliff to retrieve a sample from an egg, had its own unique challenges – and not all dinosaur-related. The cliff and cave environment was put together from a mix of elements, including footage shot at the cave set at Shepperton Studios in England, footage shot at Jog Falls in India, and millions of gallons of digitally simulated water. Mixing footage shot on location, wider shots that were fully CG, and digital extensions on top of drone work became a bit of a puzzle for Roberts’s team to make into one coherent environment. Another important part of this process was getting the right balance, wherein the background isn’t pulling too much focus from the actors. “Even though it’s multiple elements and different sections, you want to create a continuous environment where the audience truly believes the actors are immersed in that backdrop.”

Other shots not involving dinosaurs also occasionally proved tricky to get Edwards’s sign-off on, in part because of his knowledge and appreciation of visual effects.

“Gareth has such a particular eye for blue screens that he can tell when a shot is a blue screen shot,” says Chan, “and for him, it’s successful when he can’t tell it’s a blue screen shot. So we are constantly trying to blend in, change lighting, include more atmospheric lens details, just so many little details that most people, when you think of just green screen or blue screen shots, wouldn’t even consider. Because Gareth wanted to make sure it never felt like a blue screen shot.”

Landing on the right scale for the dinosaurs was also an ongoing process for the visual effects supervisors and Edwards. “We’ve created these dinosaurs at a certain height and size,” notes Chan. “We put them in the shots the way they should naturally be at that size and height. And Gareth would look at some shots and say, ‘No, it doesn’t feel big enough.’ So we played this constant game of make it bigger, make it bigger, okay, that’s too big.

“One thing that Gareth just absolutely excelled at is scale and suspense,” Chan continues. “He knows how to compose every shot and frame to give you that sense. So to him, it’s less about the continuity and making sure things physically and scientifically look correct. It’s more about what makes the audience sit and look at something and feel that suspense. And so we worked with our animation team through many, many iterations of trying to figure out compositionally, what is the scale that works best for these shots?”

After months of hard work from teams across the world, the final product came together for the film’s release in July of 2025, giving both audiences and Rebirth’s crew an adventure to remember. “I think, every person who worked on the movie, and everyone that I talked to, they always said it’s been a dream to work on it, because it is such an iconic movie,” says Coco. “And in many cases, they started in visual effects because of Jurassic, so they don’t do it just because of the work, but because they love it. And working on such a big and iconic movie, they put their heart into it.”

Director Gareth Edwards on location (Credit: Universal).


Amy Richau is a freelance writer and editor with a background in film preservation. She’s the author of several pop culture reference books including Star Wars Timelines, LEGO Marvel Visual Dictionary, and Star Wars: The Phantom Menace: A Visual Archive. She is also the founder of the 365 Star Wars Women Project, which includes over 90 interviews with women who have worked on Star Wars productions. Find her on Bluesky or Instagram.

Teams from Andor, Sinners, and Avatar: Fire and Ash were recognized at the ceremony in Los Angeles.

The 24th annual awards ceremony for the Visual Effects Society was held on February 25 at the Beverly Hilton in Southern California, where teams from Industrial Light & Magic earned three wins.

Avatar: Fire and Ash won Outstanding Environment in a Photoreal Feature for the Bridgehead Industrial City. ILM’s winners included Gianluca Pizzaia, Steve Bevins, Dziga Kaiser, and Zsolt Máté.

(Credit: ILM & 20th Century Studios).

ILM’s John O’Connell, Falk Boje, Hasan Ilhan, and Kevin George won for Outstanding Environment in an Episodic, Commercial, Game Cinematic, or Real-time Project for the Senate District in the episode “Welcome to the Rebellion” from Andor Season 2.

(Credit: ILM & Lucasfilm).

The feature film Sinners was recognized for Outstanding Supporting Visual Effects in a Photoreal Feature, and ILM’s Nick Marshall joined fellow winners Michael Ralla, James Alexander, Espen Nordahl, and Donnie Dean.

(Credit: ILM & Warner Bros.)

Congratulations to all of our ILM VES Awards winners, as well as to our Lucasfilm colleagues, who also took home the win for Outstanding Special (Practical) Effects in a Photoreal Project for their work on Andor.

Read the full list of winners.

Read more about Andor and Sinners here on ILM.com:

Snakes, Trains, and Automobiles: ILM’s Nick Marshall Sheds Light on the Visual Effects of ‘Sinners’

“Like Eating an Elephant One Bite at a Time”: TJ Falls and Mohen Leo on the Visual Effects of ‘Andor’ Season 2

“Let the Experts Be the Experts”: TJ Falls and Mohen Leo on the Visual Effects of ‘Andor’ Season 2

Assembling a Starfighter: Exploring ILM’s Role in Creating the TIE Avenger from ‘Andor’



ILM artists Ian Milham and Shannon Thomas take us behind the scenes in the second of a two-part story about the 2024 Summer Olympics in Paris and the 2026 Winter Olympics in Milan.

By Lucas O. Seastrom

Read Part 1 of this story here on ILM.com.

After ILM successfully created a landmark mobile deployment of its StageCraft virtual production system for the 2024 Paris Olympics coverage, it was only natural to up the ante for the next round. As the studio’s partnership with NBC continued, the 2026 Milan Cortina Winter Olympics in Italy presented a batch of creative challenges that introduced a new level of dynamic presentation to ILM’s imagery. This included everything from shooting with a real ice rink in the foreground to simulating continuous motion on the volume’s LED wall.

“It was like the band got back together with this one,” notes virtual art department (VAD) supervisor Shannon Thomas. “It was the same crew on the client side. This time the process was less educational in terms of how we interacted with the client or explained how best to use the LED volume. They now understood how it would work, so this was about executing the same kind of creative process but with all new sets. Right from the start, we were able to get down to the specifics of what they wanted to achieve.”

Milan Cortina: The Brief

As with Paris, ILM’s task was to create a series of LED volume loads that acted as backdrops for athletes from the United States Olympic team. Footage of the athletes striking poses and performing actions in each setting would be utilized for any number of promotional needs before and during the Games. NBC brought six ideas for locations to the ILM team, and together they brought new layers of complexity to the job of realizing each of them.

A majority of the settings featured natural, snowy landscapes of the Italian Dolomites, including a mountaintop, a frozen pond in a small valley, a ski lodge that looks out across an alpine vista, and a “moving” chairlift up a mountainside. Additionally, two Milan locations incorporated local landmarks, the Piazza del Duomo with its namesake cathedral and lustrous Galleria, and the interior of Teatro alla Scala, Milan’s 18th-century opera house, which included an ice rink on the stage.

Initial challenges included the need to accommodate a massive scale of mountain landscapes within the visible scope of an LED volume wall. There was also the active motion of the athletes, a key difference from Paris. Ice skaters and hockey players would actually be moving across the small ice rinks built as practical sets in front of the volume wall. Not only did ILM’s technology need to coexist alongside the ice, but the camera needed to track effectively with the talent’s movements while also maintaining the appropriate sense of depth with the background.

As with Paris, the stylistic brief was to create a sense of hyperrealism, with accentuated lighting and staging. But the natural settings required a new level of realism to feel believable, compared to the Paris sets which were more idealistic. ILM artist Giles Hancock joined the NBC team on a location visit to Italy in early 2025. “Giles captured a ton of photo and video reference plus LIDAR scans of the actual locations,” says ILM virtual production supervisor Ian Milham. “We could then use that as the basis for the content that we made for the wall. That involved someone staging a camera tripod in different spaces, and even riding a chairlift, as well as some data from a helicopter flight.”

Setting the Scene

Both ILM’s past experience and its established relationship with NBC empowered the team’s ability to begin planning from a very early stage. “We did quite a bit of animated previsualization,” says Milham. “The athletes were actually performing their sport, and there were concerns about space and safety, of course, but we also needed to make sure that we could follow and track them in a way that did justice to their real movements.”

Understanding the parameters for the shoot, what Milham calls “the edges of what’s possible,” ensures that the ILM team can deliver everything that could potentially be needed on the day. The client is likewise empowered to envision whatever they can within those set parameters, as well as plan the practical foreground elements. And, of course, there are times when both the client and ILM work to push the established boundaries and see how much more they can do.

For the digital backgrounds, individual settings required their own distinct finessing, in particular the Piazza del Duomo. “Lead hard surface modeler Masa Narita modeled all of the buildings for that entire city square from Giles’ scans, and texture artist Maria Cifuentes textured that entire set as it looked in real life in Milan,” Thomas explains, “and I then lit the real time set based on our digital sky select, and incorporated where the practical ice rink would have to live inside the volume. I then had the new challenge of determining where you would have to physically be, if you were in front of a camera in the real Piazza square, in order to also see the top of the Duomo’s tallest spire. We quickly realized that even with a wide focal length we needed to pivot in order to ensure we could see the whole building.”

Because the Duomo is over 600 years old and such an important national landmark, it was necessary to make sure that the building’s recognizable shape could be seen in its entirety. This resulted in a slight increase in the size of the LED volume from what was used for Paris. ILM’s R&D engineers and technical teams designed a virtual LED volume tool which allowed Thomas and his team to instantly add more rows or columns of panels, all adjustable in real time to ensure that the physical LED volume built on the day would capture the full height and beauty of the Duomo.
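The framing problem Thomas describes comes down to basic lens geometry: the camera's vertical field of view, set by focal length and sensor size, determines how much wall (and how much spire) fits in frame at a given distance. The sketch below works through that arithmetic; the panel height, sensor size, and distances are illustrative assumptions, not ILM's actual configuration or tooling.

```python
import math

# All numbers here are illustrative assumptions, not ILM's actual panel
# size, camera package, or stage dimensions.
PANEL_HEIGHT_M = 0.5  # height of a single LED panel, in meters

def vertical_fov_deg(focal_mm, sensor_h_mm=24.0):
    """Vertical field of view for a given focal length and sensor height."""
    return math.degrees(2.0 * math.atan(sensor_h_mm / (2.0 * focal_mm)))

def panel_rows_needed(focal_mm, wall_distance_m, sensor_h_mm=24.0):
    """Rows of panels required for the wall to fill the camera's vertical
    field of view at a given distance (camera aimed at the wall's center)."""
    half_fov = math.radians(vertical_fov_deg(focal_mm, sensor_h_mm)) / 2.0
    visible_height = 2.0 * wall_distance_m * math.tan(half_fov)
    # Small epsilon guards against floating-point round-up at exact fits.
    return math.ceil(visible_height / PANEL_HEIGHT_M - 1e-9)
```

A wider lens or a greater camera-to-wall distance both increase the visible height, which is the trade-off a virtual volume tool lets a team explore before any physical panels are hung.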

(Credit: ILM & NBC Sports).

For the opera house interior, Hancock’s photogrammetry data provided a useful foundation. Narita and Cifuentes again created a photoreal CG asset for the space, which Thomas and lead virtual production technical director Rey Reynolds then staged in appropriately gold-tinged warm light, including the illustrious digital chandelier. On the set, practical red curtains, some 30 feet tall, augmented the background. “They could open them like real stage curtains in camera to reveal our LED content,” Thomas explains, “so that it felt like you were on a real stage, and behind it was the virtual opera house, an ice skater on real ice, fake snow, and real movie lighting. It was so cool. It looked like magic. It’s a great example of using the tool exactly how it should be used, not forcing it.”

The ice itself was roughly 20 feet long by some 40 feet wide. A refrigeration unit was specially designed into the space. A fair amount of research and planning was involved in determining the necessary space required for the athletes to get up to speed and perform. “Hollywood is crazy in terms of what they can do onsite,” says Milham. “You have to keep the room cold, of course. It’s all internally cooled like a hockey arena. It’s logistically difficult in terms of how things are built. You have to install the wall first, then build the rink, then leave the rink for a couple days so it freezes.” Practical snow was also used on multiple sets, which, as Milham notes, “adds the magic of it falling onto people or the sense of depth as they travel through it.”

The most elaborate practical foreground set was for the patio deck at the ski lodge, which included an actual fire pit, furniture, string lights, a surrounding fenceline, and even a small tree that needed to match with ILM’s CG counterparts. The effects team initially decided to create the background as a standard 2D matte element, but when the client planned more dynamic camera movements for the scene, ILM pivoted to an elaborate 3D environment. Senior VAD artist Nate Prop led its creation, which allowed the team to make specific changes to the mountain view as needed. And Thomas notes, “[VAD supervisor] Christy Page even added little cars driving in the town, so you could actually see little headlights moving out there. The lights flicker in the town as well.

“There’s enough of the practical elements to tie you into the background,” Thomas continues. “The way they shot it works really well with the real fire pit and cabin structure, fence, trees, etc. Though if you move the camera just a couple feet to the left or right, you see all of the structural wood boards of the practical set build, like it’s a high school play. But in-camera, the shot works, it looks like they’re really there.” Actor Scarlett Johansson and Olympian Lindsay Vonn would be among those to shoot on the lodge set.

(Credit: ILM & NBC Sports).

Adjusting the Frame Rate

A significant technical change for the Winter Olympics production was to capture footage at 48 frames per second (fps). The higher frame rate would allow the client to retime the imagery to varying speeds as needed, particularly with athletes zipping about on the ice.

“The difficulty with that is when shooting with a StageCraft LED screen, you need very exact synchronization between your camera, the content, and the wall,” Milham explains. “So with a higher frame rate, that’s orders of magnitude harder because you have to sync to 1/48th of a second instead of 1/24th of a second. Your content needs to do all of its transformation in half the time and everything else with it.”

Without the proper synchronization, a number of issues can arise, including a flicker effect visible on the LED volume, as well as on the foreground subjects and elements because the wall acts as both a background and a light source. On the previous Summer Olympics production, the cameras ran at 48fps but the wall content projected at 24fps, which meant there were limitations to how the final footage could be adjusted after the fact. At NBC’s request, Milan Cortina became ILM’s first all-48fps volume shoot.

ILM imaging supervisor J Schulte and principal engineer and architect Nick Rasmussen coordinated weeks of rigorous testing for the chosen Alexa 35 camera system. The results provided much greater flexibility, especially with the demands of the Winter Olympics settings in mind. “It’s important to have if you’re really flying the camera around, like a push-in or a big crane move,” says Milham, “and we did some gigantic crane moves on this project.”

As they had with the Paris shoot, ILM ran three separate renderers to allow for quick changes between scenes. But in this case, two projected to the LED wall at 48fps, while the third projected at 24fps for set-ups that involved audio capture. Interviews and related scenes with dialogue would not require dramatic adjustments in speed.

(Credit: ILM & NBC Sports).

Just Like the Old Days

Perhaps the most distinctive of the volume loads for the Winter Olympics involved a chairlift that appeared to be moving through the Dolomites. The scene was initially planned as a standard interview set-up, with three locked-off cameras positioned to cover two or three subjects in the chairlift. This would require a moving background on the LED wall that ran for an extended time.

“NBC didn’t want to cut, because they might intercut the footage with other material, and then cut back to it,” Thomas recalls. “They wanted to just roll and let the people talk and get comfortable. They might only use a piece of it three or four minutes in. That means you need to run a lot of footage, and it has to loop at about eight-minute intervals. When you think of a Star Wars movie or something like that, you might have around 2,000 shots that a team of hundreds executes over months or years. Most of those shots are a few seconds to maybe 20 seconds long. We had to render eight minutes of straight looping footage in a fraction of the time with one or two artists. It was a ton of work and data to maintain.”

Nate Propp again took the lead, visualizing a system that placed two opposite rows of mountain landscape moving in parallel on either side of the talent, not unlike a conveyor belt. “It didn’t matter if the mountains weren’t the exact layout of the Dolomites. We used all of Giles’ scans to piece together a long track of digital mountains that felt like the Dolomites versus having this specific mountain peak perfectly line up to that one. Our earlier Paris Olympics set work aided us here, so as long as we captured the essence of the Dolomites, it worked,” Thomas notes. “The conversation with the talent was the real focus. So Nate laid it out and duplicated a mountain landscape like railroad tracks that could loop, basically to infinity. We could then place the ski lift in with the camera and ride this imaginary track for as long as they wanted.”

It required a great deal of experimenting to then determine the best means to render such a lengthy scene at 4K without overloading the machines. Additionally, there were concerns about maintaining consistent lighting. “What can break that set-up is not necessarily the background, but rather the feeling that the people aren’t actually moving through it,” Milham says, “and that’s usually because the lighting on them isn’t changing enough.” Coordination with the on-set, practical lighting team allowed them to find the appropriate balance. The volume itself can be equipped to synchronize directly with a traditional lighting grid.

The chairlift set-up became even more elaborate when it was decided to actually move the formerly stationary cameras during the scene. This required ILM to create additional pieces of the landscape on the fly to fill in previously unseen gaps. “It’s all our system and our artists,” Milham notes, “so we’re able to do that at the last minute.” That included associate virtual production supervisor Brad Watkins, who partnered with Milham in directing the team on the stage.

The final results, as Thomas points out with a smile, “are the same as Alfred Hitchcock’s background of Mount Rushmore in North by Northwest. It’s the same magic trick they pulled off in 1959. But now you can move the camera and get parallax.”

The Art of Collaboration

During pre-production, the ILM crew demonstrated their various plans for the set-ups to NBC. They showed the chairlift scene last. “We had the initial presentation that showed what you’d actually see from each camera,” Thomas explains, “and then we showed what the footage was actually doing on the site, this incline into infinity. Ian is pitching it and explaining what can be accomplished. And one of the folks from NBC turns to his colleagues and says, ‘Guys, this is it! This is going to work!’ We were so happy that they liked it, in particular, because we knew how important this specific set was for their vision.

“I’ve learned through the years that, with any client, you want to really listen to what is important to them, and then hit that specific note to reinforce that you are a team working together to achieve a shared goal. That is what builds confidence with your client.”

Throughout these Olympics collaborations, the key for ILM has been an equal mix of flexibility and adaptability. The continuous, energetic shooting style further demonstrated the versatility of ILM StageCraft, and likewise ILM’s ability to meet the needs of any client.

“There are ‘unlocks’ here in terms of what is possible with last-minute scenarios, or in-demand talent, in terms of pulling off an ambition that otherwise would not be possible,” Milham concludes. “It doesn’t have to be Star Wars. You can use this technology to make sets appear very, very fast, and to take advantage of a small window of time with talent, all without limitations, and we can do it anywhere.”

Lucas O. Seastrom is the editor of ILM.com and Skysound.com, as well as a contributing writer and historian for Lucasfilm.

ILM artists Ian Milham and Shannon Thomas take us behind the scenes of a breakthrough virtual production shoot in the first of a two-part story about the 2024 Summer Olympics and 2026 Winter Olympics.

By Lucas O. Seastrom

American viewers of the Milan Cortina Winter Olympics on NBC and Peacock have experienced a number of striking visual effects created by Industrial Light & Magic, whether they realize it or not.

Dynamic, promotional footage of American athletes in snowy Italian landscapes and on the ground in Milan was in fact all shot on an ILM StageCraft virtual production volume. The achievement is part of a continuing story of ILM’s work to broaden the applicability of its virtual production toolset. These latest Olympic Games are the second to be showcased in such a way. ILM also partnered with NBC for the earlier 2024 Summer Olympics in Paris, an undertaking that won multiple Emmy Awards.

(Credit: ILM & NBC Sports).

Paris: The Brief

The story of ILM and NBC’s collaboration for the Olympics actually begins in 2021 with a distinctly American sport: football. “As people were hearing about our virtual production work on The Mandalorian, we talked with lots of different groups and did some work with them, including with NBC Sports for Sunday Night Football,” recalls ILM virtual production supervisor Ian Milham.

With the need to capture singer Carrie Underwood performing in multiple environments and in quick succession, ILM deployed a version of its StageCraft volume, which provided greater flexibility than a traditional blue screen. It proved a meaningful exercise in developing a different kind of story for a client with different needs than a feature filmmaker. “The following year, NBC was exploring ways to level up their work,” Milham explains, “and they reached out and asked if these tools could be put to further use.”

What NBC proposed for the Summer Olympics in Paris was far more ambitious than the Sunday Night Football production. Dozens of athletes – from swimmers to gymnasts to javelin throwers – would be captured in multiple Paris locales at twilight: a street, a riverside, a rooftop, a fashion show-esque runway at the foot of the Eiffel Tower, and a virtual trip down the Seine River during the event’s opening ceremonies. The resulting footage would be adapted into short-form clips used for promotional spots before and during the Games.

“NBC’s goal was to get a lot of footage in different contexts of all these athletes looking amazing in a world that is aesthetically heightened,” says Milham. “Along with that, the DP/director [Scott Duncan] wanted to continuously run the camera in order to keep things improvisational with the athletes. You have to shoot all the time and capture lots of different things. It’s not like a feature where you’re going to board and previs everything in advance. That was our biggest challenge to deliver on. The director wanted no rules in terms of flexibility with shooting and NBC wanted a large amount of usable footage.”

(Credit: ILM & NBC Sports).

A Different Kind of Volume

The Summer Olympics production was the debut of a new variation of ILM StageCraft. “We had invented this really cool thing that people wanted to use, but Star Wars was always using it,” says Milham. “So we needed to make another version that wasn’t limited to one place. It would be a huge advantage to bring StageCraft to the client. So we created a mobile system, which was deployed for the first time with the Paris Olympics.”

The volume itself can be adapted to any size, its “tiles” – LED panels – adjusted to the needed shape of a given set. Created by ILM’s virtual production team based in San Francisco, the entire infrastructure is built to move, “like a set-up for a rock concert,” as Milham puts it. For the Paris Olympics, a roughly 180-degree curved wall was constructed to a height of approximately 25 feet. This specific production involved the extensive use of foreground set pieces that needed to blend seamlessly with the virtual background, as did the elaborate practical lighting set-ups.

“StageCraft isn’t just one thing,” Milham adds. “It’s a lot of different tools and techniques. Sometimes we use a little of it or a lot of it, whatever is needed.”

(Credit: ILM & NBC Sports).

Putting the City of Light on the Screen

Work began some nine months ahead of the actual shoot, in November of 2023, led by virtual art department (VAD) supervisor Shannon Thomas and a team of artists responsible for creating the settings visible on the volume’s tiles. A four-year ILM veteran with 20 years across the industry, Thomas brings experience from a number of different effects houses, including Rhythm & Hues and Weta FX.

He “came here for Star Wars,” as he puts it, reflecting on recent projects he contributed to like Star Wars: Skeleton Crew (2024-25) and The Mandalorian and Grogu (2026). “I came here to work on the volume and be involved in real-time virtual production, future-tech projects, and to get back into film work.” But Thomas admits with a smile that he is also a big Olympics fan and was happily surprised to join the team for Paris, it being his favorite city to boot.

NBC’s brief for the City of Light was different from a usual feature film in that realistic accuracy was not essential. The ILM artists would not be required to match the layout or appearance of a specific location in Paris, but rather capture “the idea of Paris,” as Thomas notes. “That goes all the way down to what kind of chair we need to have in front of a café. If it feels like Paris, then we have it. It allowed us to work faster as well.”

“We’re not going with photo-realism,” Milham adds. “It’s in a style that’s more like a glamorous photo shoot, a larger than life situation.”

The team spent considerable time determining the best digital skies, ultimately landing on the right blend of pinks and blues during magic hour for each set. For the street setting, former senior VAD artist David Flamburis took ownership and, rather than evoke a specific neighborhood, created a fictitious location full of Parisian charm, with fantastical views of the Eiffel Tower. The iconic structure itself was also a subject of considerable study, in particular how best to light it. Existing Eiffel Tower assets from earlier ILM productions were useful for reference. Along the Seine River, ILM changed the water’s width and depth as their artistic needs demanded.

Initially developed with commercial real-time software, the environments were then ported to ILM’s proprietary Helios renderer for volume projection. New advancements allowed for enhanced rippling, refraction, and reflection in the river water, which was sometimes augmented on the live action set by practical techniques, including a small tub of water with shards of glass. It was all in service of what the team dubbed “hyper-realism.”

According to Thomas, the opening ceremony load was probably the most challenging to create. “Everyone knew this would involve boats for each country’s team going down the Seine, which was a very cool idea. The big challenge was the crowds, which is always a tricky thing in real time. We had to figure out the logistics of how many boats, how many people, and those types of things. We had tight resources throughout the project so we had to work very closely together to determine how things needed to be depicted.

“Senior VAD artist Nate Propp came up with a very clever solution here that allowed us to color coordinate the crowds in sections, as if they were fans of each country peppered around the set,” Thomas continues. “The digital crowd also had controls for how much they would cheer, including waving flags, holding signs, etc. Given the distance we’d see them from camera, we knew the trick would work.”

ILM created a roughly half-mile stretch of river that was necessarily fictitious in layout. To determine the best speed, Thomas actually contacted a Parisian Bateaux Mouches boat tour company to gather research. “I told them that my parents were planning this trip to Paris, and they wanted to go for a ride on the Seine, but they get really seasick,” he notes with a laugh. “How fast do they go? Is there a lot of motion? And the company wrote me back! About nine to 12 knots was the average speed. Then we could design the movement of the boats that way and it worked really well, as it’s always best to work from reality and adjust from there.”

(Credit: ILM & NBC Sports).

How to Shoot with Lots of People in Many Places Very Quickly

Compared to a typical day on the StageCraft volume set of The Mandalorian, ILM’s crew for the Summer Olympics had to capture roughly four times the amount of live footage. During a massive production that involved dozens of athletes moving between six different locations on the Universal Studios lot in Hollywood, ILM’s volume stage welcomed 58 individuals over a six-day shoot. Some athletes were only on hand for a matter of minutes, requiring an unprecedented level of flexibility to make quick changes. The ILM crew executed over 120 scene changes on the volume’s wall without any waiting time required.

“The on-set grips were the real heroes with all of those changes,” Milham notes. “They had to move physical sets in and out 120 times. The practical art department worked with us throughout that process.”

The key to ILM’s flexibility was dividing its rendering power into subgroups. Whereas a cinematic-scale production like Star Wars would devote all of its rendering capacity to one volume load that would be utilized for hours at a time, the rapid pace of the Olympics shoot led the crew to devise a new solution. Three separate renderers, each capable of powering the entire LED wall, were loaded with distinct settings. When the client requested a scene change, all the ILM team had to do was switch the feed over instantaneously.

“Scott Duncan is an amazing and inspiring person to work with, who films shows like Survivor,” Thomas says about the production’s combined director and cinematographer. “He’d make changes live on the set, and would just pick up the camera and want to shoot something. The stage team would have to keep up. It was a quick, iterative process, very freestyle, like indie filmmaking, which I love. They’d shoot and just keep the camera rolling. Where in a feature film you’re focusing on getting a whole take or a specific scene, in this case they’re looking for just a few seconds of something amazing that they can use to then stitch into their longer marketing narrative.”

(Credit: ILM & NBC Sports).

Real Time Revisions

Not only could ILM make rapid changes between entire set-ups, but they could even make live alterations with the CG background itself. When the lighting team incorporated the tub of water with shards of glass in the Seine River locale, the stylized, caustic light initially felt jarring, more like a swimming pool. So to help balance the effect, the ILM artists plussed the scene with additional lighting along the riverside.

“We added some bright white lights along the river in the background, just like the lights along the side of a pool,” says Milham. “It helped to fit the swimmer in the scene because you could imagine that one of those lights was right next to her. Do we really care that the Seine doesn’t have those lights? Not in this situation. We’re just trying to make it look awesome. That’s the artifice of it. It’s okay if it looks like a dream.”

Incorporating these details within the Helios environment took only a matter of minutes, all while the scene was still loaded on the volume wall. “The moment is over if it takes hours, so we have to do it right there,” Milham adds. “I’ll be there on the radio calling in the changes and adjustments as the shoot is taking place.”

(Credit: ILM & NBC Sports).

A Special Guest Introduction

“I had gone up to the stage during the shoot, and everyone seemed really happy with how things were going,” remembers Thomas. “Then our producer, Shivani Jhaveri, just mentioned, ‘Oh, Steven was here yesterday and he loved it.’ I said, ‘Steven?’ and she’s like, ‘Spielberg!’ What?! How? It was an unexpected surprise to hear he was pleased with the work. What a blessing to have him involved.”

Not long before the shoot was set to begin, NBC arranged for director Steven Spielberg to film a special introduction for the Olympics on ILM’s volume set. The special moment required yet another new way of presenting a scene on the LED wall. Spielberg would walk on from the side, with the blank wall and its surroundings visible behind him, and then as he came to center stage, the Parisian riverside location would load.

“We had just shot The Fabelmans with him, and he understood the process, so I think he trusted that it would work well,” says Milham. “And because this clip involves Steven Spielberg, the ‘filmmaking’ of it all can be part of the story. So Steven began walking outside the volume, as if he were on a movie set, because he was, of course, and then we turned the environment on. It was a relatively unique use of the technology.”

Milham describes the required process as “relatively easy,” an extension of their existing multiload capacity. They simply closed the video feed for the riverside scene to make the wall appear blank, and then turned it on again at the right moment.

Spielberg himself likened the grand show that is the Olympic Games to a great story, something that felt close to home for the ILM team, as Shivani Jhaveri notes. “The theme that Spielberg talks about in his opening is so relevant to StageCraft,” she explains. “There’s a connection in that StageCraft is all about telling a story. It was all about telling the athletes’ stories, where they’ve come from, where they are now, and it was really special to see all that.”

“If something new is needed, we’ll invent it.”

The success of ILM’s work on the Paris Olympics project was thanks to a relatively small team, especially compared to a feature film or episodic series. Along with Milham, Thomas, and Jhaveri, some of the other leading crew members included lead virtual production technical director Rey Reynolds, CG supervisor Sam Wirch, capture supervisor Ted O’Brien, and lead operator Kelly Fan.

“One of the reasons that ILM has been around for 50 years is that we’re not married to the way things are,” says Milham. “If something new is needed, we’ll invent it. If something we’ve been doing forever needs to change, we’ll change it. We will adapt. Even in the short amount of time that this method of shooting has existed, we’ve completely transformed it. One of my favorite things has been working with all sorts of different filmmakers, storytellers, and clients who tell us, ‘That’s great, but it needs to do this…’ or ‘Have you ever thought about trying this?’ And we try it. That happens on every show.”

ILM’s Olympics story continues with the 2026 Winter Games in Milan, Italy. Watch ILM.com for a behind-the-scenes look at this production, which included brand-new innovations.

Lucas O. Seastrom is the editor of ILM.com and Skysound.com, as well as a contributing writer and historian for Lucasfilm.

ILM.com is showcasing artwork specially chosen by members of the ILM Art Department. In this installment of a continuing series, three artists from the San Francisco studio share insights about their work on the 2025 mixed reality playset from ILM and Lucasfilm, Star Wars: Beyond Victory.

Art Director Stephen Zavala

(Credit: ILM & Lucasfilm).

Beyond Victory required a hub where our characters would live and roam. That’s how the garage was born as an idea. It served as an HQ where the player could come back after starting and finishing every quest. As with all concept work, one needs to find one’s footing, and this is done by providing several ideas for a particular story beat or design need. Challenges arise once we see the space in VR, since spaces have a tendency to look smaller or larger than we originally imagined. Once we see it in a virtual space, it’s all about adjusting to a scale we’re comfortable with.

The director, Jose Perez III, really wanted a place in the middle of nowhere. I tried to capture that, but it was also important to make sure it didn’t look abandoned. It’s isolated, but lots of activity happens inside the garage as well as the surrounding areas. I always liked the idea that it was a hub where all kinds of visitors would come and go, either to fuel up or repair their speeders, bringing with them all kinds of cool stories.

I wanted to design a place with a sense that it’s been lived in for quite some time. It wasn’t meant to be in disrepair, but instead have that sense of daily life and how it can be messier than we’d like to admit when it comes to managing our spaces. It certainly was satisfying how the garage slowly grew into that exact idea.

All art pieces come with challenges. When in doubt, you look at reference, step back, or subject your designs to peer review. A brief pause often provides time for introspection on how to adjust course.

Senior Concept Artist Casey Straka

Volo is our main/player character, and there were a few different physical traits they had to have, storywise; mainly, they had to be on the smaller side for podracing prowess, and have four arms for some specific game mechanics they wanted to incorporate. Besides that, it was a very open brief. 

We considered multiple different species for Volo at first, lots of mulling over, lots of options that didn’t feel quite right. I proposed a Nikto, since they are such a varied species in the galaxy and have different subspecies and evolutionary traits depending on where they’re from. Maybe one subspecies evolved an extra pair of arms, which was a trait we needed. We landed there, but I don’t think we stayed there in the end, so I think Volo is something new altogether. I mostly took inspiration from previous Star Wars heroes! I wanted Volo to be very appealing, like you’d want to be their friend after you spend enough time following their story. I did a lot of additional drawings of Volo to find their mannerisms and expressions, the little things that make them, them.

Volo’s outfit was fun to do, I love a good Star Wars jacket. They also have flexible spines on their head, and how those move according to emotions ended up being inspired by cats; they flatten to their head when scared, flare out when angry, droop a little when sad. 

In a very technical sense, a goal I set for myself was to hit a new benchmark in terms of skill; I learned a lot about my own process on this project. But one goal I always set for myself is to make a character people can get attached to. That’s always the most important part to me. I think all designs have their struggle points, some more than others. If something isn’t budging I try to take a walk. Concept art is a lot of problem solving, and getting distance from it to work something out can help.

Senior Concept Artist Evan Whitefield

(Credit: ILM & Lucasfilm).

The design draws inspiration from components of several different TIE fighters. Both engines are based on the TIE Bomber’s twin ion engine thrusters (ordnance pods), with TIE Interceptor wings mounted on each engine to give the vehicle a more aggressive silhouette. 

The cockpit functions as the control pod and was cut down and reworked to feel more dangerous, almost chariot-like in form. Additional elements, including the energy binder plates, rear thrusters, steel control cables, and air intakes, were carefully integrated to create a seamless fusion of podracer and TIE fighter design language.

This vehicle wasn’t originally planned. It emerged naturally as a concept I thought would be fun to play in-game. Early on, I imagined the original owner as a former Imperial who went rogue and turned to podracing, scavenging parts from Imperial fighters to construct what became known as the TIE Bolt. As the concept evolved, the final story became that the TIE Bolt was a custom podracer created as a gift for Imperial Admiral Rellen by Grakkus Jahibaki Tingi.

Star Wars: Beyond Victory – A Mixed Reality Playset is currently on sale on Meta Quest 3 and 3S headsets.

See the complete gallery of concept art and a design case study from Star Wars: Beyond Victory here on ILM.com.

Read more about Star Wars: Beyond Victory:

‘Star Wars: Beyond Victory’ Now Available and Director Jose Perez III Takes Us Behind the Scenes

Bobby Moynihan Takes Us Behind the Scenes of ‘Star Wars: Beyond Victory’

ILM’s innovative approach leads the way for more than 1,700 visual effects shots, helping bring Wright’s dystopian action thriller to life.

By Clayton Sandell
When director Edgar Wright was gearing up to make The Running Man (2025) and considered the extensive visual effects the story would require, he turned to a fellow filmmaker for advice.

“I’m friends with Gareth Edwards, and I was really taken with the work on The Creator,” Wright says in an interview with ILM.com. “Especially the idea of shooting on location and then designing the environments after the fact. I was really impressed by how the visual effects work was put into more naturalistic, grounded camerawork. I wanted to pick his brain about how exactly it was done.”

In The Running Man, a science-fiction thriller set in a near-future dystopian America, blue-collar worker Ben Richards (Glen Powell) desperately needs money to buy medicine for his baby daughter. He signs up with a TV network, hoping to compete on one of their game shows. He is picked for the most dangerous one, in which contestants try to evade capture for 30 days in exchange for $1 billion.

After chatting with Edwards, Wright decided The Running Man should utilize the same unconventional approach that ILM brought to The Creator, winner of multiple awards for best visual effects, including from the Visual Effects Society.

Shooting on The Running Man began in early November 2024. With a release date rapidly approaching just a year later, the pressure to meet deadlines was on every department, including visual effects. Wright says he was happy the project reunited him with Academy Award-winning production visual effects supervisor Andrew Whitehurst. The two worked together on Wright’s 2010 film Scott Pilgrim vs. the World. “I remembered very fondly working with Andrew, so that was just an amazing, fortuitous bit of kismet,” Wright says.

The filmmaker credits Whitehurst and visual effects producer Sona Pak for preproduction planning, which kept everything on track. “Andrew and Sona were very clear on how to make this work, and how it would even be possible to turn around something this quickly with so many visual effects shots,” Wright recalls. “They had a very clear idea of what we were trying to achieve before we started shooting. What was really good was making decisions early on and sticking to them. I think where things can go awry – especially on a compressed schedule – is if you’re still working out what you want to do after you’ve finished filming.

“I’m frankly really amazed that we managed to do everything we did in time,” Wright adds.

(Credit: ILM & Paramount).

Ben Richards’ 30-day fight to survive begins in Co-Op City, with the journey taking him to New York and Boston. Exterior scenes were shot mostly in real locations in London, England, and Glasgow, Scotland, as well as on practical and backlot locations in Sofia, Bulgaria.

“Most of the initial meetings and discussions were centered around the places we were thinking of shooting, and the things we were thinking of building,” Whitehurst explains. “When that started to solidify, it became much clearer who was actually going to do what, and what was physically buildable, and what wasn’t.”

Fans of Andor (2022-25) may notice that the Canary Wharf section of London makes an appearance in The Running Man, disguised by a number of digital enhancements. “We did have nicely filmed places where the majority was real, which is always a great starting point,” says ILM visual effects supervisor Dave Zaretti. “Then you’re extending upwards into the distance. You can change the sky a little bit, but it was based on truth and reality, and nicely chosen locations. The team had a blast.”

Another shot set outside the fictional network headquarters begins at the real entrance steps leading to Wembley Stadium, but then transitions to a completely CG skyscraper as the camera tilts up. “It’s very funny to me to take one of London’s most famous landmarks and digitally erase it from the movie,” Wright laughs. “That’s an example where we’re starting with a real shot of Glen Powell and all the extras walking up the steps, and then the camera just keeps going and going. That was really the methodology throughout. It was about keeping it grounded, because the perspective of the story is that you’re very much seeing it from Ben Richards’s viewpoint.”

Visual effects contributed significant digital building extensions, crowds, street signs, lampposts, traffic lights, and even flying mailboxes. Cars from the 1980s era were digitally augmented with designs that more closely fit the story’s futuristic aesthetic. “James Mohan and Ashley Pay deserve huge credit for taking on the lion’s share of world-building, from city extensions to augmented traffic lights, road markings, and uptown car augmentation,” Zaretti says.

For a rooftop sequence where Richards tries to escape from a Boston hotel, full CG city recreations were combined with live-action footage shot on a partial set against a green screen. Another scene that appears to be a single take is actually three, stitched together with digital seams. In Boston, Richards runs out of an apartment and down a hallway – dodging gunfire and heroically sliding into an elevator – before reappearing to smash the lens of a pursuing rover camera.

“That was three separate takes that we had to marry together,” says Whitehurst, revealing that Powell appears in the first and last parts of the shot, while a stunt performer completed the floor slide in the middle. “That stuff is pure invisible effects. You need to get them all into position and use CG where you need to. We had a CG digi-double take over between the different poses that weren’t quite matching across the takes. It was a fun shot.”


Digital doubles and extensive face replacements were used during chases and a pivotal moment where Richards narrowly escapes an explosive head-on collision, plunging from a bridge into the river below. The film’s finale features a completely CG V-Wing airplane, digital explosions, and spherical roving cameras capturing the action.

“Most of the time they’re hanging around like vultures,” Wright says, “and in some sequences there are three of them buzzing around. And in those cases, we had to constantly work out the choreography of where they were.”

The roving cameras provide live coverage for the audience watching the show on TV. But they also presented a visual effects challenge whenever a rover-eye view was simultaneously displayed on an in-world monitor. To maintain continuity, artists had to make sure that the angle seen in the rover’s video feed properly matched its constantly changing position.

“Steve Hardy deserves a medal,” notes Dave Zaretti, “for not only looking after the big exterior shots of the V-Wing, but also the hundreds of shots inside of it – keeping track of which rover cams should be seen where, not only in the main plates but also in the TV inserts.”

All of it adds up to a film jam-packed with action and more than 1,700 visual effects shots.

“The effects work is huge, and subtle at the same time,” says Wright. “There’s a shot where Glen is in a New York hotel and gets into an elevator, and in the background is Times Square. But because the focus is on Glen the entire time, this amazing futuristic Times Square vista, with all of the screens, is completely out of focus. It’s a show where I feel there’s an enormous amount of work in the background, but out of focus. I think it’s really cool.”

A short sequence depicting Richards saving the life of a fellow oil-rig worker is only four shots, but is described by network executive Dan Killian (Josh Brolin) as “the most thrilling 10 seconds of video I’ve seen all year.” “We were very much beholden to deliver the most exciting 10 seconds of footage,” Whitehurst quips. “No half measures.”

The oil rig, crashing waves, lightning, and rain were fully digital elements. The actors were shot against a green screen. “We had two very dry actors dangling from a string,” adds Zaretti. “So we had to try and integrate them into the scene. But I think those shots worked really well.”


Work on The Running Man was hubbed out of ILM’s London studio, with further contributions from ILM artists based in Mumbai. Rodeo FX and Untold Studios completed additional shots. The key to a great end result, Wright believes, is all departments working together in sync.

“There’s incredible work by Andrew and ILM in the movie,” Wright says. “But it’s always in conjunction with something else – whether it’s the camera, an amazing location, what production design has done, what physical effects are doing. And the thing I’m really proud of in the movie is that all of this is people working together out of mutual respect.

“There are very few entirely green screen shots,” continues Wright. “And I think what people misunderstand about great visual effects is, they say, ‘It’s all CG.’ But of course, the best work is where it’s actually a collaboration.”

Whitehurst and Zaretti believe Wright’s style and approach to directing help bring the best ideas to life. “There was creative wiggle room,” Zaretti says. “And that’s nice, because you don’t always have that creative breathing space. So enjoy it and let the artists shine.”

“Absolutely,” concurs Whitehurst. “Edgar is definitely somebody who is very open to being shown something he was not expecting. It’s great seeing his enthusiasm when we show him stuff for the first time, and seeing him relax and go, ‘Oh, it’s going to be okay.’”

Wright says he’s most impressed by the world-building in the film, full of details that may only appear for a few seconds but make a lasting impression on the audience. “I wonder whether we set a dangerous precedent for ourselves by actually delivering in under a year,” the director laughs. “I’m really, really proud of the work, and I think some of the shots are just exceptionally beautiful and rich and detailed. What I also like about it is, it doesn’t feel like a lot of the effects are grandstanding.”

At the end of the day, Whitehurst says he is continually impressed by the ILM team’s innovative spirit that brought The Running Man over the finish line.

“ILM is a very refreshing place to work because there is so much experience, but it’s always in the service of making beautiful pictures that help tell the story,” he says. “I’m agnostic about what technology we use. I just want to use the right pencil for the job. But ILM has all of the pencils, and more importantly, the people who know how to use all of those pencils.”

Pre-order The Running Man Limited Edition Steelbook now.

(Credit: ILM & Paramount).

Clayton Sandell is a Star Wars author and enthusiast, Celebration stage host, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell), Bluesky (@claytonsandell.com), or X (@Clayton_Sandell).

One of the biggest days in American sports is equally renowned for its iconic commercial spots.

The artists from Industrial Light & Magic have contributed visual effects to two original commercials as part of the broadcast of Super Bowl LX, the celebrated championship game of America’s National Football League.

Lucasfilm’s newest feature film from the Star Wars galaxy, The Mandalorian and Grogu, is set to premiere on May 22, 2026, and the original spot directed by Jon Favreau, “A New Journey Begins,” provided audiences with a touching moment between the production’s namesakes. ILM’s contributions include bringing the icy world of Hoth, along with a group of tauntaun creatures, to the screen alongside the beloved characters.

For another spot, ILM returned to one of its most iconic visual effects achievements with Steven Spielberg’s Jurassic Park (1993). For the new commercial directed by Taika Waititi in partnership with Xfinity, ILM created a Tyrannosaurus rex, Dilophosaurus, and a herd of Gallimimus, all inspired by the company’s work on the classic film.

Watch “A New Journey Begins”:

Watch “Jurassic Park…Works”:

Read more about the latest artistry and innovation from Industrial Light & Magic here on ILM.com.

The visual effects supervisor discusses ILM’s contributions to director Ryan Coogler’s supernatural sensation, which is nominated for “Outstanding Supporting Visual Effects in a Photoreal Feature” at the VES Awards.

By Jay Stobie

(Credit: ILM & Warner Bros.)

Helmed by writer/director Ryan Coogler, Warner Bros.’ Sinners (2025) defies the boundaries of traditional genres, telling the story of Elijah “Smoke” and Elias “Stack” Moore (both played by Michael B. Jordan) as they return home to establish their own juke joint. Forced to deal with the cruel inequalities of 1930s Mississippi and a ravenous vampire named Remmick (Jack O’Connell), the twin brothers’ exploits are paired with an exhilarating blues soundtrack and live performances. ILM visual effects supervisor Nick Marshall (The Last of Us [2023], Dune: Part Two [2024]) joined ILM.com to outline Industrial Light & Magic’s visual effects contributions to the film, which included a near-fatal encounter with a symbolic rattlesnake and a train’s entrance into a bustling station.

A Bewildering Briefing

“As the ILM visual effects supervisor on the film, I looked after a small body of work, by ILM’s standards, as we handled just under 100 shots covering two different sequences,” Marshall explains to ILM.com. The work took place at ILM’s Vancouver studio, and Marshall recounts how the first story overview he received left him slightly perplexed.

“Early on, people were quite tight-lipped over what this movie was. We had an excellent collaboration with the filmmakers in terms of the technical understanding of the visual effects we were going to do, but the bigger picture was kept close to the chest. Sinners was outlined to us as a period movie with a gangster element to it, and racial segregation was also an important plot point. At the same time, it’s got a big supernatural element, and Michael B. Jordan is playing twins. And, as all of this is going on, the heart of the movie is about blues music.

“When we got this initial brief, we thought it was a bit bizarre but sounded very interesting,” Marshall continues. “There were so many elements to the story that it got to the point where I asked the client if it was going to be a comedy [laughs]. They said, ‘No, it’s actually a serious horror movie.’ As we progressed with our work, the plot unfolded, and we understood a lot more. Even though we didn’t necessarily know exactly what the movie was about until a long way into production, there was such a passion from everyone going into it. We could tell we were working on a pretty special project with a clear vision behind it.”

Rumble with a Rattlesnake

The first ILM sequence to appear in the film centered upon the twins discovering a rattlesnake concealed in their truck bed, leading Smoke to stab the snake with a knife and toss its bleeding body onto the ground. “Ryan described this moment as foreshadowing, as it echoes the moment when Smoke has a standoff with Cornbread [Omar Benson Miller] outside of the juke joint later in the movie. The snake scene is the first time you get a sense that there’s something off and maybe even supernatural about this world that they’re in, so we went deep into trying to give that meaning. The idea of familiars foreshadowing bad events repeats often in Sinners – there are ravens, crows, and vultures which appear throughout the film and establish that there’s a tension building.”

In terms of the snake itself, the ILM team had plenty of information to base their visual effects on, as Marshall notes, “Ryan Coogler is an avid snake collector and knows a lot about them, so we were going to have to do justice to the real thing. Ryan told us how snakes that are shedding their skin get more aggressive – they’re in a heightened state because they’re weakened by it – and that they get a layer of skin that creates a cloudy-eyed look, which ties in nicely to the tapetum lucidum effect that the vampires’ eyes have in Sinners. Production shot a real timber rattlesnake in the back of the truck bed for us as reference, and the one place where we deliberately made a creative change from the real snake was to give the eyes that shedding-skin appearance. [Animation lead] Agata Matuszak sat down for weeks looking through internet references to find every perfect snake attack shot that she could locate – snakes striking into the camera, at a mouse, or toward balloons – and seeing how quickly they move. She became a brilliant source on snake locomotion.”

The scope of ILM’s contributions to the snake sequence changed over time as the edit continued to evolve, with Marshall declaring, “The reference served us well as far as providing a snake to visually match to, but we were originally only supposed to do a single shot of the snake. They were going to capture the snake being uncovered, waking up, and all of that practically, and we would take over when the snake had to be stabbed. However, they couldn’t get a performance out of the real snake on the day of – it was happily toodling around in the truck bed – so we took over and delivered around 10 shots of the snake. Not all of those shots made it into the movie, because they were still experimenting with ideas for what actions they wanted from the snake.”

(Credit: ILM & Warner Bros.)

The ‘Pool Noodle’ Process

ILM’s approach to the snake sequence needed to factor in the reality that Smoke would be interacting directly with the knife and the animal itself. Marshall relays, “For lighting purposes, we had to make sure that the knife was blocked accurately when Smoke brought it down over the top of the snake’s head. We did a basic digital double version of Smoke and reprojected some of the plate back onto that so – if you ever saw anything in a reflection – it had the correct texture of his hands and suit. We kept the knife as practical as we could but eventually took it over because the knife very visibly enters the snake’s head. We reconstructed the knife prop and used that for the reflections of our CG assets and blood interaction.”

Turning to the blood that pours from the rattlesnake, Marshall says, “Our effects team did amazing blood simulations for the blood spatter and all of the interaction with the knife. When the knife pins the snake’s head, you get blood leaking out from underneath and spreading across the truck bed, as well. Our effects team totally nailed it. The lone aspect we came in to touch up was the blood pumping out of its neck wound when it’s tossed into the long grass. You get a few shots of it writhing around as it’s dying. That ended up being a combination – the effects simulations for the blood spatter as it hits the ground, and we also did a practical blood element shoot at the last minute to sell the sense of the viscous blood pooling on the surface and trickling down the sides.”

This impromptu ILM shoot involved an unexpected tool, as Marshall shares, “We shot it in my back garden, where it was a construction using a pool noodle that had been hacked apart to represent the snake. We rigged up blood packs to pump blood out of an artificial wound that we had cut into the pool noodle. Our comp team, under the supervision of Okan Ataman, made the best of it and pulled it together using the most successful elements from our effects simulation and our practical shoot. That combination got us over the line, and the client was extremely happy with it.”


Locomotive Magic

The second sequence ILM presided over focused on a train arriving at a Mississippi station. “We did a lot of photorealistic reconstructions of the Clarksdale train station,” Marshall begins. “When we started discussing it, production was going to have a real locomotive come into the station. We were just going to do environment extensions. As it turned out, they struggled to make it work with the timing and location, so we took over the train component too. That became a big deal for us, because we had assumed the practical train would block a lot of our environment throughout the sequence.

“The train wound up being full CG itself, and we were shooting directly into the green screen – which is daunting with visual effects because it can telegraph itself,” Marshall divulges. “Fortunately, production assisted us by building the green screen to the exact height of the train so we’d get correct shadows. Since the train was green as well – which is true to the period – a small amount of green spill contaminating the environment wasn’t necessarily the worst thing. Normally, that would be bad for us, but here it served as a fantastic reference for where the top of the train should sit. It gave us the correct shadow and lighting for what came over the top of the train once we put the train in to replace the green screen. This allowed us to have wonderful light interaction with the characters.”

Real-world references are essential to visual effects work, particularly when dealing with such a distinct time period. “We had tremendous art department concepts that production designer Hannah Beachler put together in collaboration with our production visual effects supervisor, Michael Ralla. Those served as the basis for the broader design of the streets, the kind of signage we would get, and other specifics like that. We supplemented them with our own research for period details about the streetlighting and electronics you’d expect around the trains. At that point, they were going through the transition from steam locomotives to electricity. [Environment supervisor] Anton Borisov pored over old photographs to see what Clarksdale really looked like in the 1930s. We went into an extensive research period to figure out the mechanisms that were active on the rail at that point and how the carriages looked, right down to the numbers you see on the side,” Marshall reveals.

The film’s setting had an impact on ILM’s responsibilities, as Marshall elaborates, “In collaboration with the client, we ensured there was a sense of racial segregation to the environment, so you could see a clear delineation between the side of the tracks reserved for whites and the side that was designated for Black people. It was a key plot point for the movie, and we wanted to do justice to it. Beyond that, our goal was to make everything look as photoreal as possible. The client wanted our work to blend seamlessly so no one would notice that they occasionally relied heavily on visual effects. In certain cases, the visual effects took over the majority of the frame, but it always had to disappear and be completely invisible.”

(Credit: ILM & Warner Bros.)

Buildings, Automobiles, and Bystanders

Although production built a full-scale physical train station to be used on the set, ILM nevertheless offered support in post-production. “We did a bit of repair work on the train station itself, which they did shoot practically. Production put significant time and effort into the building with the understanding we’d handle the majority of the environment extensions. There were a few places where tiles were missing and shingles fell off, and we did the train tracks that the train rolls in on,” Marshall adds.

Research and reference entered play once again when it came time to outfit the environment with traffic and pedestrians. “As far as the cars you see, we sought to add life to the background,” Marshall remarks. “We didn’t want the set extension to feel static, so we researched Ford automobiles from around that time. Cars were fairly limited in their color palette in that era, as you’d tend to mostly have black paint, so we didn’t have to do too much variance in colors for the automobiles.

“Our CG supervisor, Anthony Zwartouw, gathered reference photography and basic assets, then we pieced together the design of the vehicles,” Marshall continues. “Then, we went about building them, and our animation team did a phenomenal job of making the cars seem as if they were trundling over uneven ground. Our compositing team, led by Michael Ranalletta, was outstanding too – they went in and fleshed out the background with 2D sprites of people. Production shot a ton of 2D elements of people milling around, holding luggage, talking, and walking about. We used those to populate the backgrounds and achieve the busy, lived-in environment that they wanted.”

(Credit: ILM & Warner Bros.)

“Invisible” Involvement

Marshall indicates that Sinners was shot on large format film, and the shots ILM worked on used System 65, “a super-wide format that’s been used on movies like The Hateful Eight [2015]. Since film is shot infrequently these days, we had to rebuild and relearn certain parts of the pipeline to be able to cope with that. What might’ve been simple requests in a digital workflow, such as additional takes of a particular shot or extending a frame range, became quite complex and expensive. They had to order those frames to be re-scanned, and there aren’t many people who do it anymore. Certain points were tricky, as it can be difficult to naturally ingest physical film into our digital pipelines. [Color scientist] Matthias Scharfenberg did remarkable work to give us a color process that allowed us to progress with the movie.”

“Along with that, there’s a physical quality to the way Sinners was shot that we needed to emulate too. [Director of photography] Autumn Durald Arkapaw shoots on deliberately de-tuned lenses, and she’ll send them off to have little abnormalities and subtle effects appear in the lenses because she wanted them to have character. Where a lens might customarily be uniform across its surface, hers had slight shape changes. Our compositing team with [comp supervisor] Okan Ataman and [comp lead] Michael Ranalletta did some significant lens profiling work, because things would warp in strange ways and go in and out of focus in places you wouldn’t expect.”

“[Digital artist] Florian Sanchez literally sat for over a month just profiling our lenses to see if we could set the tooling internally, which took a perfect CG render and added those subtle changes. When we’d get our perfect CG renders to come out, we then applied all these effects that were – in many cases – degrading the actual image. However, that made it feel exactly like it was this real film format that could’ve been used to shoot this movie 40 years ago. We’d add tons of grain and noise over the top, plus small dust hits here and there. We wanted our work to not look digital in what was otherwise a sort of analog movie, so we had to reapply those details to our CG.”

Other less-obvious contributions which ILM made to their assigned sequences tied in to the director of photography’s preferred approach to lighting shots, as Marshall conveys, “Autumn shoots with a lot of negative fill – she’ll put up huge black flags and canopies just out of camera to block light and create shaping on the characters’ faces. It’s a more natural way of working, but it meant that the 360-degree high-dynamic range images we’d usually use to light our scenes had massive black canopies in them. When they filmed at the station, they didn’t have a representation of the train when they shot it for real, so when we put the train in, we started getting reflections of those canopies. [CG supervisor] Anthony Zwartouw made sure the lighting across the sequence was solid. Anthony helped us come up with a good system for lighting these shots in a way that let us do a big pass at taking that stuff out digitally and projecting the results onto a low resolution LiDAR scan of the location, so we had a clean digital set which we could use to light our assets.”

(Credit: ILM & Warner Bros.)

Motion Picture Partners

Marshall’s effusive praise for his ILM team extends to the client-side filmmakers, as the visual effects supervisor beams, “My direct contacts on Sinners were Michael Ralla and visual effects producer James Alexander. Not far into our work, I understood that they had a super collaborative team who valued our input. Michael made regular efforts to stop us from referring to ourselves as a vendor. Michael said, ‘You aren’t a vendor, you’re a partner in this process, so let’s scrap that word from the playbook right now.’ To their credit, they walked the walk. When they were out on set, they’d call us to consult on critical decisions. ‘We’ve hit this problem, so do you want us to put a green screen here or there?’ They wouldn’t be able to do both because of budget or time constraints, but they’d involve us in those conversations.”

These discussions benefited ILM’s workflow immensely, as Marshall asserts, “We weren’t waiting to receive plates and unaware of what we’d be getting to work with. We were incorporated into the decision-making and could plan what we had to do.” Marshall attributes the collaborative spirit that permeated the production’s departments to the director, observing, “Ryan’s a collaborative filmmaker, and he’s always present – he’s engaged and drives the film. There was so much respect between the departments, and we each tried to resolve issues you may not even think about.”

Marshall credits the members of his ILM production team for keeping them on schedule and adapting to editorial needs. “Ryan was making adjustments to the timing of the cut as it’s the first time Smoke and Stack separate from each other in the movie, so we built our process to be flexible and allow Ryan the freedom in the edit room. Our producer Alan Cummins, [project manager] Kaisha Williams, and the production team worked tirelessly to guarantee that we were left with enough time to fit in everything that we needed to do, which enabled the filmmakers to continue making those last-minute changes.”

(Credit: ILM & Warner Bros.)

Valuing Visual Effects

Marshall emphasizes Ryan Coogler’s appreciation for visual effects, professing, “In recent years, you’ve seen a continued trend toward pretending there’s no visual effects involvement in films. Ryan bucks that trend. Ryan shot as much as he could for real on Sinners, but he’s a huge proponent for visual effects being a part of the process. For Ryan, visual effects are absolutely indispensable to the team. It’s nice when a director goes out of their way to give credit to the visual effects companies as another department that brings their vision to life. We never felt as if our contributions were being downplayed.”

Speaking to the role of visual effects in filmmaking as a whole, Marshall says, “All visual effects artists and supervisors want to do is deliver the best work and help people tell interesting stories. Keeping visual effects as part of the conversation is something to be championed, and Sinners definitely did it right. The more that those conversations happen out in the open, the better. When we wrapped our last shot, Ryan sent us a video to thank the team and recognize us for our work. Sinners was a special show to be on in that way.”

The thoughtfulness displayed by Coogler and the entire Sinners crew made the experience of working on the project an exceptional one for Marshall, who concludes by offering his perspective on the completed film, affirming, “Ultimately, there’s so much symbolism in Sinners that I hope people can keep digging back into it and watch the movie again and again. There are so many intriguing parallels and nods to later scenes scattered throughout, and a great deal of history and culture that are crucial to the movie. So much thought went into every frame, and every shot has a purpose and a meaning. Because of that, and because ILM was brought into the loop to aid in making it that way, I hope Sinners will be one of those movies that stands the test of time. In 10 to 20 years, people can rewatch it and say, ‘I’ve never caught that moment even though I’ve already seen it five times. That’s a really nice detail!’”

Watch an ILM exclusive deleted scene from Sinners:

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

ILM artists share their insights about this distinct installation now on view at Somerset House in London.

By Jamie Benning

The first thing visitors encounter inside Wayne McGregor: Infinite Bodies at London’s Somerset House is darkness. A vast LED screen fills the room, its shifting light reflected across the faces of those watching. The space is quiet at first, then sound begins to breathe into the room. Two figures slowly emerge. Their bodies twist, merge, and reform, suspended in a deep digital expanse that feels both intimate and endless. It is OMNI, a collaboration between choreographer Wayne McGregor and Industrial Light & Magic, and it sets the emotional, thematic, and sensory tone for everything that follows.

As an entry point, OMNI does not explain itself in conventional terms. It does not offer narrative, character, or spectacle in a familiar cinematic sense. Instead, it presents presence. Motion without edges. Energy without beginning or end. Viewers gather instinctively. Some stand for seconds. Others for many minutes. The work opens and closes in cycles, dissolving into darkness before returning again, as if inhaling and exhaling. It is an installation that encourages stillness before it encourages movement.

Reuniting McGregor with the creative teams and technologists at ILM for the first time since their work together on ABBA Voyage in 2022, OMNI invites viewers into what McGregor calls “a choreographic exploration of the infinite potential of human connection.” Using ILM’s performance capture and simulation technologies, dancers Rebecca Bassett-Graham and Salvatore De Simone perform an unending duet of energies. Their movement is captured, transformed, and re-presented inside a boundless digital plane, a place where the distinction between human and virtual becomes beautifully uncertain.

You do not see the performers themselves, but ghost-like representations and shadows of their digital footprint. Luminous networks of the skeletal and nervous systems glow and decay in cycles of light. Around them, murmuration-like particles drift and swirl, sometimes responding to the dancers’ movements, and at other moments seeming to lead them, creating a shifting dialogue between motion and environment. Form appears, dissolves, reforms. It is at once biological and architectural, organic and synthetic.

The installation was created with ILM visual effects artists Matt Rank, Xavier Martin Ramirez, Edward Randolph, Arnaud Mavoka-Tama, Mike Long, Julien Ducoin, Oscar Dahlén, Alessandro Pieri, and Bimpe Alliu, with generous support from ROE Visual and Studio Wayne McGregor. Positioned deliberately as the first work visitors encounter, the piece anchors the exhibition and mirrors Somerset House’s broader mission to explore the intersection of art, technology, and society.

Concept art by Bimpe Alliu (Credit: ILM).

Conceptualizing Motion

Concept artist Bimpe Alliu was brought onto the project at a very early stage to help shape those initial visual directions. “I was going through the ideas with Matt [visual effects supervisor Matt Rank], and learning more about what the project was going to entail,” she said. “I was playing around with storyboarding a lot of potential movement ideas.”

Rather than beginning with dancers as recognizable figures, the creative team quickly gravitated toward abstraction. For Alliu, the use of murmurations became central to expressing motion without relying on literal anatomy. “It is a beautiful way of capturing movement while still existing as a solid form. There is all this motion and synergy happening at the same time,” she explained. “It also allows for a lot of natural push and pull, which is really exciting when you are thinking about animation.”

Her design process embraced freedom over prescription. “It is fun because it is nice to do something that is a little more abstract. It gets the brain thinking in different ways and allows you to deconstruct form and how to portray it,” she said. “You can tap into other references and use them in different ways. You might think, ‘I can approach this in a completely different way from how I usually would.’ You have a wider park to play within. There is no wrong idea. They are just ideas, and either they land, or they do not.”

Although OMNI was always destined for a monumental screen, Alliu said it was vital not to let format dictate imagination. “You start big, knowing it is going to be on a screen, but you do not let that stop you,” she said. “There was never a point where it felt like, because it is on the screen, you cannot do this or that. It was very much ‘blue sky’ thinking.”

Concept art by Bimpe Alliu (Credit: ILM).

Somerset House and the Culture of Collaboration

Introducing the exhibition, Somerset House director Jonathan Reekie described Infinite Bodies as a perfect embodiment of the institution’s ethos. “Wayne McGregor and Infinite Bodies, in so many ways, encapsulates what Somerset House is all about. Most visibly, Somerset House has been about reimagining the historic building for the future. We developed a cultural program that sits between different art forms, the intersection of culture, technology, and society at large. We are conscious that we are in an ever-changing creative landscape, and therefore, artists are changing and working in different ways all the time. We need to create a space for that.”

Reekie also pointed to the importance of community in shaping new ideas. Somerset House is home to a creative community of nearly 3,000 artists, makers, and entrepreneurs, and collaboration sits at the core of its identity. “We don’t believe great ideas always come from an individual working on their own in a room. They come from community, from groups of people coming together and making great things,” he said. “That’s the way Wayne works.”

For McGregor, this collaborative model has defined his career. His studio, based in East London’s Olympic Park, has long operated as a laboratory for experimentation with scientists, technologists, and other artists. Infinite Bodies brings together more than three decades of his interdisciplinary works and investigations into the subtleties of movement, both human and non-human. Each work within the exhibition operates as an experiment, a proposition about the body’s potential and how technology might allow us to perceive it differently.

Dr. Cliff Lauson, director of exhibitions at Somerset House and co-curator of Infinite Bodies, recalled that McGregor’s polymathic approach was one of the defining reasons he wanted to collaborate. “It was several years ago, and that impulse that I felt about Wayne’s work, and what might make for an interesting exhibition at Somerset House, now has been so gratifying to see after so many years,” he said. “Interesting ideas come out during conversation and collaboration. It doesn’t help anybody to be working in silos.”

Physical Intelligence

When Wayne McGregor took the microphone at the exhibition launch, he spoke with warmth and generosity about the complex process of translating live choreographic practice into a gallery environment. “I want to say thank you to Somerset House and to Cliff and Jonathan and to all my team at Studio Wayne McGregor because it’s been a huge challenge,” he said. “I’m used to making. I’ve made something like 200 pieces at this stage. That’s been a huge passion for my kind of choreographic practice. But I’ve always had a parallel practice, and that parallel practice has been in research and testing ideas around the notion of physical thinking.”

At the heart of Infinite Bodies is McGregor’s long-standing interest in what he describes as physical intelligence. The exhibition invites visitors to reconsider their own physical awareness and their relationship to technology. “Technology is not outside of ourselves,” McGregor said. “The body is the most technologically advanced thing we’re looking at. I’ve not seen any form of technology that surpasses the living body, its ability to create, to respond, to be spontaneous, to feel.”

This tension between embodiment and computation, between instinct and algorithm, runs through OMNI in particular. It reframes dance through light and motion while still preserving the physical presence and emotional weight of performance.


From Film to Immersive Space

For ILM, OMNI provided a rare opportunity to apply cinematic tools to an environment that does not behave like cinema. “I’ve spent my career digitising the real world, real people, real environments, and turning that into assets that we can use in CG,” said Matt Rank, ILM’s visual effects supervisor on the project. “My role has taken me full circle. Now we’re taking computer graphic content and displaying it back in the real world.”

Rank’s work sits at the intersection between traditional visual effects, virtual production, and emerging immersive media. “What we found from ABBA Voyage and with Infinite Bodies is that shared experience, people coming together is what matters,” he said. “That’s where we’re pushing our content and technology, a shared, meditative experience that people can have together.”

Working with McGregor offered a fundamentally different creative starting point. “We’re used to working with studios and directors who have very specific briefs on where they want the creative to be led. Working with Wayne was a blank canvas. We spent a lot of time understanding what this piece should be, and maybe more time understanding what it shouldn’t be,” Rank said. “He didn’t necessarily want to see the form or the shape of his performers, but how their movement reflected through the body. That became the first aesthetic for the piece. From his feedback, we then started work with our own art department, bringing these ideas and concepts to life to present back to Wayne in more unique visual forms that would set the tone of the final piece.”

Rank described OMNI as “… an abstract piece, but it is quality, it feels photographic. Even though there’s real-world elements about it, it kind of sits within your psyche. It doesn’t make you feel uncomfortable either.” He also pointed to natural phenomena as key influences. “There is a nod to how birds flock and dance in the sky, and how plankton can emit light when it is disturbed by movement, and these carried through into the final piece, represented by the murmurations seen across the two looks.”

ILM Beyond the Frame

Reflecting on ILM’s wider creative mission, Rank said, “We’re all storytellers. We love collaborating with directors, but doing that in new forms and new mediums is incredibly exciting. Our world is becoming ever more interconnected, and we’re keen to explore what we can do in those spaces. With a smaller team, you can be really agile and get to results faster.”

For Alliu, projects like OMNI underline the breadth of what ILM represents today. “It showcases to people that want to work with companies like ILM that there are so many different kinds of projects we create,” she said. “People do not just think of ILM as one kind of thing anymore. They start to see that there is a much broader range of work.”


Seeing the completed exhibition in sequence gave Alliu a new perspective on how OMNI functions within the wider narrative of Infinite Bodies. “Coming straight out of seeing the work that we did and then going into everything else, it suddenly felt like being grounded back in the physical space again,” she said. “It was really nice seeing all the different interpretations of movement and communication, the role digital plays in that, and the fact that it still very much needs the physical. It still very much needs the body, the person, and the movement.”


She also sees that visibility as vital to the industry’s future. “It gives people the understanding and the option to think, ‘This is something I could do,’ and they start to think about careers in this space,” she said. “We still need that human input. You still need that eye, that instinct, and that creativity behind it all.

“Art and science have always co-existed on many levels, and technology is also a part of that, especially as a lens by which we’re able to understand, explore and conceptualise both,” Alliu added. 


As Reekie observed, Somerset House’s role is to provide artists with the space to imagine new futures. “It’s up to the artists to tell us what the future might look like because they’re the best people to help us navigate it,” he said.

Infinite Bodies offers one such vision. A space where choreography, light, and digital craft meet. A reminder that innovation is not just about machines, but about the bodies, instincts, and creative impulses that continue to drive them. ILM continues to explore how those instincts and tools translate beyond film and into shared physical spaces.

Concept art by Bimpe Alliu (Credit: ILM).

Jamie Benning is a filmmaker, author, and podcaster with a lifelong passion for sci-fi and fantasy cinema. He hosts The Filmumentaries Podcast, featuring twice-monthly interviews with behind-the-scenes artists. Visit Filmumentaries.com or find him on X (@jamieswb) and @filmumentaries on Threads, Instagram, Facebook, and YouTube.

ILM.com is showcasing artwork specially chosen by members of the ILM Art Department. In this installment of a continuing series, four artists from the San Francisco, Vancouver, and London studios share insights about their work on the 2025 Netflix production, The Eternaut.

Supervising Art Director Fred Palacio

During pre-production, one of the key ideas was to show how the characters were trapped in the city, isolated from the outside world. The snow is the first lethal weapon, the one that killed most of the population, but a barricade along the Puente Saavedra hints that something else is happening, something more extraordinary. This keyframe shows the character isolated against all the odds: the snow, the loneliness, the urban chaos. 

One of the most important things in working on the project was to see through the eyes of the people who live there when this is happening. The client and the novelist were from Argentina, where the story takes place, so the first step toward authenticity was to become immersed in the Argentinian world: diving into memories of the city from my own visit, merging them with an exact location, and walking through street views online. Finally came translating the situation into a frame, isolating the character but also using the bridge to undermine his power; the point of view and camera position are decisive in selling the character’s situation. 

The giant wall, made of all sorts of human-made things, suggests the alien presence and demands resilience to overcome; even the sign on the bridge is a message to the viewer, translating to “everything has a prelude.” The image needs to reflect how an ordinary man in an ordinary world resists all these extraordinary events and obstacles. The green bag signals forward action, the red light warns against going back, and the perspective of the bridge points back to the car and another figure, hinting at cohesion…all these elements say something about the story, but also about the character’s attitude toward those obstacles.

Art Director Amy Beth Christenson

This is an early study for a specific neighborhood in Buenos Aires, just after the snowfall, where Juan is discovering the aftermath. I worked to position cars and people so that it conveyed a sense that what happened was sudden and unexpected. I researched the original comic quite a bit, and also did a lot of research to make sure that the specific neighborhood was accurate so that it felt very real.

I like the sense of a rosy pre-dawn, almost peacefulness to the scene, which is a contrast to what has happened. Looking at the day-to-day life images of people, and thinking about what it would look like if they were taken mid-stride, gave me ideas, like a woman walking her dog, people carrying groceries, etc., which helped the images feel more eerie.

I was on the project just for the very early initial concepts, specific to the immediate aftermath of the snowfall, and what those moments might look and feel like, and didn’t iterate beyond these. At these early stages, I wanted to get ideas for lighting and composition down early, and worry about details later.

Art Director Tyler Scarlet

This piece depicts alien creatures that are about two feet tall and can work in a pack. The client really liked the look of microscopic dust mites, so I used that as a starting point and expanded from there. They responded to different elements from my first round, so I worked on combining the hard-shelled version with one that looked similar to a dust mite. The next step was to show it in action. I explored concepts of it attacking people, wrapping bodies in its web, and dragging them away. They are also scavengers, so I did an illustration showing that as well. 

For the first pass I wanted to give the client a range of different types of creatures while still fitting the brief of a six-limbed dust mite-like creature. One version was very close to a realistic, large dust mite, another version had a hard shell, jointed legs and claws at the end of its limbs to grip onto its prey, and the third version was more aerodynamic and looked like it was built to move fast. I like how it looks when it’s coiling its web around its victim! [laughs]

This client was one of my favorites I have worked with. They came to every meeting with such excitement, passion, and appreciation. 

See the complete gallery of concept art from The Eternaut here on ILM.com.

Learn more about the ILM Art Department.

Frankenstein and The Lost Bus are recognized while ILM also contributes to two other nominated films.

The BAFTA Film Awards announced their 2026 nominees today, and artists from Industrial Light & Magic have earned two nominations in Outstanding Visual Effects for their work on Frankenstein and The Lost Bus.

ILM visual effects supervisor Ivan Busquets joins fellow visual effects supervisors Dennis Berardi and Ayo Burgess and model effects supervisor José Granell for director Guillermo del Toro’s Frankenstein.

And for The Lost Bus from director Paul Greengrass, ILM visual effects supervisor David Zaretti joins production visual effects supervisor Charlie Noble and special effects coordinator Brandon K. McLaughlin.

Additionally, ILM contributed to other Outstanding Visual Effects nominees Avatar: Fire and Ash and F1.

Congratulations to our ILM nominees! The 2026 BAFTA Film Awards will be held on February 22 in London. Read the full list of nominations here.

Read more about The Lost Bus here on ILM.com:

Rendering a Rescue: ILM’s Dave Zaretti on the Visual Effects of ‘The Lost Bus’

ILM artists for Jurassic World Rebirth and The Lost Bus earn nominations.

Nominations for the 98th Oscars were announced today in Los Angeles, and Industrial Light & Magic contributed to all five nominees in the Best Visual Effects category: Avatar: Fire and Ash, F1, Jurassic World Rebirth, The Lost Bus, and Sinners.

Artists from Industrial Light & Magic earned two nominations in the category.

For Jurassic World Rebirth, our ILM nominees include production visual effects supervisor David Vickery, animation supervisor Stephen Aplin, and ILM visual effects supervisor Charmaine Chan, along with special effects supervisor Neil Corbould.

“Still wrapping my head around the Oscar nomination for visual effects on Jurassic World Rebirth,” Vickery tells ILM.com. “I’m immensely proud of the work. Thank you to everyone who voted for us, but above all – well done to everyone who poured their hearts, souls, and creativity into this special project. You all deserve this!!!”

Aplin adds that “this nomination is such a fantastic reflection on the hard work and dedication the entire ILM team has contributed to the visual effects of Jurassic World Rebirth. Personally, my passion for this craft was jump-started after watching the original Jurassic Park when it first came out in theaters, so getting to play in that world and receive such a fabulous honor is a dream come true. Thank you, and congratulations to all nominated in this category.”

“To be nominated for our visual effects on Jurassic World Rebirth is an absolute honor,” says Chan. “Like so many in this industry, the original Jurassic Park was the film that made me believe the impossible is possible. That motto rang true across our global ILM teams as they passionately created stunning imagery to bring Gareth Edwards’ vision to life. ​I am incredibly proud of what we accomplished together and thank the Academy for this recognition.”

The Jurassic World Rebirth nominees at the recent visual effects bake-off event in Los Angeles (Credit: ILM).

For The Lost Bus, ILM visual effects supervisor David Zaretti has been nominated along with production visual effects supervisor Charlie Noble, beloFX visual effects supervisor Russell Bowen, and special effects coordinator Brandon K. McLaughlin.

“Wow! What an honor to be recognised by the Academy for the work we did retelling the story of Paradise,” Zaretti tells ILM.com. “I’m so proud of the whole team for their hard work and creativity. It was a great experience to work with Paul Greengrass, helping him do what he does best. Given the quality of the visual effects across the board this year, it feels extra special to make it this far.”

The Lost Bus nominees at the recent visual effects bake-off event in Los Angeles (Credit: ILM).

Congratulations to all of our ILM teams for their work on this year’s nominated films, and best of luck to our ILM nominees!

Read more about Jurassic World Rebirth and The Lost Bus here on ILM.com:

“What Do We Have To Do To Make it an 11 out of 10?”: Visual Effects Supervisor David Vickery on ‘Jurassic World Rebirth’

Rendering a Rescue: ILM’s Dave Zaretti on the Visual Effects of ‘The Lost Bus’


Cutting-edge digital artistry, modern inspiration, and retro callbacks help launch the latest Tron adventure from the cyber world into reality.

By Clayton Sandell

Light Cycles, Super Recognizers, and Programs roar off the Grid and into the real world for the first time in Tron: Ares (2025), a four-decades-in-the-making moment that challenged Industrial Light & Magic to deploy a full creative arsenal to make the impossible real.

ILM’s David Seager served as the production visual effects supervisor for the third entry in a franchise that began with the original 1982 film Tron and continued with 2010’s Tron: Legacy. The first Tron movie follows the adventures of software engineer Kevin Flynn (Jeff Bridges), who is trapped inside a neon digital realm where computer programs appear as human avatars.

Then-nine-year-old Seager became an instant Tron devotee. “I was very excited when this opportunity came along, so I definitely jumped at it,” he tells ILM.com. “Tron, for me, was right up there with the Star Wars franchise and many of those types of films.”

Directed by Joachim Rønning, Tron: Ares stars Jared Leto as the titular hero, a sophisticated Master Control Program reporting to Dillinger Systems executive Julian Dillinger (Evan Peters). Ares is billed as the ultimate soldier and the first artificial intelligence being – or construct – to appear in the real world. But outside of the Grid, Ares can only live for 29 minutes, sending Dillinger and rival company ENCOM on a quest to find Flynn’s long-lost Permanence Code that will extend a construct’s lifespan. When ENCOM CEO Eve Kim (Greta Lee) discovers the code first, Dillinger dispatches Ares and his second-in-command, Athena (Jodie Turner-Smith), to track her down and steal it.

Inspired by Modern Tech

Inside a massive Dillinger complex hangar, Ares and Athena – along with their Light Cycles – are brought into physical form by an array of rapid-firing red particle lasers attached to robotic arms.

“Using lasers to get to and from the Grid has been part of Tron since the beginning,” Seager explains. “So we knew there was going to be a laser component to it. But also, I thought it was a great opportunity to show that the Dillinger company isn’t making games anymore. They’re more a part of the military-industrial complex, so it was always important that it had an industrial feel.”

During preproduction, design inspiration came from a 3D printer purring away in the art department. “It was in one of our meetings where we just happened to look over, and there was a print in progress,” recalls Seager. “And it had the support structure surrounding it, this kind of ‘jig’ structure, as we called it.”

Incorporating the concept of 3D printing helped ground the sequence in a visual language people are familiar with, Seager explains. There’s even an added storytelling flourish when the mass of rough, excess jig pieces builds up and suddenly collapses, exposing the object underneath. “We wanted it to feel messy, and then it just falls away, and there’s the creation. It was one of those happy accidents,” Seager says. “It became a really great reveal.”

Concept art by Jason Horley (Credit: ILM & Disney).
(Credit: ILM & Disney).

Cycles and Walls of Light

Riding their Light Cycles at high speed through nighttime city streets and across bridges, Ares and Athena pursue Eve in a sequence largely shot on location in Vancouver, Canada. “It became very evident that we all wanted to go shoot as much as possible on location,” Seager recalls. “You get a million little things that happen organically.”

On set, modified Harley Davidson electric motorcycles stood in as proxies for the Light Cycles, outfitted with practical lighting to cast realistic reflections and glow onto the wet pavement. “It became our job in visual effects to go in and replace the proxies that we created for the Light Cycles,” says Seager. “We had to replace 100% of them.” The special effects department also built hero versions of the Light Cycles for shooting close-ups of the actors, either against a blue screen or an LED screen.

During the chase, the Light Cycles emit a signature Tron element in their wake: lethal ribbons of reddish, semi-transparent light. The challenge, Seager explains, was making the light walls work visually in a non-Grid environment.

“That was more traditional look development work – adjusting the amount of refraction, reflection, and brightness and those types of things,” according to Seager. “There’s a little bit of heat distortion. We want it to feel hot. And because it was very easy for it to feel glassy, and there’s a certain brittleness that comes with glass, you’re like, ‘Oh, we don’t want that.’”

(Credit: Disney).

In one of the film’s most memorable shots, a light wall slices a police cruiser into perfect halves, an effect that uses a combination of practical and digital techniques.

“That was a real car,” Seager reveals. “The special effects team was like, ‘Oh, we could build this!’ So they took a car and chopped it right down the middle lengthwise. It was a repeatable stunt, and there was limited steering they could do after the split. We ended up having to shoot it a couple of times, but the vast majority of what you see is the stunt that we shot. And then we have to go in and make the edges seem glowing hot – like it just got cut – and add steam and those types of things coming out.

“Light Cycles are to Tron like lightsabers are to Star Wars,” Seager adds with a smile. “I’m so proud of what we achieved in the Light Cycle chase.”

One of Seager’s favorite moments in the sequence is a Light Cycle sideways slide that pays homage to an iconic shot in the landmark 1988 anime action film, Akira. “I’m a lifelong anime fan and fell in love with Katsuhiro Otomo’s manga of Akira and was one of the first kids in town to obtain a VHS copy of the anime,” Seager says. “Needless to say, it is very rare to be able to work on a project that combines two influential films from your childhood.”

A climactic street battle between Ares’s and Athena’s armed Dillinger sentries features a weapon that proved to be one of the more challenging effects to pull off: the Light Staff.

“It’s the fun new weapon that was introduced in our film. The idea is a staff that you could fight with, and the ends emit a white ribbon four or five inches wide,” Seager explains. “We came up with the idea that you’d have this almost dial-up lifespan, so the light ribbon could last a second, or two seconds, or five. We knew Joachim always wanted them as long as possible, but there were times when they had to go away.”

The Light Staff fight required complex coordination between the actors and stunt performers on set, but the frenetic pace of the action sometimes created unavoidable visual conflicts. “I’d be sitting there going, ‘Wait a second, if they swipe like this and then they run forward, their head just hit the thing,’” remembers Seager. “You have to almost think in terms of, ‘Oh, I need to duck under this.’ I think everyone did a great job of trying to choreograph the fights.

“We got as close as we could during shooting,” Seager continues. “And then in postproduction, we went in there and started tracking the staff and emitting the beam from it. We just started going, ‘Oh, there’s a problem there.’ And you just have to go try other things.”

Seager says some fun and unexpected ideas also popped up during shooting. “The stunt team came up with the idea of characters making a light ribbon and using it to jump off of,” he says. “So there are cool moments where Ares basically creates terrain for himself.”

Another visual quandary came with the Super Recognizer, a massive flying security transport that Athena pilots into the city as she searches for Eve. “The design work was beautiful. I think our biggest challenge was how big it was,” Seager says. “We had our LiDAR and survey data of real Vancouver streets, and when we put those two together for the first time, we’re like, ‘Okay, the Recognizer doesn’t fit into any street.’

“There’s a fair amount of digital surgery where we had to kind of wipe the city away because if you make the Recognizer too small, the threat goes away,” continues Seager. “So we were trying to find that balance. But the main work we did there was trying to find ways to make it fit.”

(Credit: Disney).

Enter the Grid(s)

Much of the look of the ENCOM and Dillinger Grids is inspired by designs established in Tron: Legacy by production designer Darren Gilford, who returned for Ares.

“The Dillinger Grid – the red one – definitely followed the aesthetic of Tron: Legacy with a dark, shiny, almost wet look to it. It’s atmospheric, and it has a stormy feeling,” says Seager. “Darren always talked to me about that Grid being inspired by circuit boards.”

For both Grids, the production built a combination of complete and partial sets on a Vancouver soundstage. “Most of the big sets that we built were for the Dillinger Grid,” Seager says. “There were two major red rooms. One we called the ‘extraction’ room, which is where Eve is printed into the Grid and where Ares later escapes. And then there was what we call the ‘regeneration’ room.”

Seager credits the production art department for crafting beautiful, practical sets that, in many cases, only needed minimal digital enhancement, like adding ceilings or extending walls. “Early on in the show, I took some pictures as we were building the set and doing walkthroughs, and I sent them to one of my fellow ILM supervisors because they were very pretty. And they were like, ‘Oh, that’s great looking concept art.’ I was like, ‘That’s not concept art!’” Seager laughs.

For a sequence in which Ares and his team infiltrate the blue-tinted ENCOM Grid, ILM took on extensive digital world-building. “That was a little more traditional blue screen work,” says Seager. “We built minimal floors and then expanded from there because the characters had to cover a great distance. We built the staircase that we could shoot against, but in post, we did the rest of the environments.”

Seager explains that the ENCOM Grid also offered a chance to break from a traditional nighttime look to portray a more daylight setting. “We just wanted it to feel thematically brighter,” he says. “It’s the sunny, good-guy Grid. It’s still overcast, but it’s not quite darkness. That has its own challenges because light lines look great when it’s dark out, but if you turn the lights up and also have a competing bright scene, now you’re trying to make the bright light lines work.”

(Credit: Disney).

Still hunting for the Permanence Code, Ares is transported inside Flynn’s original server, providing audiences a nostalgic visit to the relatively primitive digital landscapes of the 1982 classic. “It was a lot of fun, and I actually consider it one of the more challenging developments on the show,” explains Seager. “ ‘Challenging’ usually means ‘big, big, big.’ And this one was challenging going the other way. It’s stripping away, it’s simplified.”

Executing the retro look of the Flynn Grid fell to the team at Distillery Visual Effects in Vancouver, which worked to incorporate updated versions of the distinctive visual artifacts from the 1982 film, like flickering faces, desaturated skin tones, and backgrounds marked by noticeable “frozen grain.”

“In visual effects, if you have frozen grain, your shot is broken,” Seager notes. “In our shots, we intentionally added frozen grain to the background to try to make it look that way. The light suits built by WETA Workshop were immaculate, but we actually made them kind of flicker and the edges kind of wobble because we wanted to have a little bit of a hand-rotoscoped feel.”

Seager says the Flynn Grid is loaded with Easter eggs – including an appearance by the binary guide known as Bit – that he hopes fans will pick up on. One of his favorite throwbacks can be seen as Ares takes control of a classic yellow Light Cycle and follows Bit off the Game Grid through the same jagged hole used by Flynn and his companions to make an escape back in 1982.

“We went in, and we looked at that exact break pattern. True fans hopefully can see that it’s the one they made 40 years ago,” Seager says.

(Credit: ILM & Disney).

Opening the Complete ILM Toolbox

ILM Stagecraft’s LED volume technology proved invaluable for scenes set in very different exterior and interior environments. Assembled on a soundstage at Mammoth Studios near Vancouver, the volume completed the snowy landscape around a remote mountain station in Alaska, where Eve and her partner, Seth Flores (Arturo Castro), use the Permanence Code to assemble an orange tree in the real world.

“There were also two offices. Dillinger’s office, which overlooks the transfer bay, was built maybe 16 feet up, then we hung LED screens outside the windows. And the scene out there was a fully realized 3D version of the transfer bay,” Seager says. “The ENCOM office set also had an LED cityscape when you looked out the windows.”

The production employed the same volume ILM used for season one of the Disney+ series Percy Jackson and the Olympians (2023-present). Scenes inside the Grid featuring Ares speaking with Julian Dillinger’s digital visage utilized MEDUSA, the Academy Award-winning facial capture system developed by ILM and Disney Research Studios. For action scenes, ILM FaceSwap tools were used extensively to put an actor’s likeness onto a stunt double.

Director Joachim Rønning and actor Jared Leto (Ares) on the set (Credit: Disney).

The Home Team Advantage

Work on Tron: Ares was primarily divided between ILM’s Vancouver and Sydney studios, with additional contributions from Distillery Visual Effects, Lola Visual Effects, Image Engine, Prologue, GMUNK, Imaginary Forces, and OPSIS. Seager, who lives in Vancouver, says having Tron: Ares shoot in his home city provided a rare opportunity for the ILM team to observe the production process up close.

“For the artists, it’s huge,” Seager insists, “because it’s really hard to get experience on set for up-and-coming talent. We had a great relationship with the production team, so I was able to bring a lot of the folks out to get their first-ever on-set exposure. We tried to take advantage of that as much as possible.”

Seager adds one more Vancouver factoid: When an F-35 fighter plane slams into the Super Recognizer, the massive craft crash-lands in front of a building that in real life is only half a block from ILM’s Vancouver studio.

(Credit: Disney).

Now You See Him, Now You Don’t

Tron: Ares contains just over 2,100 visual effects shots, but Seager says there’s one illusion the audience will never notice. Early in the film, Dillinger introduces Ares to a group of shareholders. Appearing for the first time inside a Dillinger Systems Amphibious Rapid Response Tank, or DART, he wears a black Light Suit with glowing red accents and a highly reflective helmet hiding his face.

But when the scene was first shot, Jared Leto was not wearing a helmet at all.

“An idea came in postproduction to have him be this faceless automaton that reveals at the right moment,” Seager says, explaining that it was up to digital artists to craft a highly reflective CG helmet from scratch, matching it perfectly with the Light Suit and practical environment. Adding to the challenge: Leto had long hair that needed to be painted out of every shot.

“I don’t think people will ever know the work we did,” remarks Seager. “The camera is inches from his face in some of the shots where we had to track the helmet in there. In that entire scene, those helmets are all digitally added shot-by-shot.”

(Credit: ILM & Disney).

End of Line

Seager has much praise for the hundreds of artists and collaborators who made working in the Tron universe such a rewarding challenge, and especially director Joachim Rønning.

“Paramount to Joachim’s vision was that he never wanted this to feel bigger than life. It’s really easy to toss a lot of gimmicks at something set in the real world, and all of a sudden it starts to not feel as grounded. So it was trying to find that sweet spot where it felt like you could believe you’re watching from the street corner.”

Instead of watching from a distance, however, Seager found himself at the creative center of the Tron universe, drawing on 40-plus years of fandom to help bring the latest chapter to the big screen. “It was a dream come true,” he says.

Read more about Tron: Ares here on ILM.com:

ILM’s Jeff Capogreco and Jhon Alvarado Take Us Into the Grid of ‘Tron: Ares’

Inside the ILM Art Department: ‘Tron: Ares’

Clayton Sandell is a Star Wars author and enthusiast, Celebration stage host, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell), Bluesky (@claytonsandell.com), or X (@Clayton_Sandell).

As Kathleen Kennedy steps down from Lucasfilm leadership to return to producing, Dave Filoni will lead the studio as President and Chief Creative Officer alongside Lynwen Brennan as Co-President.

Lucasfilm announced today that after 14 years of leading the studio, President Kathleen Kennedy is stepping down from her role. Kennedy will return to full-time producing, including the studio’s upcoming feature films The Mandalorian and Grogu and Star Wars: Starfighter.

Dave Filoni, who worked closely with creator George Lucas to build the Lucasfilm animation department on Star Wars: The Clone Wars and helped launch Star Wars live-action series alongside Jon Favreau on The Mandalorian, will take on creative leadership of the company as President and Chief Creative Officer, and Lynwen Brennan will serve as Co-President.

Their close collaboration and more than 30 years of combined senior executive experience will carry Lucasfilm into its next chapter of storytelling, with a strong foundation of creative vision and operational leadership guiding the studio forward.

To read the full announcement, visit StarWars.com.

Visual effects supervisor Eric Leven takes us behind the scenes of the high-speed film from director Joseph Kosinski.

By Jamie Benning

There is a moment early in F1: The Movie (2025) when the film quietly makes a promise to its audience. Before we have settled into the present-day story, and before we have learned the rhythms of the modern races, we are pulled back into the past, into the memory of a catastrophic crash that defines Sonny Hayes (Brad Pitt) long before we meet him. It is a short sequence, but it carries the same burden as the first dinosaur reveal in Jurassic Park (1993). If the audience does not believe this moment, everything that follows becomes harder to land.

That opening crash is not simply exposition. It is the film’s tonal contract, shaped quietly by Industrial Light & Magic, with more than a little help from the real-life, near-fatal crash of racing driver Martin Donnelly at Jerez in 1990.

ILM visual effects supervisor Eric Leven described the challenge in clear terms. “Motorsport fans have watched countless hours of real racing footage, so they instinctively know when something feels wrong. A film, however, cannot simply document reality. It has to reshape it into something emotional and cinematic. Accuracy alone is never enough.”

I have worked in Formula One television production for more than twenty-five seasons, and when the Donnelly sequence began as I watched the movie, I recognized the real-world imagery within a second. Almost immediately, that recognition dissolved, and I found myself inside Sonny’s memories. I wanted to understand how ILM helped achieve that transition and set the tone for the entire movie.

(Credit: Apple & ILM).

Dreaming in VHS

The conceit of the opening sequence is simple and effective. Sonny dreams in VHS. Editorial had mocked up an early version of the look, but everyone knew how unforgiving that format could be.

“Everyone knows exactly what real VHS looks like,” says Leven. “And if it is just a tiny bit off, you can tell that it was done in post or that it is not real VHS.”

Rather than rely on digital simulations, ILM turned to genuine analogue artefacts. Leven had digitized old family VHS tapes, complete with dropouts, noise, and tracking errors. Those became the foundation of the sequence.

“We were able to lift glitches from real VHS tapes,” Leven explains. “Our compositing supervisor, Heath Kranak, put that material together and mimicked the rest of the VHS look with the color desaturation and low fidelity and it matched perfectly. It was a really, really fun sequence to work on.”

The result does not feel stylised. It feels remembered and slightly damaged. The fragility it imparts is central to the emotional impact of the moment.

(Credit: Apple & ILM).

Rebuilding History, Donnelly, Senna, and the 1990s

Texture is only one part of the illusion. Many of the elements that appear in the Sonny Hayes crash exist because ILM reconstructed them digitally. The sequence blends archive racing footage of Martin Donnelly with new photography shot at the F1 legacy circuit Brands Hatch. Crucially, the archive footage was not something to be polished away. It was the target.

“The archive footage was the target look we were going for,” Leven says. “That became our roadmap for what the other footage needed to look like.”

The new material had to bend toward the old. Stand-in cars did not match the shapes and proportions of early 1990s Formula One vehicles, so ILM made significant changes. “To me it looked like a Formula One car from the 1990s,” Leven notes. “But people said, no, the air scoop is different, and the tires are a little bit fatter. So we ended up replacing Senna’s car in its entirety.”

Branding needed the same attention. Logos removed on set were later reinstated for reasons of authenticity. “At that time they had Marlboro advertising,” Leven recalls. “So we added those logos onto Senna’s car and on some of the billboards.”

(Credit: Apple & ILM).

Playing With Recognition

For viewers who know the real Martin Donnelly crash at Jerez in 1990, there is an immediate flicker of recognition when the sequence begins. The angles, trackside details, and violence of the moment feel unmistakably familiar. Yet within seconds, that certainty slips.

The yellow car remains, but the driver is no longer Donnelly. The incident has been reframed as Sonny Hayes’s defining memory, and from that point on, the sequence belongs to the character rather than history. ILM is not inserting new material into archival reference. It is reconstructing a memory, taking an incident that fans may hold vividly in their minds and reshaping it so the audience feels both recognition and unease.

That approach extends beyond the car itself. Although the sequence was shot at Brands Hatch, ILM removed contemporary details, replaced the environment, added period-appropriate crowds, and regraded the landscape to resemble the Spanish circuit of the early 1990s. For seasoned Formula One fans, this is where the spell takes hold. They recognize the shape of what they are seeing, but begin to question its ownership.


The Crash That Is Not There

One of the most dramatic shots in the sequence, the car losing control and heading toward the guardrail, appears to be captured entirely in camera. In reality, it is almost fully digital.

“The crash was shot as the camera car was driving normally around a curve,” Leven says. “At a certain point, we took over. It basically became a full CG shot because we needed to replace the entire background and make it look like it was crashing into the guardrail.”

Once ILM replaced the environment, the car needed work too. “We needed to vibrate the wheels and make it look like he is going off the road,” Leven explains. “One of the wheels goes askew. Maybe 90 percent of the car was replaced.”

Even the driver’s hands on the steering wheel were animated later to make his struggle more believable. The goal was never pure spectacle. It was to make the audience feel the loss of control while subtly layering in the foundations of Sonny Hayes’s early racing story.

(Credit: Apple & ILM).

Daytona Nights

If the opening crash sequence sets the emotional foundation, the Daytona material sets the film’s visual rhythm. Once again, the work begins with practical filmmaking. “Joseph Kosinski was all about shooting as much as possible for real,” Leven says. “Let visual effects help only where you cannot get exactly what you want.”

Units captured a wealth of footage and reference, from headlight sweeps to subtle brake behavior. This allowed ILM to integrate story beats seamlessly into authentic environments.

When the narrative required Sonny to be surrounded by several cars as he exited the pits, ILM added those cars. When positional indicators on the sides of vehicles needed to reflect a different moment in the story, ILM updated them.

“It is great to be at ILM where you can say, ‘Absolutely, we can do that, and it will look seamless.’ We had a lot of fun adding all kinds of little details,” Leven says.

When the script asked for Sonny’s competitors to have mechanical failures, the same principle was followed. ILM kept real sparks and flame bars where possible, added smoke and oil when required, and extended practical effects only where the story demanded it.

In some cases, that meant going far beyond enhancement. Entire vehicles were replaced or rebuilt in visual effects when the practical footage could not deliver what the story required. Stand-in cars became different models. Background vehicles were added wholesale. In certain shots, only fragments of the original plate remained once the work was complete. It was not about spectacle, but precision. The cars had to behave correctly, brake at the right moment, shimmy under deceleration, and sit convincingly within the real racing environment.

(Credit: Apple & ILM).

Fire, Fabric, and Pixels

ILM’s work also appears in some of the film’s most intense moments, including the crash that engulfs Joshua Pearce (Damson Idris), the rival driver to Pitt’s Sonny Hayes. “There is a shot where you see his whole back,” Leven says. “On the day, it was just a bright white driving suit. We made it look burned and added ash.”

The same painstaking attention extended to details as small as tire markings, and most of it is invisible to the audience. But it matters because tire compound colors carry meaning. In Formula One, the color markings on a tire indicate the compound being used, which in turn signals grip level, durability, and race strategy. For fans who understand the language of racing, those colors instantly communicate how hard a driver can push and how vulnerable they might be at that moment. If the color is wrong, the story beat is wrong, even if it only occupies a couple of pixels on screen.

(Credit: Apple & ILM).

The Myth of “No Visual Effects”

There is a recent marketing trend proclaiming that certain movies use very little or even no visual effects. Leven finds this both complimentary and misleading. For him, visual effects are simply one of many tools that support the filmmaking process.

“Obviously, we are using visual effects, and obviously we try to make them as seamless as possible, and that is what makes it amazing,” Leven says. “Though I would not mind if people could talk about how scenes were shot for real, but also used visual effects, and actors, and props, and sets. It’s all part of the filmmaking process.

“It is great when people watch the movie without noticing any visual effects. Ideally, nothing takes them out of the moment,” he adds.

(Credit: Apple & ILM).

Why Filmmakers Come to ILM

Leven is clear about what ILM offers when filmmakers come to the studio. “We have such a rich history,” he says. “When filmmakers come to ILM, we want them to be comfortable knowing we share their vision. We are all filmmakers here, and we want to push the boundaries on every project to create incredible imagery.” 

The production of F1: The Movie was notably smooth from ILM’s perspective. “There were no problem shots,” Leven says. “It was just executing a plan.”

The work was shared between the San Francisco and Mumbai teams, with each location taking ownership of full shots from start to finish. This created a genuine around-the-clock workflow that supported the film’s editorial pace.

(Credit: Apple & ILM).

The Tone Is Already Locked In

By the time we leave Daytona, the film no longer needs to ask for the audience’s trust. The visual language has been established and proven. The speed feels credible. The danger feels earned. The emotional weight of the story is already in place.

That trust is built on work the audience will rarely notice. VHS glitches lifted from real tapes. Crowds added to empty grandstands. Tire markings adjusted by only a couple of pixels. A racing suit digitally burned to reflect the impact of a crash. Cars rebuilt so subtly that the original plate becomes almost invisible. All of it supports the story without ever interrupting it.

The tone, the thing that convinces us this world is real and worth investing in, is established quietly by artists whose success is measured not just by what the audience sees, but by what they never question.

(Credit: Apple & ILM).

Jamie Benning is a filmmaker, author, and podcaster with a lifelong passion for sci-fi and fantasy cinema. He hosts The Filmumentaries Podcast, featuring twice-monthly interviews with behind-the-scenes artists. Visit Filmumentaries.com or find him on X (@jamieswb) and @filmumentaries on Threads, Instagram, Facebook, and YouTube.

The 24th annual presentation by the Visual Effects Society will take place on February 25, 2026.

The Visual Effects Society announced the nominations for its 24th annual awards ceremony, and the creative teams from Industrial Light & Magic earned 16 in all. These include overall nominations for Jurassic World: Rebirth and The Lost Bus in Outstanding Visual Effects in a Photoreal Feature, Sinners in Outstanding Supporting Visual Effects in a Photoreal Feature, and Andor in Outstanding Visual Effects in a Photoreal Episode.

ILM’s work was celebrated across a wide range of categories, with ten individual productions recognized, including Andor, Avatar: Fire and Ash, Jurassic World: Rebirth, Lilo & Stitch, Severance, Sinners, Superman, The Lost Bus, Tron: Ares, and Wicked: For Good.

The VES Awards gala will be held on February 25, 2026 at the Beverly Hilton Hotel in Beverly Hills, California.

Congratulations to all of our ILM nominees!

Read more about these productions on ILM.com:

“Like Eating an Elephant One Bite at a Time”: TJ Falls and Mohen Leo on the Visual Effects of ‘Andor’ Season 2

“Let the Experts Be the Experts”: TJ Falls and Mohen Leo on the Visual Effects of ‘Andor’ Season 2

Assembling a Starfighter: Exploring ILM’s Role in Creating the TIE Avenger from ‘Andor’

“What Do We Have To Do To Make it an 11 out of 10?”: Visual Effects Supervisor David Vickery on ‘Jurassic World Rebirth’

Inside the ILM Art Department: ‘Lilo & Stitch’

The Invisible Visual Effects Secrets of ‘Severance’ with ILM’s Eric Leven

How ILM Helped James Gunn’s ‘Superman’ Soar with High-Flying Visual Effects

Inside the ILM Art Department: ‘Superman’

Rendering a Rescue: ILM’s Dave Zaretti on the Visual Effects of ‘The Lost Bus’

ILM’s Jeff Capogreco and Jhon Alvarado Take Us Into the Grid of ‘Tron: Ares’

Inside the ILM Art Department: ‘Tron: Ares’

Artists from ILM’s Sydney studio take us into the Grid to discuss their part in the cult classic’s latest chapter.

By Jay Stobie

(Credit: ILM & Disney).

Directed by Joachim Rønning, Disney’s Tron: Ares (2025) breaks the barrier between the physical and digital realms, as the Master Control Program known as Ares (Jared Leto) rebels against his creator, Julian Dillinger (Evan Peters), and seeks the Permanence Code that would allow him to achieve a lasting existence in the real world. Ares finds an ally in Dillinger’s corporate rival, Encom executive Eve Kim (Greta Lee), whose empathetic nature is a stark contrast to the ruthless disposition of the Dillinger Systems leader. From the Grid’s luminous avenues to their concrete counterparts in our physical reality, Tron: Ares brims with astonishing visual effects that support its characters on their tumultuous journeys.

With Industrial Light & Magic’s own David Seager serving as the production’s overall visual effects supervisor, ILM proved uniquely suited to spread the visual effects work across its global studio sites in Sydney and Vancouver. Operating from the Sydney studio, ILM visual effects supervisor Jeff Capogreco (Jurassic World [2015]; Avengers: Infinity War [2018]; The Mandalorian [2019-23]) and ILM animation supervisor Jhon Alvarado (Dungeons & Dragons: Honor Among Thieves [2023]; Alien: Romulus [2024]; Star Wars: Skeleton Crew [2024-25]) sat down with ILM.com to discuss their behind-the-scenes insights into all things Tron: Ares.

Tron’s Legacy

“On Tron: Ares, I was the visual effects supervisor for ILM’s Sydney studio, and my partner in crime was ILM associate visual effects supervisor, Alex Popescu,” Jeff Capogreco shares with ILM.com. “Early on, we made a conscious decision to have two technical camps going at one time, and each of us took on different roles and responsibilities. As the ‘grandpa’ supervisor, or ‘Papa Jeff,’ I worked with Alex to make sure things ran smoothly. The Sydney studio did just over 800 shots, which I believe was the biggest show to date that our studio had delivered, so it was a proud milestone.”

Capogreco’s love for the Tron franchise stretches back to its initial cinematic installment. “I’m old enough to say that Tron (1982) was one of the first movies I ever watched on VHS. My father was really into technology, and having a VHS player was a status symbol in Canada in the 1980s,” Capogreco beams. “I was probably six or seven and didn’t fully understand what was happening in the movie, but it was spellbinding. I was fascinated by Tron, and that led me to want to do visual effects. What Tron did to me, I hope Tron: Ares does to other people.”

ILM animation supervisor Jhon Alvarado’s own affinity for Tron took off with the release of its sequel, Tron: Legacy (2010). “For me, Tron hit home when Legacy came out, primarily because of the visuals and the soundtrack,” Alvarado remarks. “I loved the marriage of the two elements and how well they came together. When Tron: Ares came up for ILM, I knew I had to be on it. We’re here for movies that take us into these awesome worlds, and Legacy delivered with its sound, music, and visuals. When Tron comes in, you know it’s Tron. It has an aesthetic that you can’t get in any other film.”

(Credit: ILM & Disney).

An Animation Approach

Turning to his tenure on Tron: Ares, Alvarado supplies an overview of what his duties entailed. “As the ILM animation supervisor, my work on the show covered quite a bit. I was involved in shot design and creating shots for the film. We’d receive sequences with a rough previs of what the idea would be, but once the previs was put into the cut, I’d occasionally get blank frames with descriptions of what was supposed to happen. My job became interpreting them, taking the shots that the director had in mind, and Tron-ifying them while making it all feel believable and realistic. I had to think like a cinematographer so that, even when the shots were full CG, they appeared as if they were filmed for real.

“I developed several vehicle animations, as well,” Alvarado continues. “For the light skimmer chase sequence, I partnered with our ILM animators to figure out the style and movement of how these vehicles skim through the water. We examined speedboat references so we could get the water spray right.” Alvarado’s mission extended to the laser printers that enabled Grid-based vehicles to be constructed in the real world. “We designed how the laser moved and collaborated with the rigging and effects departments to choreograph the printing. We had rigs which let us play with how the laser looked, its size, and where it was pointing.”

Alvarado selects a brawl between Ares and an army of Encom’s own Programs as another pivotal scene for his animation team. “We created digital doubles during the fight sequence where you see Ares being attacked by soldiers. At ILM, we have tools that permit us to get motion capture data integrated quickly, so we choreographed the movement alongside our animators and internal mo-cap team. I believe the filmmakers originally had a stunt performer do it, but he was only fighting maybe two or three guys, and the rest was air-fighting.” Along with adding in flying discs that caused the deresolution of Ares’s opponents, Alvarado’s team had a hand in mapping out the timing of each character’s ‘derez.’

“Scene management was important on Tron: Ares, especially for the light walls,” Alvarado notes. “In terms of animation, my role varied. People often associate animation with characters and creatures, but we’re also figuring out the timing, the choreography, and the cinematography so that we deliver a nice flow of the sequence to the director. Once that’s established, we pass it on to Jeff and all the other departments to use as a base to build upon.”

(Credit: ILM & Disney).

Going Into the Grid

Designing the Dillinger Systems Grid proved to be a monumental task for ILM’s Sydney studio. “We had a fantastic production model given to us from the client side that was visualized in real time with a neat flythrough,” Capogreco states. “That provided the basis of what we called the Motherboard, or the main facility that overlooked the whole Dillinger Grid. Once we started getting the Grid into shots, we realized it wasn’t nearly big enough for what Joachim wanted. We did a test where our layout folks put the characters on a light skimmer and drove them from one end of the Grid to the next. It turned out to be about four times too small, because they’re moving so fast. We ended up increasing the original size by a factor of two-and-a-half. It’s massive!”

Seeking to furnish the Grid with “scale and purpose,” Capogreco researched ports and military facilities as a way to subtly infuse features that would cause audiences to recognize distinct areas, such as factories and military zones, within the Dillinger Grid.

ILM relied on its expertise to know when it had to dial back the intricacy to fit the film’s setting. “Tron has a design language that’s all about shape and form – it’s not about being very busy. What I mean by that is, when you’re down at the water level, there’s an incredible amount of complexity that’s put into each shot. Every shot, to a degree, had to be designed,” Capogreco outlines. “We needed to translate what people see in the real world into Tron-like assets. What does a buoy look like in Tron? What about a crane or a shipping container?”

While the appropriate level of detail was required for scale at the surface perspective, Capogreco relays that “when we were up high, it became too busy and noisy, with many moving dots. So, we actually built two Grids. One sufficed for the wider views and was a more simplified version. Our team went in and hid much of the noise, guaranteeing the audience would at least see the shapes and silhouettes. When you’re down at water level, the geometric resolution needs to increase. Every section of the environment was fairly unique and required a huge design process to support the narrative.”

(Credit: ILM & Disney).

The Face of a Foe

Another key Grid-related component deals with the holographic avatar that Julian Dillinger projects to communicate with Ares within the Dillinger Grid. “They filmed Evan Peters saying Julian’s lines, and then we used ILM’s MEDUSA tool to generate a perfect match of Evan’s performance to incorporate into our asset,” Alvarado remembers. “Joachim and Dave asked us to experiment,” adds Capogreco, “so we tried making the eyes brighter, having no eyes at all, playing with the wireframe, and evaluating how much we wanted to see of Evan versus seeing the avatar of Evan.”

After numerous creative back-and-forths, ILM opted to stick closer to the original concept art that inspired the face’s depiction. Speaking to the experimental process, Capogreco notes, “We eventually did a cross-blend between a human skull and Evan’s face, which was important to the director. We wanted the mouth, nose, and brows to be genuinely Evan, but we also wanted to bring the creepy, evil avatar side out.” Alvarado adds, “I analyzed Evan’s eyebrows so the avatar would resemble him. Evan has a very specific and strong brow shape, but because the hologram didn’t have any eyebrows, we had to manipulate the model to make the shape of the brow a little more like Evan’s.”

(Credit: ILM & Disney).

The Vaunted Vehicles

The production built a life-sized light cycle that helped define that particular vehicle’s mechanics, but other assets – such as the light skimmer, jump jet, and drones – left room for ILM artists to rely upon their imaginations. “We studied speedboat footage for the light skimmer in an effort to get its rooster tail to match the reference,” Capogreco explains. “Finding something you can anchor to is a terrific starting point for visual effects, and then it evolves. What can we do to make this weird? I welcomed ‘weird’ in dailies. The weirder the comments, the better. It was fun to experiment in that playground. If you pause on frames in the movie, you’ll see that the flares actually have texture – they have Grid patterns. Usually, when water is on a lens they’re circular, but one of our ideas played with the notion that – when the splash hits the lens – we actually refracted cubes onto the lens.”

“The light skimmer scene was heavily influenced by older Star Wars cinematography, especially the speeder bike sequence and the Millennium Falcon exiting the Death Star as it exploded,” Alvarado notes, in reference to Star Wars: Return of the Jedi (1983). “We used ILM’s work on Return of the Jedi as a reference and evaluated what made those scenes so special. You may notice that Skywalker Sound even added some of the speeder bike-esque sound effects to the light skimmer in some of the shots [laughs]. There’s a lot of ILM’s DNA in Tron.”

Although director Joachim Rønning wished to ground the drones that pursued the light skimmer in realism, he enjoyed Alvarado’s suggestion to have them rotate. “In the real world, the drones wouldn’t spin, but when Joachim saw them in the chase, he thought it looked great. It’s all about making everything fun for the audience.”

The jump jet flown by Athena (Jodie Turner-Smith) opened another creative door for the team. “The jump jet was never a practical build, so it wasn’t as defined as the light cycle, and we were left to figure out how it would take off and fly. At first, the pieces didn’t move in an aesthetically pleasing way – it was too mechanical and simple. The director wanted the movement to be realistic, but he trusted us to make it look cool. So, we utilized tools we had originally developed for the Transformers films that let us dynamically move pieces around. When Athena gets in the jump jet, she presses a button and the pieces are sort of Transformer-y,” Alvarado professes. “Joachim loved it. If you analyze it, the movement of the pieces doesn’t necessarily make sense, but there is an element of believability to it.”

(Credit: ILM & Disney).

Assisting Ares

From the Grid to the jump jets and beyond, ILM delivered a visual effects extravaganza to Tron: Ares. However, while the vehicle assets and other prominent action sequences tend to receive the bulk of the audience’s praise, a significant portion of ILM’s assignment revolved around visual effects that are virtually invisible in the final cut. One example of this centers on Ares’s appearance throughout the beginning of the film. “For the entire Dillinger board presentation, Jared Leto was filmed without a helmet on. We were tasked with creating the visor that went over top of him. The decision to add a helmet onto Ares served the story and led to his big reveal moment,” Capogreco conveys.

The actor’s hair turned out to be a slight obstacle for ILM’s Sydney studio, as the team sought ways to cover up the parts that would have stuck out from under a helmet. “Jared had long hair that came over his costume, so we augmented parts of his collar and costume to tuck that hair back in,” Capogreco asserts. Alvarado concurs, recounting, “It also aided us in the early battle sequence, because that too was meant to be Jared Leto with no helmet. When the choice was made to use a helmet, we were able to go full CG with the character. When Ares stands up and walks over, that’s Jared’s performance, but I don’t think people are able to tell that the other parts are full CG. Hopefully, they believe it’s a photographed suit!”

As with Julian Dillinger’s avatar, the filmmakers ensured that the Ares actor’s performance remained. “We wanted to keep Jared’s nuances in the close-up shots, but the digital double freed up Jhon for the wide and action shots,” Capogreco affirms.

(Credit: Disney).

A World of Work

As a global studio with multiple sites around the world, ILM spread its Tron: Ares responsibilities across its Sydney and Vancouver locations. “[ILM visual effects supervisor] Vincent Papaix and [ILM associate visual effects supervisor] Falk Boje at ILM’s Vancouver studio were wonderful collaborators, and Jhon often touched base with [ILM animation supervisor] Mike Beaulieu,” Capogreco underscores. “The wonderful thing about Tron: Ares is that the workflow was intended to be fairly autonomous, where we could do our own thing but still share our vehicle assets and digital doubles. Each studio took on different aspects of the film. It was nicely split out in a way that we could complement each other’s work.

“The deresolution – or the black sand effect – in the real world was an excellent example of that,” Capogreco continues. “Vancouver handled most of the end battle sequences, and they pioneered Ares’s time-out effect inside the Dillinger facility when we first see a Program turn to ash after 29 minutes. Subsequently, we handled the shots that were closer to Athena’s face by using the Vancouver team’s shot setups and adapting them to what they needed to be for the close-up.” The same held true for Sydney-designed assets that Vancouver required, such as the jump jet used for a shot where two guards land just before Athena and Ares square off near the end of the film.

Capogreco broadly summarizes the delegation between sites, stating, “Vancouver covered the bluer and sleeker Encom Grid, and they owned the light cycle chase and the majority of the city work. On the other hand, in terms of the real world, Sydney had the mountain test site and the printing of the orange tree, as well as a substantial number of shots coming and going from the Grids. ILM’s Sydney studio was also involved with everything around the Dillinger Grid, even the on-set segment where partial sets were built. To film, they had to poke lights into the set, so we’d go in to top up the shots by augmenting roofs, set extensions, and other small cosmetic fixes here and there.”

“The biggest elements that the two studios collaborated on directly were the light walls and the tools to generate them,” Alvarado adds. “Even when we were dealing with different assets, we all needed to have light walls for the vehicles, drones, and staff weapons.” Capogreco delivers an additional shout-out to ILM StageCraft, saying, “ILM StageCraft was used for shots where you’re up in Dillinger’s office peering out. Sections of the office were built and filmed, so that’s all in-camera work. That and the mountain test site, which they shot on an ILM StageCraft volume before we went in and did extensions on top of that.”

(Credit: ILM & Disney).

Taking Pride in the Project

Reflecting on the project as a whole, Capogreco notes, “Without hesitation, I would say that Tron: Ares was one of the hardest shows I’ve done. Not just because it is such a crazy technical achievement, but because it pushed our creativity to the limit. Joachim Rønning and Dave Seager weren’t necessarily aiming for formulated shots. They were open to ideas from Jhon, Alex, and I. That’s what made Tron: Ares extra special – we went on a journey together to find the best possible shots. Joachim had a good eye for motion, and he challenged the ILM teams in Sydney and Vancouver. He kept us honest about having designs be as realistic as possible without sacrificing the ‘weird.’ He emphasized that, when you’re in the Grid, it’s okay to allow some stuff that’s strange.”

When asked for his thoughts on the film, Alvarado replies, “Although Tron: Ares looks real and we were going for realism, there is a large amount of it that is computer graphics. For example, the city that Vancouver built for the light-cycle chase and the fighter-jet flight seems so realistic that I don’t think people realize a ton of it is computer graphics. It demonstrates the artistry at ILM, and the love and care we put into the details to make Ares as believable as possible. Every pixel, every camera, and every full-CG shot had to live in that world. Everyone gave it their best, and it looks fantastic. There were so many full-CG shots, and they are as good as the live-action photography. It’s a testament to the team.”

In a year where visual effects-heavy releases have been plentiful, Capogreco considers Tron: Ares to be a standout, concluding, “Tron: Ares is so different and out there. It’s not something you can say you’ve seen much of before, and I think that’s a feat for ILM to be proud of as a studio. I’ve been catching up on movies, and they all seem to have similar themes. That’s what’s quite cool about this one – it’s bananas. Celebrate it!”

Read more about the ILM Art Department’s work on Tron: Ares here on ILM.com.

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

ILM.com is showcasing artwork specially chosen by members of the ILM Art Department. In this installment of a continuing series, four artists from the San Francisco, Vancouver, London, and Sydney studios share insights about their work on the 2025 Disney production, Tron: Ares.

Senior Visual Effects Art Director Alex Jaeger

(Credit: Disney).

For this piece, the goal was to portray the Encom data hub fortress while also giving it a brighter, almost daytime look. The challenge was figuring out what the sky looked like, along with the overall details and forms in both the surrounding fortress and the distant background and “waterfalls,” etc.

I looked at some of the landscapes created for Tron: Legacy [2010] and some looks developed for Tron: Uprising [2012-13], and combined them with more complex details for the foreground structures and more simplified forms for the distant ones. One main goal was to develop a brighter look for the Encom Grid, creating a nicer, brighter contrast between it and the Dillinger Grid.

One specific detail that had not yet been worked out was how to portray the sky and clouds in this brighter world. I came up with several looks, including this one, where a grid in the sky intersects and mingles with the clouds, giving a sense of depth and providing a cool lighting effect in which the grid lines light up the clouds where they touch, while also suggesting movement up into the sky.

Supervising Art Director Jason Horley

(Credit: Disney).

The idea for this piece was to show a virtual vehicle being physically printed in the real world. When I was creating different versions of this image, there were several that came before this final concept, all balancing how dense the 3D printing support structures were and how not to reveal too much of the tank being printed. Because the tank is a large, heavy vehicle, I also had to make sure the support structures looked strong enough to hold it. Earlier concepts were much lighter, and you could see the tank through the supports, but it didn’t look like they would physically work.

Once I received the brief, I researched current 3D printing technology to see what the physical process is like, and then scaled it up and added the printing lasers to include an interesting light source that would help add an extra layer of visual interest. 

I found it interesting that with 3D printing, normally the print itself is the point of interest, but in this case the interest was the 3D-printed supports that normally get thrown away. That informed the design decisions. There is a beauty in those formations that usually gets overlooked.

Art Director Igor Staritsin

(Credit: Disney).

This was a design exploration for a ship in the style of the classic Tron world from the 1980s. My task was to keep it fairly simple and in the language of the rest of the environment. I did multiple versions of the ship design, exploring ideas and shapes that would best fit the brief. Besides developing the overall shape, I needed to design it with both a positive and a negative look in its color, as it would change shape while responding.

I am a firm believer that, more often than not, “less is more.” There is a term in traditional art that I like to use, “complex simplicity”: you don’t want to make something so simple that it looks boring, but you also don’t want to make it so complex that it looks chaotic. It is a balancing act to find the sweet spot. In this case, it was relatively easy to follow.

It was certainly interesting to play with and mix different low-polygon shapes, sometimes arriving at unexpected results. Nowadays there is sometimes a tendency to give too much control to that “play” aspect, with all the modern tech available. I believe that whether you come up with a good idea by accident or intentionally, good artistic judgement is necessary either way.

Art Director Cody Gramstad

(Credit: Disney).

This image was designed to illustrate the process of transformation from the past’s aesthetic to the modern world’s. This was achieved using a gradient across the scene, transitioning from screen left to right, which defines layers of change between the original Tron aesthetic and its contemporary counterpart.  

The process involved taking a modern office space available on location and simplifying all its assets to align with the technical limitations of the 1980s. Subsequently, those key assets were replaced and blended using Tron lines until they evolved into the high-fidelity assets of the real world. High-frequency cubes and digital detail were used to help smooth the transition in complexity between the two states.

The image’s development involved a two-stage iteration process. The initial stage focused on integrating 1980s Tron world design details. The challenge was to clearly introduce these features without compromising the overall compositional goals. For the Tron design, we concentrated on achieving a level of detail that felt appropriately simple yet visually distinct for the space. The second stage addressed the compositional design, which was initially too complex. The combined visual information from both visible worlds and the particle laser effects created an overly complicated scene. We resolved this by employing several techniques: gradating all secondary information out of the Grid space, reducing reflection clarity, and ensuring the lines retained sufficient value contrast for clear reading. Finally, we used this same gradation effect to soften the right side of the frame, allowing Ares to be seen as clearly as possible.

The most rewarding part of designing this image was managing the sheer chaos. With such a busy scene, the challenge lay in simplifying and ordering the elements, essentially solving a complex visual puzzle to bring structure to the multitude of details.

See the complete gallery of concept art from Tron: Ares here on ILM.com.

Learn more about the ILM Art Department.

Take an extensive deep dive into ILM’s creation of the original X-wing fighter miniatures for Star Wars: A New Hope.

By Jason Eaton

(Credit: ILM & Lucasfilm ©).

50 years ago, in an industrial park far, far away, an unassuming team of young artists and craftspeople created something that would capture the hearts and imaginations of an entire generation, becoming an icon for decades to come: the X-wing fighter. 

Seemingly against all odds, these men and women worked in what was initially a bare-bones environment: a warehouse with no air conditioning in a lackluster part of town, baking in the summer with little shade. There were hardly any trees nearby, just taco stands, aviation supply shops, and cinderblock buildings, with no labs, institutes, or studios in sight. And yet, with no fanfare, magic was made.

A roughly two-foot miniature made of acrylic, styrene, metal, and resin, the X-wing was the on-screen vessel that carried heroes to triumph. It moved impossibly before our eyes against surreal pinwheeling star fields, giving a sense of desperate energy. For myself, a child at the time, it would become a totem for my imagination, never really leaving my brain, to one day push my curiosity as an adult: What exactly were the models made from, and how did they come to be?

I collected various die-cast and plastic toys and built the model kit of the X-wing, captured by a design that was both rugged and sleek. Building that first model was not only an enjoyable pastime; it also began building skill sets and was a way to imagine myself as an ILM modelmaker. I found incredible joy in these impossibly realistic-looking ships that inadvertently created our modern Star Wars mythology and redefined a multigenerational, visual science fiction “style guide.”

The onset of the internet facilitated a fan community with a shared curiosity about ILM and its creations. I found my tribe through gatherings, parties, and, eventually, ILM artisans’ living rooms. My curiosity transformed into a personal mission to preserve and record the nuanced details behind these Star Wars miniatures – specifically the processes, dimensions, and stories that inspired me as an artist. The X-wing was not just a focus, but an artistic and intellectual obsession.

ILM modelmakers with a number of their creations. L to R: Joe Johnston, Paul Huston, Jon Erland, David Jones, Steve Gawley, Dave Beasley, Lorne Peterson (Credit: ILM & Lucasfilm ©).

Origins: Colin Cantwell

Memory alone is understandably an unreliable narrator, and over the last 50 years, the storied history of the X-wing has accumulated accounts that are apocryphal and often contradictory. By focusing on period photographs, paired with the recollections of people who were there for the production, we can establish a more comprehensive understanding of the X-wing’s history. I always value the time and generosity of those helping me on my journey. Sadly, many of the artists and incredible people who were part of the production’s journey are no longer with us, and my sense of urgency about this “mission” has only grown stronger. The photographic record is a window into this incredible time, when no one realized how they were changing the world of entertainment. Each photo contains little glimpses of magic being made in an extraordinarily unassuming environment.

Colin Cantwell was the first to put glue to model kit, creating a series of prototype designs very early in 1975 from George Lucas’s thumbnails and descriptions. It was fascinating to talk with Colin about the X-wing, as his mind saw objects and concepts in uniquely creative ways. His model was built from the body of a Jeb Allen “Praying Mantis” top fuel dragster model kit, with wings hinged at the center rear of the fuselage that would spring open in an “X” shape. He said he imagined it as a fast craft and as a Wild West gunslinger; the “X” was analogous to the quick draw of a pistol at high noon.

The Cantwell concept models were unsuitable for filming, however, as internal armatures were not included for support or mounting to production equipment, and the details were too fine for the blue screen compositing system being developed by ILM at the time.

Colin Cantwell’s prototype X-wing fighter model, built in 1975 (Credit: ILM & Lucasfilm ©).

Refining the X-wing Design

What was the launch point for the iconic ILM X-wing fighter design that we know today? Enter art director Joe Johnston, who worked with the modelmakers and technicians to bring what was internally dubbed “Project 504” to life, with the first unfinished “hero” example being completed in December 1975. A “hero” is a filming miniature with the best fit, finish, and all of the adornments needed for specific shots, as opposed to a “pyro,” a simpler construction built for pyrotechnic detonation. But before we talk about blowing up models, let’s back up a little to the early fall in that industrial park in Van Nuys.

A series of photographs, along with an internal document tracking the progress of each project with names assigned to tasks, shows that David Beasley carved the X-wing fuselage “buck” from wood. This first prototype fuselage appears to be made from a top and bottom shell, both in vacuum-formed styrene. The internal armature was machined from aluminum by Grant McCune, with David Grell assisting, and the wings were made from a combination of machined acrylic and sheet styrene. Motors and some, if not all, of the electrical wiring are in place.

Steve Gawley with the ILM-built X-wing prototype (Credit: ILM & Lucasfilm ©).

Notable landmarks of this prototype include the half-circle engine intakes we see in Ralph McQuarrie’s paintings from the time, as well as a much sleeker underside rear fuselage. The nose was a different shape, as this build followed the idea that the midline of the body would contour slightly upwards to meet the nose, as depicted in the McQuarrie painting. Most importantly for the design process, this initial series of parts shows that the back half of the lower fuselage is cut away, as the internal armature and motors would need more room to inhabit the shells. David Jones recalled this being a running design change. A careful examination of any hero X-wing will show that quite a bit of material needed to be cut from the sides of the fuselage as well, with long slots to allow the wing brackets to travel. This is very apparent when the wings are closed, but nearly invisible when these models have their wings open. It is a quirk of the hero design that is rarely noted or seen. 

The next series of photographs shows a “proper” hero build in progress, now with a resin top shell, fully enlarged vacuum-formed lower shell, full-circle-shaped engine intakes, and the various details and engines being glued in place.

The hero build in progress, with Lorne Peterson (left) and Joe Johnston. Johnston holds a model part against the fuselage, which will be trimmed down to make a side mount cover (Credit: ILM & Lucasfilm ©).

The Original Model: Blue 1

Three days before Christmas 1975, Lorne Peterson, Jon Erland, and Steve Gawley were working on wings and detailing, and Joe Johnston painted the model with Gawley. The model is Blue 1, the first X-wing ILM would build. It features a blue paint scheme instead of red, no cockpit or pilot figure, no droid, and no working electronics. But it is photographed in this state and shipped to production designer John Barry at Elstree Studios in England on December 26, to be used as reference for blueprinting and constructing the full-sized X-wing fighter set piece on a sound stage.

The first Hero X-wing built, Blue 1 (Credit: ILM & Lucasfilm ©).
Credit: (ILM & Lucasfilm ©).

The images above appear to show Blue 1 and the Red Y-wing (also sent, in an incomplete state, to England), along with a clay mock-up of a pilot and canopy/cockpit. This sequence of shots seems to show possible angles and focal lengths to inform the eventual build of the 1:1 cockpit sets that actors would be filmed in.

The model is returned to ILM in the new year, where Grant McCune finishes the electrical plumbing. It is completed by breaking the canopy framing to allow a cockpit and a pilot to be inserted. The frame is somewhat restored with a styrene strip, and the model is redressed as the Red 2 Hero. This miniature goes on to be photographed by Richard Edlund and composited with a TIE fighter behind it, featured on thousands of lunchboxes and promotional materials.

The 1977 Star Wars lunch box features the “Red 2 Hero” X-wing model, originally Blue 1, the first to be completed (Courtesy of Pete Vilmur).

The “Hero” X-wing

So what makes a Hero X-wing a “hero”? And what is a pyro? For the sake of clarity, let’s start by focusing on miniatures made for the first film. There were four Hero X-wings made, with a fifth unfinished example that appears to have stayed with the model shop through the production of the original trilogy. The four X-wings are painted to represent Red 1, Red 2 (formerly Blue 1), Red 3, and Red 5. Each fighter model is bespoke. Detail elements – such as the rear butt plate, top “droid strip,” nose, droid, the two-piece cockpit, pilot, and elements of the laser cannons – were patterned using styrene, acrylic, and various pieces from model kits, which were all then molded and cast.

A Hero X-wing with armature, plumbing, and wings in progress (Credit: ILM & Lucasfilm ©).

The armature was round metal stock, threaded at either end. This rod ran from nose to tail with the threaded holes providing the front and back mounts. Situated underneath the astromech droid is an octagonal-shaped block with threads on the top, sides, and bottom. Set screws are always present in one of these mounts, serving to anchor the octagonal block to the main armature rod. Behind this block sits the “scissor” mechanism for the wings, which consists of brackets that hold two motors in place along the underside, with the brackets ending with “L” shaped metal that serves as the main surface to affix the wings. 

The motors each have a toothed gear that sits against a larger brass central gear parallel to the octagonal mount, and it appears that the motors, when engaged, would “crawl” along the surface of the central gear (which did not move), opening or closing the scissor mechanism. At some point, someone added screws to the wing mounts with a rubber band stretched between them, which aided in the wings’ opening; it appears that the motors may have been underpowered to pull the wings open reliably or smoothly. Dennis Muren, however, recalls that the motors worked smoothly when he filmed the models on stage, and he agreed that the rubber band would have provided tension in the mechanism.

A look at the inner mechanisms of the Hero Red 5 X-wing, with a stripe greyed over to appear as Red 4 (Credit: ILM & Lucasfilm ©).

Plumbing made from surgical tubing was used to distribute cool air to the hot lights used at the time, and an electrical wiring loom was also created. Four sets of six wires were positioned at mount points – front, back, port, and starboard. These wires were capped with brass female pin plugs, and are very visible along the sides of the miniatures just aft of the canopy, and on the underside of the Sherman tank detail on the butt plate. These wires would provide power to the motors and the lighting to the four bulbs inside the engines at the exhaust. 

GE aircraft indicator bulbs were utilized for the engine lighting and were most likely sourced from Kasper Sales across the street on Valjean Avenue, according to Paul Huston. The bulbs are seated behind Aavid heatsinks (remember, the lights used at the time were incandescent and would become hot), and in the center of each heatsink, a circle of hand-cut red lighting gel is glued in place to give the engines the color you see on screen. Curiously, the same bulbs are found inside the laser cannons, torpedo tubes, and, from an examination of Red 5, the cockpit. Muren doesn’t think these lights were ultimately utilized, and the supposition among some is that they served as keys for the rotoscope artists to follow when animating elements like the laser fire.

Pin plugs on the side of the Hero Red 3 X-wing (Credit: ILM & Lucasfilm ©).

The wings on the Hero builds were constructed from machined acrylic and styrene. The outer face of the wing starts with an acrylic “box” that mounts to the armature’s “scissor” mechanism with two bolts. This box sits atop the main wing itself, which is made from 1/16th acrylic on the top and sides, and .040 styrene for the inner facing wing. This creates a hollow along the length of the wing. The wing box is dressed with a large Holgate and Reynolds HO Scale brick sheet panel, which is long out of production and prized by contemporary Star Wars model makers as it also appears in small rectangle chips elsewhere on X-wings and other filming miniatures. 

The acrylic box at the base of the wing also serves as the main mount for the engines, which are made from half of a 1/144 Saturn V rocket’s third stage and half of an engine bell (large and small sizes), 1/32nd Phantom turkey feathers and engine halves (cut up, reassembled, dressed with kit parts and in some cases molded and cast), Aurora Sealab pieces, styrene, and acrylic. The back half of this assembly has a machined metal tube with those aforementioned Aavid heatsinks inside. This metal tube is encased by the Phantom engine halves, which were patterned and cast (and curiously, in the case of Blue 1/Red 2, are uniquely patterned). This top wing ends with a mount for the laser cannons, which is a cast resin plate. 

As it turns out, no two Hero X-wings were built exactly the same, so plating details and even the Saturn V pieces will vary from wing to wing and miniature to miniature. Even the small engine bell halves inside the engine’s “intakes” will have different pieces on the same model.

Some of the unique details on the Hero X-wing builds (Credit: ILM & Lucasfilm ©).

Laser cannons are made from a cylindrical assembly (cast resin in at least two heroes) that features unique chip/panel detailing, back caps that are recessed to varying degrees on each build, and a cannon made from telescoping brass tubing (and in one instance, machined acrylic). The tips have a small resin-and-styrene “emitter” assembly, with the smallest-diameter brass tube creating the tip. On Red 1, there is solid red acrylic inside the laser cannons. On Red 3, a length of fiber optic material pokes out slightly. Both, we assume, would be lit by the bulbs inside the laser cannons, which sit inside the two stacked Aavid heatsinks on each cannon. These cannons are mounted on the wings with 3/16th brass tubing, with the wiring from their bulbs traveling down the tube mount and through the wing plate hollow described below.

A look inside Hero Blue 1’s inner wing box’s hollow (Credit: ILM & Lucasfilm ©).

The inner wing plate features two cutouts. Plastruct “C” channel, styrene, and model kit parts were added to suggest mechanical detail. With the wing oriented in this position, it is easier to see the rest of this mechanical detail sitting in the wing box’s hollow. There are two detail plates from the Sealab inside this box, and they seem to sit opposite each other in each wing, top to bottom. The majority of the box’s space is filled with another Phantom engine half, either a patterned casting or a section of the styrene base part that was not used elsewhere.

Surgical tubing is visible here. It starts at the end of the metal tube inside the engine assembly up top, travels through the Saturn V’s third stage half, and emerges below into the wing box hollow, punching out of the Phantom half to become visible, and then directly into the body. Hero birds have one or two “cages” from the Sealab, adding to the idea that this is a functional engine area. Half of a Sealab’s air tank piece caps it all off and hides much of the surgical tubing and bolt heads.

Reflective tape applied to Hero Red 5’s lower right wing (Credit: ILM & Lucasfilm ©).

Interesting, lesser-known wing features include a chamfered edge on the top of each wing’s leading edge. This has been somewhat obscured on Hero Red 1 by the reflective tape applied to these models on the leading and trailing edges of the wings. This tape shows hand-drawn pencil lines suggesting panel lines, a technique seen in some areas on these models, including the rear portions of the engines and the rear of the angled box that the engine assemblies rest on. 

In speaking with Dennis Muren and Richard Edlund about the challenges in the revolutionary blue screen and compositing process they were refining at the time, both explained that when a model moves in certain ways and creates motion blur, thinner areas are prone to disappearing as the lighting drops off. So for the X-wing, this reflective tape was an attempt to ensure enough light would define the wings and keep them from disappearing during the compositing process. Muren also recalls experiments in which they would intensify the front lighting on a model when more dynamic movement was programmed into it. He said they tried many things, then analyzed the results the next day during dailies. If something didn’t work, they would abandon the idea.

Red circles denote the areas where puttied-over screw heads hold the top fuselage shell to the armature beneath (Credit: ILM & Lucasfilm ©).

The top shell, as stated before, is cast resin. A surviving unused casting features faint panel-line engravings, an indentation for the “Droid Strip” to be placed behind the cockpit, and a solid back wall. The reverse shows two thicker rectangular stand-offs on the inside-facing surface that aided in affixing the top shell to the armature with two screws, puttied over. Over time, these puttied circles have, in some cases, fallen away, revealing screw heads along the top of the fuselage.

The wall thickness of this casting can be seen where the back wall was cut away to create a lip around the inset butt plate resin casting, as well as in the canopy bracing, which was made by cutting and filing material to the etched lines that described it: essentially what would be canopy glass, rendered here as negative space. This explains why the back window on each of these models is uniquely shaped, as there was no surface etch to follow.

The cockpit was two cast resin pieces, with a pilot made from a figure from a 1/24th Harrier model kit, modified to feature a futuristic helmet. Then arms from other 1/24th kits were used, which again makes identifying the source kits a bit of a scavenger hunt for a researcher. The pilot castings were used in both hero builds (painted in multiple colors) and Pyro builds (painted a solid dark primer grey), with the cockpits being solid dark primer grey in both versions. Neither reflected the designs of the 1:1 sets and props seen in the live-action footage.

The lower fuselage’s rear wall thickness tapering, due to its vacuum-formed nature. Also note more pin plug access holes at the rear mount area (Credit: ILM & Lucasfilm ©).

The lower shell is a vacuum-formed piece of styrene that was hand-scribed, with the side torpedo tube openings hand-cut and located by cutting brass tubes at angles and grafting them into place. Curiously, at the ends of these tubes, you will find those small incandescent bulbs. The handmade nature of these landmarks means that every one of the hero X-wings has unique panel lines, as well as unique “chip detailing,” which are small rectangular pieces of .010 styrene placed along wing and fuselage surfaces to help catch the light on set. 

The vacuum-forming process creates a thinner-walled piece the deeper the draw is, meaning that for these lower shells, the wall thickness tapers to a very thin edge at the bottom rear of the fuselage. This is a signature look unique to the Hero X-wings. A nose cone finishes off the fuselage, which was cast in one piece and grafted into place in the front, with the gaps puttied and contoured over. Each nose was sliced in two at a slightly different spot, creating a permanently attached mount area on the body, with the separated front cone acting as the front armature mount cover. The nose cone has a proud cylinder and two brass locating pins that fit into corresponding holes on the front of the fuselage. When removed, the six power port female pin plugs can be seen encircling the hole to access the armature threads, three on each side.

These models could initially appear very bright (Credit: ILM & Lucasfilm ©).

Adding Detail

These models were all painted using automotive paints, Floquil enamel railroad paints, and primers from companies such as Nu Finish. Much has been said about the process over the past fifty years, and a fair amount of it is contradictory. It wasn’t until a conversation I had with Dave Jones in the late 2010s that the proverbial light bulb went on, when he made an off-hand comment about how (paraphrasing) “everyone was experimenting and doing their own thing with the painting.”

That one observation neatly explained why some were painted with black primer and some with grey primer, and why some were base-coated with Hot Rod White, a creamy warm white, while others were base-coated with Reefer White, a comparatively brighter white. It also explains why Red 1 is so bright overall, and why Red 3 was warmer, even “muddier,” in places.

Floquil made two very similar red colors, and it appears that both were used to make the stripe details. Lorne Peterson and others have described using Scotch-Brite and/or sandpaper for processes such as sanding back finishes to reveal the dark primer underneath, masking fluid applications for “chipping” detail, and varying the color on wing and body panels. Other processes included masking areas with tape and painting diluted washes; adding decals from Micromark (and other kits the shop used for miniature construction); applying dry transfers of small rectangles in black and dark grey with hand-cut frisket masks; and airbrushing misted coats and streaks.

All of these techniques were used to varying degrees to create unique models akin to the aircraft of the WWI Flying Circus. Each X-wing had recognizable painted landmarks, which deepened the “used universe” look and feel that made Star Wars as a whole feel so “real” to the viewer.

Canopy glass is present on these Hero X-wings on the filming stage (Credit: ILM & Lucasfilm ©).

There was even an attempt to include “canopy glass” in some shots of the models on stage. Peterson remembers using slide glass, and it appears there are remnants of a tinted gel material on the surviving Hero Red 1. Some stage photos show this canopy glass, and if you look closely at any surviving X-wing, you will see remnants of adhesive or missing paint from where these panels were removed. It is usually mistaken for weathering, as it indeed looks the part.

Lorne Peterson building a Pyro X-wing, with Jon Erland to the left (Credit: ILM & Lucasfilm ©).

The “Pyro” Models

With the complexities of the Hero X-wing fighters, how did ILM then simplify the process for the Pyro models? In an interview after the film’s release, McCune stated there were nine Pyros made. These would have to be easier and quicker to make, given financial and time considerations. In conversations over the course of multiple years, Dave Jones explained the master pattern he helped make for these Pyro versions of the X-wing. 

Returning to the photographic sequence of events, the Pyros were an all-hands affair: Lorne Peterson, Paul Huston, and Dave Jones constructed and assembled the Pyro builds, and a “first wave” of Pyro X-wings was painted by Joe Johnston with a level of care and sophistication comparable to that given to the Hero builds. These well-painted Pyros were photographed by Richard Edlund and serve as a wonderful preservation of some of the work that was quite literally destroyed to get the pyrotechnic shots. 

Many have described that the initial Pyros did not produce the explosions wanted for the shots. Later Pyros had to be made and were more hastily constructed. Infamously, at least one had “sticks” as stand-ins for the laser cannons, and it made it into a final shot. It can still be seen in the current version of the film (look for Red 10’s demise).

A set of almost complete pyro pieces, with Steve Gawley’s sunglasses (Credit: ILM & Lucasfilm ©).

The Pyro master pattern was created by taking a Hero top and bottom shell, mating them with the torpedo tubes, and grafting in a nose cone. The wing root area was fitted with a fixed wing mount, constructed in the open “X” mode, and then panel lines and chips were applied. This assembly was then cut vertically, so that the body became a left and right shell. Dave Jones then poured liquid resin into the inside of this pattern to strengthen everything from behind, the thinking being that the weight of the wings would quite literally help pull the body apart when they exploded. 

Wings were constructed similarly to the Hero wings, but any undercuts were filled in on the underside, and the wings overall were slightly shorter (in the area where they ultimately attach to the hull). Laser cannons and engine assemblies were again built in brass, but as simpler, more complete assemblies, and then everything was molded and cast.

A pyro X-wing featuring brass cannons (Credit: ILM & Lucasfilm ©).

This meant that every Pyro X-wing would lock in the locations of the nose and torpedo tubes, the angle of the “X” wing deployment (noticeably a wider splay on the Pyros), the panel lines, and the chip detail. The body is overall shorter than a Hero and surprisingly skinnier as well: width was lost to the kerf of the saw blade used to cut the body in two, and again, to a lesser degree, at assembly, when the cast halves had their mating surfaces sanded slightly to remove any irregularities and flashing and to ensure a flat plane. Some seem to have been constructed without a butt plate casting and were seemingly backfilled with foam so they could slide onto a C-stand. On stage, the crew found better results when the bodies were pre-scored to break apart into smaller pieces, and surviving fragments sometimes show these odd zigzag shapes carved into the castings.

Keen eyes will observe some of the same ships represented in their Hero and Pyro forms on this table, with John Dykstra (Credit: ILM & Lucasfilm ©).

Over the years, photographs have helped identify these Pyros. Red 1, Red 3, Red 4, Red 6, Red 10, and Red 12 have been identified. Red 1 closely followed the Hero’s paint job, but at some point was partially detonated, then repaired and partially repainted. Curiously, two more stripes were added to it and the Hero, to make it appear as Red 3. The Pyro Red 3 bears little resemblance to the Hero Red 3, and Red 6 seems to have been detonated and then donated some of its parts, becoming Red 10. The stripe markings top out at six on the wings; for 10 and 12, ILM simply added a thin length of tape when masking for paint, cutting the six bars in two so that, when painted, they became shorter stacked markings. 

Many pieces of Pyro X-Wings have surfaced over the years. Some have been positively identified as one of those mentioned above. But many pieces have no photographic reference, which perhaps speaks to the speed at which the later Pyro builds were created. 

These Pyro molds (and castings pulled after the film, which were themselves used to mold and cast again) are where all “lineage” post-production castings originate, including the first licensed replica created by Icons in the 1990s. When compared to a production Pyro casting from 1976, the dimensional shrinkage is noticeable, making the Heroes look massive in comparison! 

These kinds of details and observations are what drive people like me to learn as much as possible about the filming miniatures of Star Wars. Every new piece of information about their construction has encouraged me and others to refine our recreations of the X-wing miniatures, and there is no greater satisfaction as a modelmaker than to see how close the builder community has gotten as we have continuously improved the builds over the last twenty-five years. Part archaeology, part artistry, and part friendship, the journey to get to the center of these models has been illuminating and gratifying. 

Grant McCune at work on an X-wing (Credit: ILM & Lucasfilm ©).

The amount of heartfelt thanks to individuals is massive, but I would be remiss in not specifically mentioning Lorne Peterson, Dave Jones, Bill George, Dennis Muren, Paul Huston, Jon Erland, Gene Kozicki, Richard Edlund, John Goodson, Sean House, Hiroshi Sumi, Ed Minto, Craig Underwood, Bryan Babich, Mike Salzo, Dave Mandel, and my supernaturally-cool-with-this-stuff wife, Lisa Eaton. You can find a few X-wing replicas I have made here: www.jasoneatonstudio.com.

Releasing January 13, 2026, the landmark book is now available for pre-order.

By Lucas O. Seastrom

Although Ian Failes was surprised when offered the chance to write Industrial Light & Magic: 50 Years of Innovation, he’d seemingly been preparing for it much of his life. 

Growing up just south of Sydney, Australia, Failes was an avid fan of ILM productions, including the Star Wars and Indiana Jones series, the Back to the Future trilogy (1985-90), and Jurassic Park (1993). But it was 1994’s Forrest Gump that piqued his curiosity about the visual effects craft. Watching a behind-the-scenes documentary included with the film’s VHS release, Failes became enraptured with the process of how these visual effects marvels were created by the ILM team.

“I saw plenty of films growing up, but I didn’t really understand that visual effects was an industry,” he tells ILM.com. “As you see this behind-the-scenes material, you realize that people actually work on this stuff. It was towards the end of high school that I started to be obsessed with visual effects, and ILM in particular.”

Failes admits, “I don’t think I’ve reread a book as much as I have Into the Digital Realm,” in reference to the 1996 publication about ILM by Mark Cotta Vaz and Patricia Rose Duignan. That book was the second in an ongoing series showcasing ILM’s story, which has also included Thomas G. Smith’s The Art of Special Effects (1986) and Pamela Glintenkamp’s The Art of Innovation (2011). Failes’s own 50 Years of Innovation from Lucasfilm Publishing and Abrams Books carries the tradition forward with its release in January 2026.

A Self-Taught Storyteller

“Those early ILM books were huge parts of my formation as a visual effects journalist, but also just to spur on my interest in visual effects before that,” says Failes. His current vocation as founder and editor of the visual effects publication befores & afters was inspired by his passion for reading about the art form in magazines like the iconic Cinefex. After studying law and information technology, he began his career as a lawyer while blogging about visual effects on the side. 

“In some ways, I was more obsessed with the journalism of visual effects than visual effects itself,” Failes notes. “But I am in Australia, and back then, I didn’t think it was possible to cover the visual effects industry from here. I would go to work as a lawyer, come home, and blog more about visual effects, conversing with people on the internet. Then I started doing interviews, usually at five or six a.m. from Australia, and transcribing them from a tape. Over time, it was clear to others, and then eventually to me, that that’s what I was passionate about. It wasn’t clear how I could have an income and a career from this. But then I started to find ways to do it.”

Initially working for the publication fxguide, Failes is largely self-taught both as a journalist and in his knowledge of filmmaking craft. He authored the book Masters of FX in 2015, which included interviews with a number of ILM visual effects supervisors like Dennis Muren, Scott Farrar, and John Knoll. He soon founded befores & afters as his own independent outlet. “Celebrating the artists” is his chief priority.

“Personally, I think visual effects is an art form that doesn’t always get its due,” explains Failes. “My mandate with befores & afters is simply to report how a movie, a sequence, or a shot was created. I’ve found that it’s a really good antidote to some of the discourse online, which can include misinformation about how things were made. If I can report it accurately, then my goal is to be the source for accuracy in the visual effects community, not that anyone else isn’t doing that. But I really want to try and rally against the misinformation. The artists do such incredible work. They’re a big part of these huge films that we get to watch.”

A sample spread from Industrial Light & Magic: 50 Years of Innovation (Credit: Lucasfilm & Abrams).

Crafting ILM’s Narrative

“I don’t think I’ve ever said ‘yes’ faster than when I was asked to write this book,” Failes recalls with a laugh. ILM’s 50th anniversary “crept up on me,” as he says. “Visual effects companies don’t normally last that long. This isn’t a common thing.”

In discussing the book’s story with ILM’s leadership and publicity team, Failes worked to identify the company’s many eras, each full of creativity and transformation. ILM didn’t reach its 50th anniversary without constantly embracing changes in technology, filmmaking trends, and an expanding, global industry. The company itself has played a significant role in shaping that industry, making ILM’s story a 50-year history of the visual effects art form in itself. In the end, Failes is pleased with the resulting book in which readers “can see the progression of work and changes over the years,” as he explains.

“There are great stories about people embracing change,” Failes says. “There were artists who were practical modelers or painters, and they realized that they needed to move into the digital realm to keep their jobs. As those individuals had to adapt, ILM as a whole has had to as well. The good thing is that ILM has jumped on big changes all the time. Digital was one of those, but virtual production is another one, with ILM StageCraft and how that’s been used on The Mandalorian [2019-23] and elsewhere.

“What interests me is that these changes were not brand new inventions,” Failes continues. “There are all these nice threads in ILM’s history linking a past development to what they’re currently doing. The virtual production work has its roots in older rear projection methods, and ILM has dabbled with this kind of thing a lot over the years. So many different tools and techniques come together. Because ILM is a place full of innovation, they can put the best and brightest onto these projects and make them happen. So I hope that when people read the book, they can see that we’ve connected some of those threads together.”

A sample spread from Industrial Light & Magic: 50 Years of Innovation (Credit: Lucasfilm & Abrams).

It’s All in the Details

Conducting a number of original interviews for 50 Years of Innovation, Failes was able to go deep into some long-held questions, such as the origins of digital compositing in the late 1980s and early ’90s. “Many of us know about these three huge films – The Abyss [1989], Terminator 2 [1991], and Jurassic Park [1993] – and the incredible innovations achieved in CG modeling, animation, and rendering of different creatures,” the author says. “But an enormous part of why they worked was because of digital compositing. 

“I got to chat with [visual effects supervisor] Dennis Muren about this topic,” Failes continues, “things I haven’t read much about before, including the methods created for ingesting film and outputting digital images back out onto film, plus the actual compositing approach. The book has allowed me to fill in some of these details that I didn’t know.”

Throughout the book, many sidebars expand on various details from specific tools and methods to characters and films. “Members of the ILM Publicity team contributed to these sidebars,” notes Failes. “They’re incredible gems of information that haven’t really been talked about elsewhere.”  

Pulling it all together are upwards of 1,000 images from the ILM archives, something Failes considers to be the book’s marquee feature. “The images involved a team of people at ILM spending countless hours poring through image libraries, scanning 35mm slides and negatives, and then captioning all of them. The images really make this book what it is, a coffee table book that you can flip through a thousand times because you’ll see so many cool things.”

A sample spread from Industrial Light & Magic: 50 Years of Innovation (Credit: Lucasfilm & Abrams).

Supporting Voices

ILM’s founder, George Lucas, contributed a brand new foreword to 50 Years of Innovation that captures the spirit of the intrepid company in its earliest years. Filmmaker Bryce Dallas Howard also penned an introduction, bringing not only her lifelong admiration of ILM’s work, but her direct experience collaborating with the team as a director on The Mandalorian and other Lucasfilm series. “She’s really familiar with ILM’s process, in particular StageCraft,” adds Failes, “so we have a filmmaker’s perspective.”

ILM’s current senior vice president and general manager, Janet Lewin, penned the book’s afterword. Lewin helped to craft the book’s story and perspective. “Janet herself has a long history with ILM,” notes Failes. “She is a champion for artists and many of the changes ILM has had to go through over the years. It’s important for her to be a big part of the book.

“Janet really cares about how artists feel inside the company,” Failes adds. “That’s a bit of a secret weapon of this book. It’s not just about the films and shows that the company has worked on, but it’s also about what it’s like inside ILM. What’s the culture? Why has this studio persevered for so long? It’s there in the book, and that comes in part from Janet’s influence.”

A sample spread from Industrial Light & Magic: 50 Years of Innovation (Credit: Lucasfilm & Abrams).

Capturing the ILM Spirit

Over five decades and some 500 productions across film, television, theme parks, and interactive and immersive experiences, ILM’s story leaves a lot to cover, perhaps even too much for any one volume. It’s something Failes laments on behalf of his fellow visual effects fans. “That will always be in the back of my mind,” he explains, “and I apologize to those who were looking for a major discussion about a film that isn’t in there. We can’t cover everything, so maybe we’ll just have to do another [laughs].”

The number of ILM shows has particularly swelled in recent decades with the company’s global expansion that began some 20 years ago in Singapore, and now extends to studios in Vancouver, London, Sydney, and Mumbai. Having visited a number of ILM’s locations, Failes is impressed by the company’s ability to maintain its culture between each studio. “You feel like you’re at ILM at each one. That’s not always easy to do. The book showcases ILMers from these different parts of the world.”

As the industry’s oldest visual effects studio, ILM is also arguably its best known, in part as a result of its place within Lucasfilm and its association with Star Wars and George Lucas. But that’s not the whole story, according to Failes. 

“ILM’s notoriety comes from constantly proving their skill and innovation,” the author says, “as well as their collaboration with filmmakers. Visual effects is a service industry, but it’s also an art form. Film and TV are amazing ways of combining lots of different art forms, and ILM has a seat at the table, and that comes through in all of their amazing projects.”

A sample spread from Industrial Light & Magic: 50 Years of Innovation (Credit: Lucasfilm & Abrams).

From Pirates to Matte Paintings

Whether they’re experienced artists working in the visual effects industry, or brand new to the craft, readers of 50 Years of Innovation will encounter a menagerie of iconic characters and spellbinding visuals, from Yoda and E.T. to the Hulk and Davy Jones. The latter pirate captain has always been a favorite of Failes. “There’s something about the combination of Bill Nighy’s performance and ILM’s digital character work led by John Knoll and Hal Hickel. It’s so watchable, and it stands out in the book.”

Another personal highlight for Failes is the chance to write more about the infant Sunny Baudelaire from A Series of Unfortunate Events (2004). “When that film came out, so many people didn’t realize that in certain scenes she’s completely digital,” says Failes. “Who knows how ILM pulled that off [laughs], though we do cover it in the book, of course. The integration of that character is so effective. The big films are in the book, but people might be surprised at some of the smaller projects in there that are big moments in their own right, and that includes Sunny Baudelaire.”

One of the most “magical” elements for Failes is the traditional matte paintings created by ILM artists in the photochemical era. “A lot of people who’ve just come to visual effects cannot quite believe that it was actually painted on glass or some other surface. Sometimes it’s a full painting in the final film, sometimes it’s a painting integrated with live action, and sometimes it’s a moving shot. The fact that that kind of work could be done pre-digital still just blows my mind.”

All of this and much, much more is to be found in Industrial Light & Magic: 50 Years of Innovation, from Lucasfilm Publishing and Abrams Books, and written by Ian Failes. The book is available everywhere January 13, 2026.

Pre-order the book now from Abrams, Bookshop.org, Amazon, and Barnes & Noble

Read more from Ian Failes at befores & afters.

Lucas O. Seastrom is the editor of ILM.com and Skysound.com, as well as a contributing writer and historian for Lucasfilm.

The actor who voiced and supplied the motion capture performance for Grakkus the Hutt opens up about his work on the mixed reality playset.

By Jay Stobie

Bobby Moynihan on the motion capture set (Credit: ILM).

Star Wars: Beyond Victory – A Mixed Reality Playset immerses users in the Star Wars galaxy, crafting an interactive experience that can be explored through three distinct game modes – Adventure, Arcade, and Playset. Beyond Victory is currently on sale on Meta Quest 3 and 3S headsets, and ILM.com had a chance to catch up with actor Bobby Moynihan, who voiced and provided the motion capture performance for Grakkus the Hutt. From brushing up on Grakkus’ origins in Star Wars comics to describing his collaboration with Industrial Light & Magic, Moynihan shares his insights on creating the mixed reality playset.

Grakkus the Hutt, a crime lord whose muscular frame sits perched atop cybernetic legs, made his debut in 2015’s Star Wars #9. An avid Star Wars fan, Moynihan was aware of the character prior to taking on the role in Beyond Victory, telling ILM.com, “I knew of the character from the comics. I knew he had robot spider legs. I knew he had a lightsaber necklace.” Once he discovered he’d be bringing Grakkus to life, Moynihan dove deeper into the lore surrounding the Hutt. “I bought the comics and read them,” Moynihan states before jovially reflecting on his early impressions of Grakkus by quipping, “I thought I was not nearly as jacked as I needed to be.”

Concept art of Grakkus the Hutt by Stephen Zavala (Credit: ILM).

When it came time for his motion capture collaboration with ILM, Moynihan describes the process in a way only he can, explaining, “I got to do motion capture sitting in an office chair on wheels being pushed by a man with a mask on. I highly recommend it if you get the chance. It was very cool to see it in real time. It felt very Star Wars. I was in heaven. The most challenging part was not peeing for a long time because of the jump suit. And even that wasn’t that bad.”

Asked how he determined the manner in which Grakkus would move, Moynihan declares, “That’s the best part. I don’t have any legs so the extremely talented animators helped me a great deal. But the upper body… that’s all Moynihan.” The actor’s praise for the ILMers doesn’t stop there, as he reveals what made working with ILM and Beyond Victory director Jose Perez III so special, relaying, “I love working with people who adore what they do. Who really truly love the universe they are working and creating in. Jose is the best!”

Grakkus the Hutt (Credit: ILM).

On a lighter note, Moynihan jests about the traits he feels he infused into his Grakkus performance, teasing that his contributions include, “Lumbering movement. Slight back pain and male pattern baldness.” When it comes to Moynihan’s favorite Grakkus attribute that had already been established in the comics, the actor focuses on the collection of Jedi lightsaber hilts the Hutt wears around his neck. “That saber necklace is a STATEMENT PIECE,” Moynihan proclaims.

As much as he delighted in voicing and supplying the motion capture performance for Grakkus the Hutt, Moynihan’s enthusiasm for Beyond Victory now takes the form of enjoying the mixed reality playset from a player’s perspective. “I have been having an absolute blast playing this game. It’s so fun to be fully immersed in a universe you love,” Moynihan asserts. Summarizing his reaction to seeing Grakkus in the final release, Moynihan emphatically concludes, “Grakkus. Is. Huge.”

(Credit: ILM).

Star Wars: Beyond Victory – A Mixed Reality Playset is currently on sale on Meta Quest 3 and 3S headsets.

Read more about Beyond Victory on ILM.com.

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

Avatar: Fire and Ash, Jurassic World: Rebirth, and Sinners are among the selected films.

The Academy of Motion Picture Arts and Sciences has announced the shortlists for productions in 12 categories at the upcoming 98th Academy Awards, including Best Visual Effects. Industrial Light & Magic contributed to all 10 films shortlisted for the category. The films include:

Avatar: Fire and Ash

The Electric State

F1: The Movie

Frankenstein

Jurassic World: Rebirth

The Lost Bus

Sinners

Superman

Tron: Ares

Wicked: For Good

Voting for nominations will take place between January 12 and 16, with the nominations announced on January 22, 2026. The 98th Oscars will be held on March 15, 2026, in Hollywood. A huge congratulations to our ILM teams from around the world!

Our colleagues at Skywalker Sound have also contributed to 5 out of 10 films shortlisted for Best Sound, including Avatar: Fire and Ash, F1: The Movie, One Battle After Another, Sinners, and Superman.

Read more about the Oscars announcement here.

And discover more about ILM’s work on The Electric State, Jurassic World: Rebirth, The Lost Bus, and Superman right here on ILM.com.

ILM.com is showcasing artwork specially chosen by members of the ILM Art Department. In this installment of a continuing series, three artists from the San Francisco and Vancouver studios share insights about their work on the 2025 Warner Bros. production, Superman.

Art Director Chris Voy

The premise for the LuthorCorp escape shuttle was that it would first be seen by the audience as a command center/observation deck at the center of the skybridge between the towers of the LuthorCorp building. During the escape, it is revealed that the room doubles as an aircraft that detaches from the structure and takes flight.

When we started on it, the overall design of the building and shuttle had been established. My task was primarily to work out the details and mechanics of how the two structures would fit together and work in a convincing way. I provided a range of iterations and options to the client but always tried to maintain the elements of the design they liked. It was a challenge to make the shuttle first believable as architecture and then as an aircraft without the former spoiling the latter! 

The client provided concise tasks and great feedback with reference that made it a fun sequence for us. At one point they asked to see how the shuttle would detach, clear the tower, and deploy its engines in time without everyone inside falling hundreds of feet. I mocked up a few rough animations where we played around with timing and different ideas of how it might detach, from explosive hard points like a fighter jet canopy or something. We experimented with a few variations on the idea until they were happy and could see how all the elements would work together.

Senior Art Director Alex Jaeger

I jumped into this project late to help out with some designs, one of which was the look of the Engineer’s “nanite tech” and how it took material from one part of her body to form another. The challenge was to come up with believable solutions for where to take material away, how much to take, and what the resulting areas would look like. 

My goal was to create a balance of removing enough material from her silhouette to create the weapons, while leaving enough in the right places so she still felt strong and capable of holding up the new weapons. I had also played with whether or not to reveal some sort of understructure or skeleton in areas.

While researching various looks, I came across some great minerals in nature that exuded a great level of design within chaos. They show how, even in the depths of the Earth, there is a design presence in the smallest of details.

Senior Concept Artist Evan Whitefield

The concept captures a defining moment that shows who Superman truly is, someone who puts saving lives above everything. In the story, he holds up a collapsing building until every last person is out of harm’s way, letting it fall on him to keep them safe. Then, through the rubble and dust, he rises. It’s the moment we see the cost of his sacrifice and the unstoppable strength of his compassion.

The composition and scale were established with a wide, low-angle shot, placing Superman small against the vast, ruined building to emphasize the enormity of his sacrifice and the weight of what he had endured. Clouds and smoke were added, carefully scaled to match the figure, reinforcing the sense of vastness. Light and atmosphere were refined, shifting from dull gray smoke to a warm, golden palette of dust and heat, tones that conveyed both the chaos of the collapse and the radiance of hope breaking through it. Finally, subtle rays of light were added, with one more pronounced, elevating the iconography as it pierced the haze and cast a long shadow framing his upward, determined stance. The effect felt almost celestial, transforming the small, central figure into a symbol of resilience and hope rising from the ruin.

Inspiration for this piece merges classic comic book iconography with modern film cinematography. It draws on the emotional power of Superman’s emergence from overwhelming danger and the idea of his strength being renewed by sunlight.

See the complete gallery of concept art from Superman here on ILM.com.

Learn more about the ILM Art Department.

The ILM visual effects supervisor reflects on ILM’s contributions to the riveting film inspired by a compelling real-life story.

By Jay Stobie

Inspired by the events of the 2018 Camp Fire in Paradise, California, and based on Lizzie Johnson’s 2021 book “Paradise: One Town’s Struggle to Survive an American Wildfire,” The Lost Bus (2025) follows Kevin McKay (Matthew McConaughey) and Mary Ludwig (America Ferrera) as they attempt to shepherd 22 children to safety aboard a school bus during a chaotic evacuation. Directed by Paul Greengrass (The Bourne Ultimatum [2007], Captain Phillips [2013]), who co-wrote the screenplay with Brad Ingelsby, the film puts the audience in the front seat on a harrowing ride through the awful inferno.

Industrial Light & Magic’s Dave Zaretti (WandaVision [2021], Willow [2022-2023], The Acolyte [2024]) generously took time to sit down with ILM.com and discuss his role as the project’s ILM visual effects supervisor. In a wide-ranging conversation, Zaretti touches on studying real-world references, crafting intricate assets and environments, paying respect to those who endured the tragic events depicted in The Lost Bus, and much more, detailing every level of ILM’s extensive contributions to the captivating film.

(Credit: Apple & ILM).

Real-world References

As the ILM visual effects supervisor, Zaretti operated out of the London studio and oversaw the company’s work on the project. “I think ILM had the lion’s share of the visual effects work for the second part of the film,” Zaretti tells ILM.com. “I supervised all of the London work, and we had work from ILM’s Mumbai studio that [ILM associate visual effects supervisor] Steve Hardy supervised. My daily role was to keep an eye on the overall vision of the show and check in with Charlie Noble, the client-side visual effects supervisor.”

Noble supplied ILM with what Zaretti describes as a nearly two-hour-long “megaclip” of reference material, which helped ILM ground their visual effects shots in realism. “I’ve never had so much reference on a show,” Zaretti emphasizes, “because this event occurred in 2018, and everyone had phones and cameras. It was a very filmed event. A pickup unit also went to Paradise and filmed loads of additional footage of the actual place itself. In terms of the environment and road layouts, we spent a long time on Google Maps ensuring we got all of the turns in the road correctly.”

ILM broke up the reference megaclip into separate asset types. “We had shots within the smoke cloud, shots outside of the smoke cloud, big fires, small fires, brush fires, trees burning, houses burning, and cars burning,” Zaretti recalls. “There were fire tornadoes; I didn’t even know that was a thing! We were spoiled with references to the point where the references began to contradict each other because there’d be wind blowing in two different directions in the same piece of footage. Those contradictions actually helped us in certain scenarios. For example, during the escape sequence when the bus is winding down this thin road, there are points where we had flames licking at it from both directions to portray the danger that they were in. So, having a real reference to back up your visual effects often helps.”

(Credit: Apple & ILM).

Assembling Assets

The abundant reference footage proved incredibly useful, though its varied nature also meant ILM needed to build a wealth of digital assets to fully represent the range of material. “We had about seven types of fire assets,” Zaretti says of the meticulous process. “You’d think fire would count as one thing, but no, we had small, medium, and large fires. We had traveling fire that was needed to show the spread of fire through the grass. There were several types of smoke because vegetation has a lighter smoke, while houses and cars have a darker, roiling, inky smoke.” Surprisingly, many of the minute embers seen amid the chaos are burning or smoldering pine cones, reflecting the heavy debris encountered throughout the 2018 fire.

“We built a CG bus, as well as the digital cars we used on the multilane Skyway. We added to the cars they already had to create a sense of everything being jammed together, with people nudging and jostling.” Though a practical bulldozer was captured on location, the ILM team lobbied Paul Greengrass to supplement it with a fully digital version, which Zaretti notes “looked fantastic. Our team at ILM’s Sydney studio built that for us.”

ILM constructed large environments for the project, such as Roe Road and the Skyway. “We had to build all of that. There were other environments here and there, too. We had to augment Pearson Road’s falling power cables. Paul [Greengrass] did some practically, but we added more to get a chain reaction of the cables coming down and whipping around. We did nine or ten species of trees. We had variants of each species, and for each variant, we had different strength winds going through several levels of fire. We must’ve had hundreds of tree variants in total, as well as bushes and grass. One of the biggest technical challenges was the scale of the event,” Zaretti advises.

“When it came time to design the shots, we already had these component parts and could drop them in. Then, when we needed to do a hero simulation for something in the foreground, we could simulate that,” Zaretti continues. “Our effects team was amazing, and Billy A. Copley was our effects lead. [ILM digital artist] Matthieu Chardonnet devised a really cool fire setup where, once we had a look that we liked, we could essentially paint where we’d want the fire to be on the trees. Within a few hours, we’d have a first pass of how it’d look. Normally, doing that work would’ve required a couple of days turnaround.”

(Credit: Apple & ILM).

Dynamic Details

Zaretti and the ILM team considered their visual effects work as another tool the director could utilize to tell this immensely important story. “Despite the fact that some of the shots are perhaps 90% visual effects, they were always supporting what had been filmed. We viewed the fire, smoke, and embers as a character,” Zaretti explains, noting that Paul Greengrass felt the ember cam shot was the film’s “Jaws moment.” “The fire is coming for you, and there’s this impending doom. It is the protagonist in the film – it’s the baddie.”

The scene in question finds the camera weaving through the forest from the fire’s perspective, as the smoke and embers close in around the trapped bus. “That was a big operation. The nature of Paul Greengrass’s dynamic camera shots meant that you never focused on anything for too long. Suddenly, you come to this calm, slow, and long piece of track material through the woods. It was entirely digital because the filmmakers wanted to completely art direct where they went. We had an environments build for our forest, which we used for the rest of the show, but we increased the resolution for this particular shot and simulated all of the grass,” Zaretti declares.

Incorporating the original 2018 fire conditions increased the scene’s realism and sparked a somber thought for the ILM artists. “The reason the event was so devastating was because the extreme winds made the fire spread, so we always had to convey a sense of high-directional wind by simulating the grass being blown. The environment looked so good, and we wanted to show it off, but it needed to be dark. We had to bring it down and use the distant firelights to illuminate it. We wanted it to feel realistic, which was enough to make it terrifying. It’s difficult because I kept getting excited about our shots – but then I reminded myself that this actually happened and must have been absolutely horrifying,” Zaretti states.

(Credit: Apple & ILM).

Unexpected Undertakings

ILM’s contributions to The Lost Bus consist of numerous tasks that casual audiences might not typically associate with visual effects work. When McConaughey’s character exits the bus to survey the fiery landscape surrounding it, ILM trained its efforts on a surprising feature. “The scene needed to be windy, but they didn’t have the wind machine on set at that point,” Zaretti remembers. “So, we rotoscoped the beautiful curls in Matthew’s hair and did a comp shake on those to suggest the wind was stronger than it was.”

For the sequence where the bus embarks on its final escape down a narrow road, Zaretti’s ILM artists assessed key factors relating to the fire’s intensity. “If you have too much fire, you’re just going to have a big sheet of orange. As a viewer, you need to see identifiable objects in there to give it some composition. We took a little creative license to guarantee that the audience would know what they’re seeing. Where’s the bus? Where’s the danger? We also played with the density of the smoke to allow shots to be revealed as you go along,” Zaretti divulges.

Even the digital trees were painstakingly reviewed by Zaretti and his ILM colleagues. “At the end, we started on our ‘tree assassination’ rounds. Usually, the procedural winds we added in turned out spectacularly, but they’d occasionally cause a tree to appear a bit suspect,” Zaretti reports. The ILMers searched for and removed any trees that didn’t pass their quality control test, verifying that every tree that landed on-screen behaved naturally.

(Credit: Apple & ILM).

Detecting Daylight

The Lost Bus reaches its cathartic crescendo when the bus finally exits the fire, traverses the edge of the smoke cloud, and emerges into the safety of daylight. “That was a neat shot, and very big. It was driven purely by reference because we had footage of people driving through smoke as it goes from black to a slightly lighter shade. Then you get a hint of daylight coming through until suddenly it’s daytime and everything seems normal,” Zaretti remarks.

“That shot was an interesting one to put together, to transition from one environment to another, as we did end up needing to go all digital. The daytime section was the same environment we built previously, but because it’s bright out, we had to confirm all the details, like the tree simulations and so on, were there. We put in the CG bus windshield with the wipers, and we had a simulation for the embers. The bus was splatting into the embers, and due to the speed of the bus, the embers sort of floated to the side and got caught underneath the windscreen wiper. All of these details helped convey the sense of travel.”

(Credit: Apple & ILM).

Committed to Collaboration

As the ILM visual effects supervisor on The Lost Bus, Zaretti notes that he had the honor of overseeing every ILM shot and deciding whether they were up to the company’s high standards. However, Zaretti reserves the bulk of his praise for his artists, pronouncing, “When all the work is presented in our dailies, I’ll sit in a room and see a shot for 30 seconds, and I’ll already have people from production tapping their watch to say I have to move on and keep the pace up. [laughs] The people who really know the shots and contribute are the artists that are doing them. They’ve sat there for two or three days to make the shot beautiful, and then they present it to me.”

Of his review process, Zaretti affirms, “My artists know their shots better than I do, and I’m just looking at a snapshot in time. I have a gut instinct, and I’ll say what I see, but I enjoy it when other people have opinions because I’m but one person. I cannot see all of the things, all of the time. I value having such an excellent team around me that sometimes helps me spot obvious things that I’ve missed. I’m only human, so I encourage it to be a team effort. This is how we make great work, because we’ve got great people.”

Zaretti is also keen to point out the contributions of CG supervisor Jamie Haydock, a fellow member of the ILM London studio. Haydock, as Zaretti explains, was essential to creating all of ILM’s visual effects shots. “His technical know-how and level-headedness kept everything ticking along so well, and he technically enabled the FX artists to let loose and be creative with some very tricky manipulation of our still fresh USD [Universal Scene Description] pipeline.”

(Credit: Apple & ILM).

Remarkable Reactions

Eight months after completing his work on The Lost Bus, Zaretti and his wife had a chance to see the finished film at a premiere. “It was one of the most stressful things I’ve ever watched, and my wife felt the same way,” Zaretti reveals. “At no point was I inspecting the visual effects, because you’re just in the story. Paul Greengrass has a documentary style that doesn’t allow you to stop and pause and analyze. It’s not too much or over the top – it’s just storytelling. It’s a perfect example of supporting visual effects doing what needs to be done – and doing it well.”

Speaking of the film as a whole, Zaretti proclaims, “I am immensely proud of The Lost Bus. Firstly, because of the work that we did. Secondly, I do think we’ve respected the event itself and what happened that day. Thirdly, it’s a really good film. Visual effects aside, it’s a nice change to work on something that’s so grounded. When I’m working on a fantasy project, it doesn’t have the degree of gravitas that this one did. This was a terrible event. People lost their lives, and the people who survived will never forget what’s happened. It was important that we did it justice.”

(Credit: Apple & ILM).

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

Members of ILM’s visual effects team discuss their cutting-edge approach to crafting a scene from the Emmy-nominated Skeleton Crew through their real-time rendering pipeline.

By Jay Stobie

Industrial Light & Magic’s visual effects capabilities have been synonymous with innovative approaches since the company’s inception, as ILM creatives regularly transform theoretical techniques into groundbreaking developments that emerge as everyday solutions. One such forward-thinking application involves ILM’s use of real-time rendering to present their work to visual effects supervisors and client filmmakers in an immersive fashion that allows them to immediately incorporate the feedback they receive.

This process was utilized in “Zero Friends Again,” the sixth episode of Star Wars: Skeleton Crew (2024-25), for a sequence depicting Fern (Ryan Kiera Armstrong) and Neel (Robert Timothy Smith) as they ascend a ladder above the snowy plains of the planet Lanupa. A roundtable of ILM team members, including real time principal creative Landis Fields, environment supervisor Andy Proctor, technical artists Will Muto and Kate Gotfredson, and lead compositor Todd Vaziri, joined ILM.com to chat about trying out a new real-time workflow to craft the depth-defying ladder-climbing sequence in Skeleton Crew.

(Credit: Lucasfilm & ILM).

An Interactive Overview

“In traditional visual effects, artists show their work in dailies as a 2D-rendered movie and get feedback,” says Andy Proctor. “They work on those notes, re-render it, and present it again, usually the next day.” However, real time gave Proctor and his colleagues the chance to draw on some of their earlier virtual production workflows to achieve an interactive review process. “I was in the virtual art department on Skeleton Crew, and we would do dailies where the key creatives would be in VR headsets while looking at these environments. To build the actual sequence, we had the entire thing set up in real time for the creatives to view on a normal screen. All the plates were loaded, and the blue screen was keyed out in real time. It’s essentially the same as when the visual effects supervisor is looking at a regular 2D review, except the whole scene is live.”
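The live blue-screen keying Proctor mentions can be illustrated with a toy example. The sketch below is a minimal, hypothetical color-difference key in numpy, not ILM's actual real-time keyer (production keyers add spill suppression, edge treatment, and GPU execution, among much else); the function name and threshold are invented for illustration:

```python
import numpy as np

def blue_screen_matte(rgb, threshold=0.3):
    """Toy blue-screen key: derive alpha from how much blue dominates red/green.

    rgb is an array of shape (..., 3) with values in [0, 1].
    Returns alpha per pixel: 1.0 = keep (foreground), 0.0 = key out (screen).
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    dominance = b - np.maximum(r, g)                 # how "blue" each pixel is
    return 1.0 - np.clip(dominance / threshold, 0.0, 1.0)

# A pure blue-screen pixel keys out; a skin-tone pixel is kept.
screen = np.array([[0.1, 0.1, 0.9]])
skin = np.array([[0.8, 0.6, 0.5]])
print(blue_screen_matte(screen))  # [0.]  (keyed out)
print(blue_screen_matte(skin))    # [1.]  (kept)
```

The appeal of running even a far more sophisticated version of this per frame is exactly what Proctor describes: the review sees the keyed plate composited live rather than waiting on an overnight render.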

The planet Lanupa environments for those Skeleton Crew reviews were drawn from material originally created for ILM StageCraft’s virtual production pipeline, as Todd Vaziri highlights, “There are extended sequences where the actors were filmed for this environment on the ground level that were built by our StageCraft crew. There were already rounds of art direction, design, and construction, and that had to be approved by the filmmakers before first unit filming began with the actors. They would get in-camera finals using all of our StageCraft LED technology and real-time rendering technology. All of that work, especially on the creative side, had been done.”

Roguish Roots

Looking back to the origins of these real-time elements, Proctor points to ILM’s proof-of-concept contributions to the creation of K-2SO in Rogue One: A Star Wars Story (2016), also overseen by Skeleton Crew production visual effects supervisor John Knoll, as “a technical milestone for real-time visual effects,” which had usually been reserved for games and interactive projects during that time period. “You skip ahead to Skeleton Crew, and now you’ve got much more of a crossover, because we’re using real time to design the environments and work out how they’re going to be shot.”

The season three finale of The Mandalorian (2019-23) followed Rogue One as the next benchmark on the path to the real-time process harnessed for the Skeleton Crew cliff climb. “[Skeleton Crew] represented the natural progression of working with [executive producer] Jon Favreau on The Mandalorian, because we were always pushing the boundaries,” Landis Fields notes, as ILM’s use of StageCraft’s LED-based volume prompted them to lean into virtual production techniques for a variety of different disciplines. “On the volume, you have a real-time environment, and it’s working for in-camera finals. So there’s already a step towards what you’re doing in real-time being what you’ll see in the final picture,” Proctor chimes in.

In The Mandalorian “Chapter 24: The Return,” ILM chose the astromech droid R5-D4’s descent into the cavern housing Moff Gideon’s (Giancarlo Esposito) secret Mandalorian lair to exercise the most recent real-time advances for the scene’s final pixel shots. “We had done that years ago on some of the K-2SO shots for Rogue One, but real time had changed a lot since then,” Fields adds. “On The Mandalorian, we were able to test integrating real-time visual effects reviews with [visual effects supervisor] Grady Cofer and [animation supervisor] Hal Hickel.” Instead of simply giving notes for changes that would be made at a later date, the supervisors could quickly see the impact of their requests for lighting changes and other alterations.

(Credit: Lucasfilm & ILM).

A New Scope

ILM’s ability to successfully demonstrate the viability of that real-time process was met with an immense wave of support, as Fields credits Jon Favreau, John Knoll, head of ILM Janet Lewin, and Lucasfilm’s Rob Bredow for being strong proponents of continuing on the cutting-edge course. Nevertheless, Fields emphasizes that this approach was intended to be one of many tools on which they could draw, as the choice of which technique to pull from ILM’s ever-growing arsenal of production pipelines would always come down to “the right tool for the job.” While the majority of the StageCraft volume LED in-camera work for Skeleton Crew was done using ILM’s proprietary Helios renderer and engine, this particular sequence was an opportunity to also see where the use of real time could be pushed and leveraged in novel ways.

Perceiving The Mandalorian’s season three finale as a major real-time stepping stone, Proctor recalls that ILM elected to expand its use to an even greater extent, as he posits, “Now, we’re going to take a sequence and cut it in among live action that was shot in the volume and other traditional visual effects that are rendered offline. It has to match the other sequences and be as visually complex as everything else.” With this real-time production workflow firmly in place when ILM’s work on Skeleton Crew commenced, Proctor recalls the ladder-climbing shot from “Zero Friends Again,” saying, “We knew it was an important moment, because it establishes that Neel is scared. Kate Gotfredson was able to set the shot up in real time so we could do a dynamic height or vertigo wedge.”

This arrangement enabled them to consult with John Knoll and ILM visual effects supervisor Eddie Pasquarello in real time, experimenting with a variety of elements, from pushing the background forward and away to tweaking the lighting. Speaking to the capacity to review several shots in a row in a single cut with per-shot interactivity, Vaziri adds, “We had a mini-editorial cut with works in progress. Being able to show an entire sequence to the visual effects supervisors and saying, ‘Yeah, this is how it’s going to look, but we can interactively move things around and instantly see in the context of the cut,’ that’s a game changer right there.”

Praise for the Process

The benefits of utilizing a real-time production pipeline are as diverse as the galaxy far, far away that ILM has built on screen. “Real time is very flexible. We were able to develop custom real-time compositing tools very quickly using blueprints, which allowed us to preview the live-action footage directly on top of our environments. With these tools, we could experiment with framing, set dressing, and lighting with immediate feedback,” Kate Gotfredson outlines.
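Previewing live-action footage "directly on top of" a rendered environment, as Gotfredson describes, ultimately rests on the standard Porter-Duff "over" operation. The sketch below is a generic numpy illustration of that operation, not ILM's blueprint-based tooling; the variable names and example values are invented:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Porter-Duff 'over': composite a premultiplied foreground onto a background.

    fg_rgb is assumed to be premultiplied by fg_alpha, the usual
    convention in film compositing pipelines.
    """
    return fg_rgb + bg_rgb * (1.0 - fg_alpha)

# A 50%-opaque mid-gray plate composited over a white render:
fg = np.array([0.25, 0.25, 0.25])   # premultiplied: 0.5 color * 0.5 alpha
alpha = 0.5
bg = np.array([1.0, 1.0, 1.0])
print(over(fg, alpha, bg))          # [0.75 0.75 0.75]
```

Because the operation is this cheap per pixel, a real-time engine can re-run it every frame as artists adjust framing, set dressing, or lighting, which is what makes the immediate feedback Gotfredson describes possible.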

Will Muto offers his appreciation for the interactive workflow, surmising, “You get more bites at the apple. The creatives are able to iterate more and hone in on what they want. That’s where the power is here.” Muto applauds ILM’s real-time process for its smoothness, continuing, “There were no huge surprises. We added tooling around our color pipeline to apply our shot grades within the real-time minicut, so we were certain that artists working real time were viewing plates in the exact same context that the compositors would be viewing downstream. The showrunners [Jon Watts and Christopher Ford], John Knoll, and Eddie Pasquarello all got what they wanted in extremely short turnarounds.”

Fields echoes the praise for the collaborative efficiency, relaying, “In the traditional pipeline, having a review is not just jumping on a call. There’s time that an artist has to dedicate to preparing material to review.” With real-time sessions, Fields divulges that his team can simply “throw together a meeting, and everyone joins the call. There was no prep other than that I had to be at my desk.”

(Credit: Lucasfilm & ILM).

ILM’s ‘Personnel’ Touch

Proctor articulates an unexpected advantage that has a habit of emerging from those video calls, when his team could share a screen and jump right into a real-time session. “You get more moments of serendipity. When someone is showing their work, you can see what they’re doing live and interact with it yourself. By doing that, we’d get these ‘happy accidents,’ where someone was playing around with the water shader and hit a parameter that made everyone say, ‘That looks amazing!’ because suddenly the water felt incredibly translucent. Those collective learning moments happen all the time, and it’s very difficult to get that any other way than in real time.”

The human element also factors into another attribute unique to ILM, as the unprecedented level of professional experience concentrated within the company’s ranks allows its personnel to maximize the real-time workflow’s potential. In terms of oversight, ILM’s senior staff have the expertise to recognize how their colleagues’ talents could be leveraged for optimum efficiency. “Where do you want the masters of these crafts to spend their time? Andy and I were very keen on paying close attention to who was doing what,” Fields remarks. “It’s not about everybody doing everything. That’s another part of working within the pipeline and being smart about the division of labor.”

Applying environments created for the volume in the real-time review process gave some ILM artists the chance to work across multiple stages of development, as well. Once Proctor had finished with the set design alongside his colleagues in the virtual art department, content creation supervisor Shannon Thomas oversaw the creation of the final environment used in the StageCraft volume. Digital artist Nate Propp contributed to both the volume build as well as the real-time work in post-production, for which Proctor then returned to help oversee.

“Not only was Andy familiar with the worldbuilding exercise that he had done,” notes Fields, “but he was familiar with it from the ground-level perspective.” When the real-time crew began on the ladder shots that looked down on Lanupa’s surface, they could rely on Proctor’s insights from his role in crafting that environment. “Andy knew what the environment was supposed to look like,” Vaziri agrees, indicating that his work as the lead compositor involved the important task of exposure balancing for the foreground, which meant “a lot of rotoscoping, compositing for the extractions, a tiny bit of effects renders for some blowing mist that went through the environment, a lot of stock stuff that the compositors put in. From our perspective, we had to deal with very few renders overall, which I absolutely loved.”

The Legacy Ahead

“Back in the day, when they did K2, that was about the ‘if.’ Skeleton Crew wasn’t about if we could do it. We knew we could do it,” Fields summarizes. “To be able to see that sequence in the cut, scrub, and get valuable notes that are efficient with the time from the visual effects supervisors was huge. The review was essentially a three-dimensional, real-time composite set that we could move around.” Perhaps the greatest testament to the real-time process is the response its use garnered from individuals throughout the company, as Fields shares, “Everybody outside of our phase, the other departments downstream, started to see the value here.” This particular real-time workflow has joined ILM’s illustrious array of visual effects pipelines, becoming yet another evolutionary tool to be called upon when ILM is deciding which approach is best suited for the visual effects shot it is tackling that day.

(Credit: Lucasfilm & ILM).


ILM’s Enrico Damm, Paul Kavanagh, Stephen King, and Matt Middleton share details about ILM’s role in the director’s first theatrical release from the DC Universe.

By Jay Stobie


Written and directed by James Gunn, DC Studios’ Superman (2025) has leapt to the forefront of the cinematic superhero landscape, aweing audiences with a message that inspires optimism and hope with a side of introspection. Industrial Light & Magic played a crucial part in the visual effects that brought this uplifting story to life, collaborating with Gunn and production visual effects supervisor Stephane Ceretti on their quest to supply a fresh perspective on the legendary character.

ILM visual effects supervisor Enrico Damm (Rogue One: A Star Wars Story [2016], Ahsoka [2023-present]), ILM animation supervisors Paul Kavanagh (Star Trek [2009], Deadpool & Wolverine [2024]) and Stephen King (The Avengers [2012], The Batman [2022]), and CG supervisor Matt Middleton (Mission: Impossible – Dead Reckoning Part One [2023]; Alien: Romulus [2024]) joined ILM.com to chat about ILM’s work on Superman, from taking the lead on characters such as Superman, the Hammer of Boravia, and Ultraman to tackling a large section of the climactic final battle involving a Metropolis baseball field, a deadly interdimensional rift, and fan-favorite dog, Krypto.

Welcome to Metropolis

“I joined Superman early on during pre-production, when [visual effects producer] Susan Pickett and Stef Ceretti approached ILM to talk about how we could build a Metropolis that we could art direct in a real-time fashion,” Enrico Damm explains to ILM.com. “This allowed the production designer and director to creatively iterate until we achieved a layout that would build the foundation for the final, post-production asset. We had real-time sessions with Stephane, [production designer] Beth Mickle, and, occasionally, even James Gunn, where he would be able to direct changes in real time. We established a look that was shared with the previs companies and the other vendors on the show, which permitted us to have everybody start with the same asset.”

Imbuing Metropolis with the living, breathing atmosphere of an actual city without replicating a real-world location was extremely important to Gunn. However, since roughly 70% of the design was inspired by New York, Stephane Ceretti and Susan Pickett authorized ILM to undertake an excursion to gather material. “I spent a couple days in a helicopter over New York City, capturing reference photography,” Damm details. “At the same time, my colleague Dacklin Young was capturing backdrops.”

CG supervisor Matt Middleton would go on to rely on Damm’s city references, explaining, “Our environment team at ILM’s Sydney studio did a large amount of the Metropolis city build. We knew this asset would need to be shared between vendors, and it was strongly based on the New York City references. We built Metropolis in sections, including various hero sections for our work. We had hundreds of unique buildings, which gave Metropolis a great organic feel. That was driven by what Enrico had done before the shoot had started. Many times, we might only build 10 buildings and do variations of them, but on Superman, we utilized an enormous number of distinct buildings to avoid a procedural feel.”


On-Set Observations

Damm’s duties intensified even further once filming got underway. “I was on set for pretty much every ILM-related shot,” he says. “I made sure that ILM would get what was necessary for our work, and I assembled things that could help us, which weren’t part of principal photography. On the side, I gathered motion tests and scans of his cape and suit, essentially grabbing David [Corenswet] and the on-set visual effects crew for an hour to film him in the suit and see how the folds move,” Damm elaborates. “During the shoots of ILM sequences, I met with David almost daily and, with client-side visual effects supervisor Stef Ceretti, we developed a specially designed scanning system to gather data for ILM FaceSwap training.”

Since David Corenswet portrays both Superman and Ultraman in the film, this technology was useful for the moments when the two characters face off with each other. “The system helped us capture David’s face as faithfully as possible, so we could train a system to replicate him and FaceSwap in those moments. Almost every day, we held sessions with David to go through various lines to ensure that we captured every nuance of his performance, so we would be able to replicate it digitally on his stunt doubles.”


Suiting Up with Superman

ILM’s crucial involvement continued off the set and into post-production. “The client put their faith in ILM and had us build Superman, the Hammer of Boravia, and Ultraman, three characters who we see frequently,” Damm notes. “We even had the pleasure to have David Corenswet visit ILM’s San Francisco studio. Being a huge Star Wars fan, David had quite a blast. We scanned him using our proprietary MEDUSA scanning system to recreate him on-screen, which included a full digital replica with muscle, bone, and cloth systems. We’re dealing with an ultrabeing flying at ultraspeed, so we did a great deal of cloth simulation on his cape, suit, and hair to portray an appropriate sense of speed.”

ILM’s animation supervisor, Paul Kavanagh, is based at the San Francisco studio, the hub that oversaw the entirety of ILM’s work on Superman. Speaking to the prevalence of ILM’s digital Superman replica, Kavanagh says, “A lot of the times when you see Superman flying along in his suit and cape with his hair fluttering, the only thing that wasn’t CG was his face. But everything we did was based off of the live-action shoot, and we were very faithful to it. We weren’t making up a whole new shot; we were simply enhancing what was filmed.”

Meanwhile, operating out of ILM’s Sydney studio, ILM animation supervisor Stephen King was brought onto the project at the beginning of post-production. “As an animation supervisor, my job is to collaborate with Enrico and Paul to make sure that we’re creating the vision that’s coming from James Gunn and Stef Ceretti,” King remarks. “I helped establish the movement of Superman. The animation department was responsible for enhancing David’s performances by taking away the sense of him being on the rig they had filmed him on. We made certain that his body performance didn’t feel like he was on a rig – that he was actually flying. When I think of Superman, I think of his incredible strength and his ability to fly, so we needed that to appear as real as possible.”

King praises the cooperation between ILM’s various departments, stating, “Our team at ILM’s Sydney studio was in charge of Superman, creating the digital double that would go hand-in-hand and blend seamlessly with David Corenswet’s performance. For shots where we had to do a fully digital version, we wanted to ground it in reality. Our simulation department took care of his cape in every shot that we worked on, making it move and feel authentic. In many of the flying shots, we had to add digital hair because hair is difficult to recreate on stage. It’s either completely flat and doesn’t move, or a fan is placed in front of the actor and affects their performance by causing them to squint. James Gunn entrusted ILM with the title character. Guaranteeing that Superman shone in our work was of the utmost importance for us.”


Boravian Brutality

Damm also homes in on David Corenswet’s hair, referencing the bald cap utilized to film the battle against the Hammer of Boravia. “James Gunn wanted to approach that scene with visual effects to allow us to portray an appropriate amount of speed within the hair and sell how fast these beings are flying. In the Hammer of Boravia sequence, it’s all digital hair. It was a unique challenge because there’s no room for errors. If there’s something off, it would immediately break the illusion,” Damm asserts.

“In terms of the character itself, the Hammer of Boravia was essentially a hard-surface object,” Damm adds. “Since he’s wearing a suit, he was a bit easier than Superman in the sense that we weren’t dealing with flesh. The main challenge emerged when it came to texturing and shading the character, as there’s a significant amount of creative and technical know-how called for to craft the shading response that a metallic object has.”

Damm hopes that audiences are unable to tell which shots necessitated the Hammer of Boravia becoming a digital character, noting, “There was a full-on practical suit in many shots, where the on-set crew filmed him on wires. Certain action beats and acrobatic movements required he either be partially or completely replaced. Even in a handful of close shots, where you might assume the practical version remained, we had to go with the digital version because the story changed after principal photography had finished.”


The Nature of Narratives

As breathtaking as the visual effects of Superman are, Damm and King both emphasize that ILM’s contributions were all done in service to James Gunn’s compelling story. “There’s a sequence where Superman and Lois Lane [Rachel Brosnahan] are deep in conversation, but you have the Justice Gang fighting a giant jellyfish-type creature in the background. We played on the size of the creature so it would be subtle and not moving fast enough to be distracting,” King professes. “Then, when Superman tells Lois that he loves her, the creature spews out all these different colors, and it’s almost like fireworks that enhance the sense of their love and their connection to each other. It’s visual effects aiding in the storytelling, and that’s a credit to James and Stef knowing what they wanted.”

Similarly, Damm highlights the moment the interdimensional rift arrives at Metropolis and begins to split the city, pronouncing, “We were breaking buildings, and there were so many layers of destruction built on top of each other. However, all of that needed to hit precise story beats, meaning the effects weren’t just taking one building and letting it fall into another building. There’s a specific speed and cadence to it that was art directed by Stef and James. Our effects artists received very precise animations of how everything would collapse from the animation department, which were then used to drive simulations.”


A Kryptonian Canine

ILM handled a major portion of Superman’s final battle, as King describes, “We basically worked on everything from when they land in the baseball stadium until they exit the rift at the end of the fight between Superman and Ultraman.” The beloved dog Krypto is a key component in these sequences, and the director utilized his own dog, Ozu, as a template for the heroic canine. “In essence, James’s dog is Krypto. By that, I mean his dog is also very rambunctious and doesn’t necessarily follow the rules all the time. We had various shots where we animated Krypto, so we built muscle and fur systems to make his hair flow appropriately and match James’s dog,” Damm proclaims, mentioning that Framestore built Krypto’s underlying skeleton.

“The reference footage that James sent over was so fantastic, and [Ozu] was such a character,” agrees Kavanagh, who then turns to the shots themselves. “We received Framestore’s shots of the dog animation well before we started on ours, and they gave us a wonderful target to follow. In animation, we’re constantly paying attention to the little things. For example, when the dog’s foot plants, we’re looking at how deep the foot presses against the ground, the squish of the toe pads, the slight spread of the toes, and the angle of the nails. The same goes for how the dog pants and the way its tongue rolls over its incisors. These are all elements that make the character come alive.”

ILM’s contributions to Krypto were concentrated on the dog’s appearances in the climactic battle, and the team recognized how vital these sequences would be. “James puts a lot of thought and love into his digital characters and their performances,” says King, who jokes that he spent time staring at his own dog as part of his research. “Our Krypto sequences were based more on physicality, like when Krypto knocks over Ultraman and starts destroying all the drones, so we got some nice high-energy panting in there that feels very lifelike. As an animator and animation supervisor, it’s the subtle stuff you bring to the character that can make it more realistic, and that’s what we love to do.”


An Epic Engagement

The Engineer (María Gabriela De Faría) and Ultraman stand as other key factors in ILM’s battle scenes. “I loved our time on the Engineer, because she’s got the nanites that empower her to shapeshift and create nanite obstacles that she fights Superman with,” King relays. “It was important to stay true to what they did on set when Superman fought the Engineer, yet give it extra energy.” Comparing the Engineer to a more complex version of the liquid metal T-1000 from Terminator 2: Judgment Day (1991), Damm states that “the Engineer breaks herself down into nanites, so there are really millions of little individual objects that are coordinating to perform a function. Since we see her punching Superman from far away and also get a closeup of individual nanites on Superman’s eye, we constantly adjusted the size of the nanites.”

Damm was on set when Gunn filmed the characters’ engagement in the baseball stadium, recalling, “People were being pulled on wires and landing on mats for protection. While that provides a solid base, you can’t film at the speed that’s required or crash your star actors into the ground! Visual effects had to be added, especially for ground destruction. We all know what it looks like when you rip grass out of the ground. There are many layers to it, and we needed to represent how it separates in a believable way.”

Matt Middleton points to another facet of the “ground” battle, opining, “There was a lot of work in maintaining the continuity of the dug-up trenches. We had to accurately place specific trenches that the characters had previously skidded through into the background of the shots.”

“We thought a lot about ground interaction and how far to stick the characters into the baseball field to demonstrate the force and energy,” Kavanagh concurs. “And when that impact happens, it kicks up dirt, debris, and grass.” Pivoting to what he calls the “up and down” sequence in which Superman and the Engineer ascend into the atmosphere, Kavanagh says, “We had cloud layers for the characters to go through to get a sense of their speed. We also didn’t want to make it too easy for them to move within the heavy wind resistance. We always want to ground our sequences in reality.”

As a clone of Superman, Ultraman’s appearance is tied to the look of the titular hero in a multitude of ways. “Unlike in previous Superman films, Superman’s suit was a little looser,” Damm outlines. “James Gunn has explained that Superman’s mom made his suit, so it’s not something fashioned from super technology or sent from Krypton. Ultraman’s suit was also fairly loose, so we had to go the extra mile. Since we built it based on digital scans taken with David standing in a scanning studio, there were naturally folds present in the cloth. We rebuilt his suit in a way that allowed us to put effects simulations on it, enabling Ultraman’s suit to move properly in the wind and when he was being punched.”


Rumble in the Rift

The interdimensional rift that slices through the city represents another important feature in the third act’s big fight sequence. “It’s unique and almost a living organism,” Damm remembers. “Rick Hankins, who joined Superman early on as the effects supervisor, took on large chunks of that R&D project. We applied various elements into it, such as how metal melts and the crystalline growth of bismuth. We presented hundreds of versions and eventually found a look that was approved.” Matt Middleton adds, “The geometric detail that went into the bismuth was immense, and our goal was to achieve a look that people could believe in, which doesn’t look like a CG fantasyland. Also, the previs that was done by the client was exceptionally helpful because James Gunn knew how he wanted the broad geography of the sequence to come together.”

“In the third act, we spent quite a bit of time inside the rift,” notes Damm. “Being so close to it and having it around us the entire time proved to be very challenging. If the surface qualities of the rift don’t feel like believable metal, the entire sequence falls apart. To represent real bismuth in a meaningful way, we needed ours to have a believable, metallic nature to it, as well as an underlying sheen that goes through rainbow colors. Additionally, when the other dimension opens, and we see a black hole, there’s a ton of heavy effects simulation that goes into having assets breaking and being pulled into the black hole before disappearing.” Kavanagh cites ILM’s insertion of debris elements moving toward the black hole as the foreground cues that keep audiences oriented to which direction is “up” throughout that sequence.


An International Effort

As a global studio, ILM’s work on Superman – which Damm estimates to be in the vicinity of 560 visual effects shots – occurred around the world. ILM’s Sydney studio took on more than half the shots, while ILM’s studios in San Francisco, Vancouver, and Mumbai combined to handle the rest. “In Sydney, we worked hand-in-hand with San Francisco in our respective time zones. I would plan our sequences with [visual effects supervisor] Dave Dalley and Matt Middleton, then we’d get invaluable input from Enrico and Paul,” King explains.

“ILM’s San Francisco studio worked concurrently with the client’s time zone as they were based in Los Angeles,” Kavanagh adds. “The time difference can be tough because Sydney is a day ahead. It’s like you’re time-traveling [laughs]. However, it often worked to our benefit. We could give the Sydney studio feedback on a Friday, and by the time we came in on Monday morning, they already had a day to work on the notes and provided new takes for us to show to the client.”

On a grander scale, King interprets ILM’s international presence as a phenomenal sign for the company’s future, commenting, “I came to the Sydney studio when it opened in 2020, and we started with relatively small jobs. It’s exciting for us to have grown so much that our location can take on the end battle sequence of a big summer blockbuster like Superman. ILM opened up new studios in Sydney and Mumbai within the last six years, plus we have the more established studios in San Francisco, London, and Vancouver. ILM is growing, and the work is turning out to be magnificent.”


A “Super” Success

Considering the project as a whole, King is immensely satisfied with his team’s performance on Superman. “As animators, we put so much thought and effort into everything we do. It’s not just sitting at a computer. We use computers as tools, but we often go and shoot references for the shots that we’re working on, so we can study them. We want to inspire young people to have that same love for movies that we grew up with. ILM is the dream job for myself and many people in our industry. At ILM, no one is too small to give their opinion or voice an idea. We create works that stand the test of time, that we can look back on and be proud of.”

Matt Middleton proudly sees Superman as one of the most complicated projects he’s worked on, summarizing, “There were a number of different challenges, ranging from FaceSwap to stadium destruction, as well as the amount of detail that went into hand-cutting out all the little tears in Ultraman’s suit to match the on-set costume. The hundreds of buildings that went into the city, and the hero buildings that have to be built with internal beams that you can see as they get pulled apart. The complexity goes with the territory in superhero movies, but it’s an incredibly intense amount of intricate work spread across all of our departments.”

“It’s a massive team effort, from the client side all the way down through to the artists and production, but the process of sitting in meetings to work out how we’re going to get the project done is fun,” Kavanagh discloses. “As a supervisor, I’m conscious of hitting deadlines, incorporating changes, and achieving the highest quality look for the client. I also want to make it enjoyable and interesting for the people who are working on the show. Together, we know we’re going to come up with some outstanding ideas and have a terrific time doing it.”

For Damm, one of his favorite moments from Superman encapsulates his appreciation for ILM’s role in the project. “There are plenty of occasions where Superman helps people in this movie,” Damm begins, “But when he saves the woman on the bridge from a falling building, it’s not just that he puts himself in danger to save one person. By showing how important it is for him to save a single individual at any cost, it demonstrates how human he truly is. Afterwards, when it all collapses, he heroically rises out of the ashes. Plus, that shot itself feels as if it’s right out of a comic, with the dust billowing to either side. It was very beautiful to see this shot come together and even more so to see the great reactions from the DC fanbase.”


Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

Go behind the scenes of ILM and LIMINAL Space’s ‘Real-Time Rocket’ activation at Disney Accelerator Day 2025.

By Patrick Doyle

Virtual production supervisor Christopher Jones (left) and creative director Michael Koperwas with Real-Time Rocket in the background.

At Disney Accelerator Day 2025, guests witnessed an extraordinary moment where the line between movie magic and real life all but disappeared. In a collaboration between Industrial Light & Magic and LIMINAL Space, attendees met a live, interactive version of Marvel Studios’ Rocket Raccoon, rendered and animated in real time and presented in fully dimensional 3D.

A Rocket Moment Like No Other

The activation, aptly named “Real-Time Rocket,” invited guests to step up to a large, glowing display, the kind of screen that instantly makes you curious about what’s about to happen. After donning a pair of polarized glasses, attendees found themselves face-to-face with Rocket Raccoon, who wasn’t just playing back a pre-recorded clip – he was there, moving, talking, and reacting live.

Rocket leaned forward, smirked and started chatting with the crowd. He cracked jokes. He called out a few attendees by their outfits. He laughed at his own punchlines. The energy in the room was electric as laughter, awe and a collective sense of “how is this even possible?” filled the air.

“What made Real-Time Rocket so special was seeing a beloved character come to life in a way that felt truly spontaneous,” said Christopher Jones, virtual production supervisor of Real-Time Rocket. “You could see the audience forget they were looking at a screen and, for a moment, they were simply having a conversation with Rocket. That’s the magic we aim for at ILM, where performance and technology meet to make something that feels real.”


Making the Impossible Possible

The “Real-Time Rocket” experience was powered by a creative combination of ILM’s real-time animation technology and LIMINAL Space’s Spirit Screen, an innovative display that uses polarized light to create a 3D effect without the need for headsets or holographic projection. Through this setup, Rocket appeared to step off the screen and into the same physical space as the audience, creating a shared moment of astonishment and delight.

“This activation was a perfect example of what happens when creative teams collaborate across disciplines,” said Alyssa Finley, Real-Time Rocket’s executive producer. “Working with LIMINAL Space allowed us to merge our real-time animation pipeline with an innovative display technology that gives audiences a live, shared, dimensional experience with a digital character that they’ll never forget.”


A Room Full of Wonder

As guests filed through the activation, the reactions were priceless. Attendees waved, Rocket waved back, and many leaned closer to see if there was a trick to it. Guests laughed when Rocket playfully teased them about their love of Baby Groot or asked if anyone had seen Star-Lord around. It was equal parts hilarious and jaw-dropping.

Throughout the day it was clear there was something unmistakably Disney happening. When Rocket waved goodbye at the end of each session, the sense of wonder lingered. For a few moments, the technology faded away, leaving only the feeling that audiences had truly met Rocket Raccoon in person.

“Standing in the room and watching people laugh and talk with Rocket was incredible,” said Michael Koperwas, creative director of Real-Time Rocket. “You could feel the energy shift the moment he appeared. Those moments of pure audience connection are what drive us at ILM to keep pushing the boundaries of what’s possible.”

The ILM crew – plus Real-Time Rocket – assembled on the stage.

For more on Real-Time Rocket, check out ILM’s “Behind the Magic” video from the D23 activation here:

Patrick Doyle is a senior publicity manager at Industrial Light & Magic.

ILM creative director David Nakabayashi, along with artists from ILM’s global studios, including Aaron McBride, Cody Gramstad, Bimpe Alliu, and Chelsea Castro, reflect on the essential role concept art and storyboarding play in the filmmaking process.

By Jay Stobie

Concept art for The Avengers (2012) by Aaron McBride (Credit: Marvel & ILM).

“ILM Evolutions” is an ILM.com exclusive series exploring a range of visual effects disciplines and highlights from Industrial Light & Magic’s 50 years of innovative storytelling.

From envisioning a look for cursed pirates to plotting out space battles, Industrial Light & Magic has an unparalleled reputation for working wonders in collaboration with filmmakers to bring the stories they envision to life. Built on a 50-year legacy of talent and tenacity, ILM’s Art Department has grown into a global hub for generating and executing the ideas that immerse audiences in the worlds they see on screen. In this installment of ILM Evolutions, we’re heading back to the drawing board to focus on conceptual art and storyboarding, as this indispensable imagery fuels the creative process by visualizing a filmmaker’s ideas in the earliest stages of production.

ILM Art Department creative director David “Nak” Nakabayashi sat down with ILM.com to share his insights on the history of concept art and storyboarding, as well as his own first-hand knowledge of the craft. With an esteemed resume featuring iconic films such as The Hunt for Red October (1990), Jurassic Park (1993), Men in Black (1997), Star Wars: The Phantom Menace (1999), A.I. Artificial Intelligence (2001), Avatar (2009), Star Wars: The Force Awakens (2015), and many more, Nakabayashi now oversees art directors, illustrators, and artists across ILM’s global studios. Additionally, artists Aaron McBride (San Francisco), Bimpe Alliu (London), Cody Gramstad (Sydney), and Chelsea Castro (Vancouver) joined the conversation to highlight their careers and the latest developments in their field.

Concept art for Star Wars: The Phantom Menace (1999) by David Nakabayashi (Credit: Lucasfilm & ILM).

Ideas and Intentions

Concept art and storyboards each serve unique purposes in the production pipeline. “Concept art depicts a scene, setting, place, location, robot, spaceship, weapon – they initially come from the script along with a brief description,” outlines Nakabayashi, who emphasizes the extent to which the art helps the crew visualize the project they will be creating. “These are the key beats in the film. Concept art establishes what we’re going to be doing. We’re showing everyone that this is the movie we’re going to be making when absolutely nobody has any idea what it will look like.”

Nakabayashi cites concept artist Ralph McQuarrie’s contributions to Star Wars: A New Hope (1977) as the perfect example of such art having an inspirational impact on a film. “They based everything on his art. It fed everybody’s imagination.” Today, concept artists regularly assist filmmakers as they design and seek green lights for their films. “Sometimes, ILM will do development or spec work, where we take concept art and show the studio what the movie will be with the same intention as Ralph did. We carried that on.”

While concept art focuses on design, storyboards define the action that occurs on-screen. “Storyboards are all about the cinematic motion, the energy of a visual effects shot. That’s why ILM’s Joe Johnston was such a great foundation for this department. He would draw storyboards with arrows that would be compressed in perspective, and you really understood the depth of what he was trying to say,” Nakabayashi notes. Over time, the advent of digital animatics altered the use of storyboards. “We hardly storyboard anymore these days because animatics act as the filler, but it’s the same principle.”

From a broader perspective, Nakabayashi is quick to point to the artistic value of concept art and storyboards. “A pencil drawing, for me, is as powerful as a Ralph McQuarrie gouache painting. I get consumed by the techniques sometimes, and how a person can draw this perfect angle of a little spaceship cruising through the columns of some weird planet. To draw that sequence helps the director make decisions. It’s about visualizing and timing the film before they actually shoot anything, though it’s all changed quite a bit with the whole animatics tool set.”

Storyboards for Men in Black (1997) by David Nakabayashi (Credit: Sony & ILM).

The Importance of Art

“When you look at the scope of what ILM has done,” Nakabayashi says, “obviously Star Wars was a flashpoint for concept art and storyboards because that was the first way of getting creativity into the movie and bringing visual life into the script.”

Of course, the benefits extend far beyond what is seen on-screen. “Art is important for many other reasons,” Nakabayashi explains. “For ILM, it is also about the budgeting process. Historically, the model shop would look at storyboards and concept art and have an idea of what was coming. Production is very budget-driven. ILM would storyboard their sequences, not just for the artistic impression of it, but for logistics and production. That was how a director would communicate with the visual effects supervisor. ‘We’re going to shoot this practical and this blue screen. We can save a lot of money if we do this with miniatures.’”

Nakabayashi boils the work down to its essence, relaying, “It’s about the artists believing that the future is possible and the creation of the cliché ‘Show me something I’ve never seen before,’ which is sort of a byline we usually get from our clients. We can do that because we have the right people – people who take inspiration from the artists who came before them. That’s ILM’s culture of concept art and storyboarding. I’m not a great storyboard artist, but I can communicate and do the work. To me, that is the most important part – communicating the ideas.”

Concept art for A.I. Artificial Intelligence (2001) by David Nakabayashi (Credit: Warner Bros. & ILM).

Communicating Concepts

As an art director at ILM’s Sydney studio, Cody Gramstad (Sonic the Hedgehog 3 [2024]; Lilo & Stitch [2025]) affirms the significance of communication, stressing, “When it comes to being a concept artist, you’re not necessarily there to create artwork. You’re there to clarify and communicate ideas. My favorite part of the job is actually the conversation where I sit down with a bunch of creative people, brainstorm potential solutions to problems, and get everyone amped up as we figure out our creative direction. Painting and visuals are a part of that, but being able to talk, pitch ideas, and get people excited is one of the most important skill sets.”

Gramstad, whose parents were professional sculptors, takes the notion a step further, suggesting that prospective concept artists can bolster their abilities by balancing the dedication necessary to hone their craft with time off for real-life adventures. “Step away from your computer every now and then, have some experiences, meet people, and socialize,” declares Gramstad, placing value on the correlation between communicating ideas and relating to those around you. “It’s a lot easier to work with someone who has gotten out in the world and brings those stories into their artwork.”

Concept art for Lilo & Stitch (2025) by Cody Gramstad (Credit: Disney & ILM).

The ILM Influence

Turning to Industrial Light & Magic’s unique place in the history of concept art and storyboarding, Nakabayashi states, “ILM is special because it all sort of started here. It’s special because of the people who believed and put their foot down – Colin Cantwell, Ralph McQuarrie, and Joe Johnston. There were others on the outside, like Syd Mead and Ron Cobb, all these illustrators who were doing sci-fi stuff, but ILM was the first one to take the visual effects art department and make it something that everybody wanted to be.”

San Francisco-based senior art director Aaron McBride notes ILM’s post-Star Wars permanence as a standout achievement for the company. “Before ILM, visual effects departments would start up for the duration of a film and be temporary. When the film was over, everyone would get scattered. It was almost nomadic,” McBride mentions. “There was a demand for the work that ILM was doing, and ILM was able to advance technologies because it was kept intact.”

By the time of Star Wars: Return of the Jedi (1983), Nakabayashi explains, directors had begun approaching ILM for innovative films like Poltergeist (1982), The Goonies (1985), Cocoon (1985), Terminator 2: Judgment Day (1991), and many others. From crafting concept art that gave those films their “first breath of life” to “feeding production with ideas,” Nakabayashi views the ILM Art Department’s past with distinction. “Historically, we’ve had some of the best concept artists ever – Doug Chiang, Harley Jessup, John Bell, Terryl Whitlatch, Christian Alzmann, and James Clyne. A lot of the artwork that they created determined whether or not a movie was made. That kind of talent, to me, is the greatness of the department.”

As the visual effects art director on The Phantom Menace, Nakabayashi saw connections between his work and that of his predecessors. “We had all this concept art, and part of my job was to bring it into the real world. That’s what the ILM Art Department has always been at the forefront of, going back to the days of Joe Johnston, because he wasn’t even a storyboard artist when he started. He also got into the model shop and built models. He loved making miniatures and setting up the stage. It was kind of the birth of the visual effects art director. We followed along that path. It’s not just doing the drawing or coming up with an idea, it’s implementing it, as well.”


Executing the Ideas

Nakabayashi recalls his experience collaborating with director Barry Sonnenfeld on Men in Black II (2002). “I was tasked to take a trash can and turn it into a killer robot. I liked the idea that it opens up like a flower, and it comes with multiple gun turrets that are not necessarily normally situated in a standard military platform. Maybe it’s more like an orchid. With a few changes, the design went to computer graphics, and I helped develop it in dailies with the modelers, painters, and animators.”

Turning to his time on A.I. Artificial Intelligence, Nakabayashi posits, “Those worlds – Coney Island, the Rouge City, an underwater theme park – were all absorbed through storyboards that Chris Baker did with Stanley Kubrick for a couple years. We started with that as our inspiration, and then we started doing colored artwork – paintings, drawings, some storyboards for shot ideas – and pitched those to [visual effects supervisor] Dennis Muren and numerous other people. It became this whole world of miniatures, and it was also on the brink of the digital component coming into the workflow. There’s a real marriage of practical effects, which I will still say is the most fun to work on, with the digital component.”

Envisioning new worlds still requires references that ground them in reality. For The Phantom Menace, Nakabayashi saw a dry South Dakota riverbed as a perfect reference for the bottom of Naboo’s oceans, proposing a fresh take on how to approach the Gungan city to Dennis Muren. “I go, ‘What about a booming shot? You track over and dip down to see the top of the city as opposed to always looking up. We’re going underwater, right?’” Such insight and inspiration impressed director George Lucas. Nakabayashi touches on the Gungan shield that comes down on the battlefield, continuing, “I had this idea – it was a parasol and an umbrella, kind of like a sprinkler. George loved it.”

Concept art for Men in Black II (2002) by David Nakabayashi (Credit: Sony & ILM).

Turning the Tide

As is often the case with the work ILM tackles, changes manifested for the art department over the years. Nakabayashi indicates Adobe Photoshop – the editing software co-created by John Knoll, ILM’s current executive creative director and senior visual effects supervisor – as technology that revolutionized his field. ILM even dabbled with Photoshop during its earliest days. “With Death Becomes Her (1992), Doug Chiang took plates and drew the effect of Madeline Ashton [Meryl Streep] having a broken neck. He took pictures of people and we altered them into these effects-type things.”

Along with Photoshop’s availability, concept artists continued drawing with traditional tools like pencils, markers, and paper until ILM received the call for Pirates of the Caribbean: The Curse of the Black Pearl (2003). “The director, Gore Verbinski, was like, ‘These are great drawings, but I want to see what it looks like in my film. Don’t give me a pencil sketch,’” Nakabayashi says. The filmmakers wanted a desiccated – but not bloody or gory – aesthetic for the cursed pirates, so Aaron McBride test-photographed beef, turkey, and salmon jerky.

“The turkey jerky felt the best because it scattered a slightly lighter color and was the closest to the right muscle striation texture,” muses McBride, who credits his shyness about speaking up in dailies for driving the process, laughing, “I pushed to do the concept art as photo-realistically as possible mostly because I wanted to be able to point to the art and not have to say anything.” As Nakabayashi explains, “Aaron took a plastic skull, a bunch of costumes, and turkey jerky, scanned it, and put all these textures on the face. This gave Gore a direction for his movie, and it was a huge moment. Everybody was trying to copy Aaron after that. We still drew and did other traditional methods, but all of a sudden, everything had to be photoreal.”

Concept art for Pirates of the Caribbean: The Curse of the Black Pearl (2003) by Aaron McBride (Credit: Disney & ILM).

Another Dimension

“Getting photoreal is a lot easier now with three-dimensional tools,” Nakabayashi adds. “A quick sketch might happen, but a lot of our artists are excellent at building and designing 3D packages. It’s a great transition point from concept art to visual effects work, because of the digital assets.” 

As an intern, one of McBride’s first jobs actually involved developing photos taken for The Mummy (1999) and scanning them into a computer to be painted. He later experimented with 3D on Star Wars: Revenge of the Sith (2005), and then leaned into it while working on the suit-up machine in Iron Man (2008). “I didn’t have the skill set to do mechanical drawings, so I blocked it out in 3D to figure out how some of the panels and other mechanical things moved,” chronicles McBride. For The Avengers (2012), McBride designed a snake-like creature that dropped soldiers into the Battle of New York. Inspired by the Greco-Roman aesthetic crafted by the Marvel art department, McBride envisioned the troop transport “as a Roman galleon, almost like a biomechanical being, which had fins that were like oars.”

“We have a couple artists in the department who are sort of hybrid artists,” remarks Nakabayashi. “They do 3D, 3D animation, compositing, and things like motion graphics. Sometimes, we want to bring a flat, still drawing to life, and you’ll do a quick projection. Making something move is a huge component for success in your pitch meetings. The animatics these days are so good and so accurate that you can’t deny the distinction. They’re more productive than storyboards.”

Concept art for Indiana Jones and the Kingdom of the Crystal Skull (2008) by David Nakabayashi (Credit: Lucasfilm & ILM).

A Generational Shift

Having contributed to a variety of ILM projects as an art assistant, Chelsea Castro is now making her mark on the next wave of ILM shows as a junior concept artist at the Vancouver studio. Castro, who finds inspiration for her art in everything from books to video game soundtracks, strikes a balance between traditional methods and cutting-edge techniques. “Coming from a 2D background, I tend to sketch as much as I can, and then move on to photobashing [combining photographic and CG elements together into a new piece of art] and texturing,” Castro shares. “Afterwards, I fully build out as much as I can in the 3D software that I’m using. Then I go back into 2D to tweak everything and finalize them.”

Castro sees a fundamental evolution when it comes to storyboarding, explaining, “I feel it has gotten a lot more immersive. We still do the classic line art, but now that you can build whole 3D worlds, I’ve seen storyboards done completely in 3D. Sometimes the artists take their scenes and show it to the client with different cameras set up, like it’s their own film set. The game has changed, but the spirit of it is very much the same,” Castro concludes. “Brainstorming and getting ideas out is great with the new technologies. The refinement is where you fall back on your foundations, techniques, and the skills that you’ve built up.”

Cody Gramstad adds, “3D gets you closer to real-world accuracy. Inherently, as we’re hand-painting things, we have a tendency to make artistic cheats. It’s not necessarily a bad thing in illustration, as we can push that to enhance emotion. But, especially in a live-action context, reality is what makes the world believable – 3D is very useful for that because it takes calculations from the real world. For example, how lighting actually bounces off of different surfaces.”

Bimpe Alliu from ILM’s London studio observes that increased accessibility to 3D software among young people is as vital as the technology itself. “I’ve mentored teenage students who are learning 3D, picking up software like Blender, and learning to model and sculpt,” Alliu details. “Regardless of the gradual transition from hand-drawn paper storyboards to digital storyboards, as well as individual artists’ preferences for 2D or 3D drawing, a combination of those skills is always used to do the work to the best of your abilities,” Alliu asserts. “More people are using different techniques in order to bring together their storyboards. It’s harmonious.”

Concept art for a company holiday card by Chelsea Castro (Credit: ILM).

A Global Approach

As the ILM Art Department’s creative director, Nakabayashi embraces the modern tools bringing our world closer together as he oversees and collaborates with artists at ILM’s international studios. “I’m very much hands off, and I let the artists do their job,” opines Nakabayashi, who jokes, “When something goes wrong, I get called upon.”

For Cody Gramstad, being an art director in ILM’s Sydney studio means handling multiple shows simultaneously. “I meet with visual effects supes and give guidance for the shot sequences and how they’re progressing, and at the same time, provide feedback to the Sydney art department team to guarantee they are targeting the supervisors’ and directors’ goals.” Gramstad points out that the process is often a worldwide effort, regularly involving colleagues at ILM’s other global studios. “We support each other and make sure that we’re getting the work done at the level we need to. Nak and [director of art and development] Jennifer Coronado make certain that standards are equal across the different studios.”

However, informal conversations are just as productive. “There are a lot of art posts and chats. Keeping people inspired becomes really important, and it’s great to have artists around you that can contribute to that. Sometimes, we do design competitions, too,” Gramstad proclaims. “The art directors also sit with the artists every couple months. We break down where we can improve and how to adapt our approach as we move forward on future tasks. There are so many different shows across the world, so they’re all learning different lessons. It takes direct communication to make sure those lessons get spread to all of our studios.”

Concept art for Lilo & Stitch (2025) by Cody Gramstad (Credit: Disney & ILM).

Timelines and Tasks

With video calls continuing to bring our world closer together, ILM’s concept artists are able to communicate with clients and take on projects across multiple continents while working from their respective spaces at ILM’s global studios. This ability allows artists to be flexible in terms of their involvement on any given series or film. “Sometimes, we can be on a show for a day,” says Bimpe Alliu, who estimates that the longest time she spent on a project was her two-year tenure on The Marvels (2023).

Similarly, the timeline is naturally impacted by the stage at which the artists are brought on. A fan of anime who started out by drawing her friends as Dragonball Z characters in her youth, Alliu elaborates on the depth of her tasks, advising, “It can be pre- or post-production. We can be working on plates or creating assets for ourselves. With a recent character design, I was given the previs model and a scan of the actor, so I took those, mishmashed them together, and then detailed the clothes on top of that.”

Watching Iron Man inspired Alliu to pursue her current career, so working on WandaVision (2021) was a full-circle moment for the self-described “massive nerd.” “For the sequence where The Vision is disintegrating, I was designing what the disintegration effects would look like. Not just the space, but also The Vision himself,” Alliu recounts. On ABBA Voyage (2022), Alliu was brought on so early that she “was designing what the room where ABBA themselves would be recording and filming all their motion capture stuff would look like. I even designed baby dragons for a TV show called Lovely Little Farm (2022). They made them into little maquettes, so that was the first time that anything I designed got made physically.”

Concept art for Ant-Man and the Wasp: Quantumania (2023) by Bimpe Alliu (Credit: Marvel & ILM).

A Legacy Earned from Lessons Learned

Despite all the changes that have transpired with concept art and storyboarding over the last half-century, ILM’s history and prestige set it apart as it moves into the present and looks to the future. “ILM has a support structure and legacy that a lot of other studios don’t have. ILM can nudge newer people in the right direction as they learn the lessons that their predecessors have discovered in the past,” Gramstad reveals. “There’s also the sheer amount of variety at a place like ILM. Since so many film studios come to ILM as a source of visual effects experience, we get a huge range of projects. So, more so than any other studio in the world, I think that ILM allows people to be super versatile. One morning, we’ll be working on animated silliness with Sonic the Hedgehog, and two hours later, we’re doing a grounded oil rig on an ocean that has to be absolutely photorealistic.”

ILM’s academic aura benefits its up-and-coming and veteran personnel equally, as Bimpe Alliu resolves, “You don’t have to be the finished article. You’re going to constantly grow, and ILM is always looking for potential. It helps when you’re around people that you can learn from.” Chelsea Castro beams, “At ILM, you feel so included, and everyone shares their time with you. It’s amazing to have access to all these people around the world.” Aaron McBride, who has been with ILM for 27 years, praises ILM’s multi-generational nature for making him a more well-rounded artist, concluding, “New techniques can inform older ones, and older techniques can inform new ones. I’m inspired by what younger artists are doing, and I think it’s important not to dismiss any aesthetic because it’s new to you.”

Concept art for The Sandman (2022-25) by Bimpe Alliu (Credit: Netflix & ILM).

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

The multi-sensory experience is on view in London through February 22, 2026.

Industrial Light & Magic has partnered with Somerset House in the United Kingdom for the major new exhibition, Wayne McGregor: Infinite Bodies. The series of multi-sensory choreographic installations, performances, and experiments was directed by acclaimed choreographer Sir Wayne McGregor and invites visitors to experience the intersections of dance, art, and technology.

“Wayne McGregor: Infinite Bodies at Somerset House is an ambitious exhibition that explores intersections of the body, movement, and technology,” explained Dr. Cliff Lauson, director of exhibitions at Somerset House. “We’re delighted that the exhibition has provided the opportunity for Wayne to collaborate once again with the ILM team on the new commission OMNI (2025). This large-scale, screen-based film is the first work visitors see in the exhibition and is a spectacular introduction to beautifully-rendered dance presented at life-size. We’re deeply grateful for the creative talent and technical expertise that ILM has brought to this landmark artistic project.”

A final render created by the ILM team.


“Wayne came to us with a challenge,” explained ILM visual effects supervisor Matt Rank. “We had previously worked with Wayne on ABBA Voyage, and he wanted an opening piece for his new exhibition. Initially, that was all the brief we had, so we set about researching Wayne’s previous work, the history of Somerset House, and exploring how we could take digitised human movement, and bring it to life as part of his exhibition.”

Rank further explains that the concept of “looking at bodies internally” became the focus of their presentation. “It’s something that Wayne’s traditional work is unable to visualize, so we thought it would be fascinating to start blending the different forms that our internal skeleton, nervous system, and capillaries took, and how they could appear when formed from less naturalistic materials, and lit from different angles.”

The equally dazzling and thought-provoking results are now open for visitors to see from October 30, 2025 to February 22, 2026. Head over to SomersetHouse.org.uk to learn more, and watch ILM.com for more behind-the-scenes insights about Infinite Bodies.

A final render created by the ILM team.

The ILM creative director and Jurassic‘s production visual effects supervisor talks dinosaurs and collaborating with Gareth Edwards.

By Mark Newbold

(Credit: ILM & Universal).

Jurassic World Rebirth (2025) has grabbed global audiences by the hand and pulled them back into the savage world of the Jurassic film series, three years after Jurassic World: Dominion (2022) completed the second Jurassic trilogy. Rebirth has also changed the direction of the franchise, focusing on the genetic heritage of the incredible dinosaur creations.

Taking place on the fictional island of Île Saint-Hubert in the Atlantic Ocean, Rebirth shows the terrifying cost of unchecked genetic manipulation as we meet familiar creatures, including the armored Ankylosaurus, the chicken-sized Compsognathus, the crested, acid-spitting Dilophosaurus, the aquatic Mosasaurus, the F-16-sized Quetzalcoatlus, and, of course, Tyrannosaurus rex (albeit a much beefier one than the classic Rexy).

Along with these classic creatures are new franchise stars Scarlett Johansson as Zora Bennett, Mahershala Ali as Duncan Kincaid, Jonathan Bailey as Dr. Henry Loomis, and Rupert Friend as Martin Krebs. Together, they encounter distinctly unfamiliar dinosaurs, including the enormous Distortus rex, the towering Titanosaurus, and the horrifying Mutadon. It’s a one-way trip for anyone visiting the island – but in the capable hands of director Gareth Edwards (Rogue One: A Star Wars Story [2016] and The Creator [2023]) and visual effects supervisor David Vickery (Jurassic World: Fallen Kingdom [2018], Jurassic World: Dominion, and Mission: Impossible – Rogue Nation [2015]), it’s exactly what was needed for a pulse-pounding adventure in the grand Jurassic style.

Director Gareth Edwards, a frequent ILM collaborator (Credit: Universal).

ILM.com had the chance to sit down with David Vickery to discuss the visual effects of Jurassic World Rebirth. We started by looking at how the visual effects field is viewed today compared to a decade ago, when Jurassic World (2015) broke box office records and made the world stare in awe at dinosaurs all over again.

“The visual effects industry has grown in many ways,” says Vickery. “There’s a lot more trust placed in us than there used to be, even 10 years ago. Back then, visual effects were seen as something of a necessary evil – but now we find ourselves much more readily accepted as a department on set. Nowadays, other departments – whether it’s hair, makeup, costumes, special effects, stunts – rely on visual effects to guide how to film something because they know how vital it is that the visual effects work properly at the end of the process. They want to make sure what they’re doing is conducive to how we’re going to work, and it always used to be the other way around, so that’s been a refreshing change.”

On the surface, the art of visual effects may appear to be made up of equal parts skill, ingenuity, knowledge, and creativity, but the field also requires a healthy dose of collaboration, as Vickery explains.

“Over the years, I’ve found myself not trying to figure out how to do visual effects but rather figure out how not to do visual effects, and, as best I can, enable people on set to get what they need. You rely on the expertise of the crew. The camera operators and special effects technicians might have been in the industry for 30 years, and at ILM, we’ve got a bunch of talented artists that are generalists by nature, so that level of trust in visual effects has definitely grown.

“There’s a narrative in the press about how everything is done in-camera,” Vickery adds. “Well, yeah, everything is shot in camera because you can’t ‘shoot’ visual effects. What you’re trying to capture is as many practical things on set as you can because you can’t go back and get it in post-production.”

(Credit: ILM & Universal).

A veteran of three Jurassic adventures (three and a half if you include the 2019 short, Battle at Big Rock, directed by Colin Trevorrow), Vickery has worked with three directors (Trevorrow, J.A. Bayona on Fallen Kingdom, and Edwards), and that means differing styles and methods in bringing the dinosaurs to life.

“I find it interesting, the experiences I have with crews, directors, and producers who want to make their films in different ways,” he explains. “Colin relied heavily on animatronics for his films [Jurassic World and Jurassic World: Dominion]. Gareth is much more comfortable with visual effects and wanted a consistency in his aesthetic by relying on effects for all the creatures and dinosaurs. On top of that, there’s a layer of what’s fashionable in movies at the moment. For a while, it was very ‘in’ to be shooting on green screen, or it was fashionable to use animatronics, and that’s what the public wanted to see. Now there’s a desire to see things filmed on location, and there’s an acceptance of visual effects, so filmmakers respond to that in the way they make their films. It’s interesting to see the evolution in how things are done.”

The style and flair shown by Gareth Edwards in his previous films – and his obvious affection for giants, as evidenced by Godzilla (2014) and Monsters (2010) – led producer Frank Marshall and executive producer Steven Spielberg to offer him Rebirth’s directorial seat in early 2024. And with that came a rare skill set for a major franchise film: a vast working knowledge and understanding of visual effects.

“Gareth’s a very distinctive filmmaker,” explains Vickery. “He comes from a visual effects background, so he truly understands how things work. He’s the type of filmmaker that creatively evolves his thought process as a project develops, so he’s totally happy to change his approach and defer some decision-making to later down the line. Visual effects is a great opportunity for him to do that, but he also likes reacting to natural things that happen on the day.”

(Credit: ILM & Universal).

Along with his knowledge of effects, Edwards is also known for his guerilla filmmaking style, something Vickery would learn more about from an Oscar-winning special effects supervisor.

“We worked with Neil Corbould [special effects supervisor for 2000’s Gladiator and Edwards’s The Creator, among others] on Rebirth,” Vickery says, “and I spoke to Neil beforehand because I was trying to find out what Gareth is like, and he said Gareth shoots really long takes. I’d heard of 20-minute or 30-minute takes on Rogue One, but how do you plan for that? How do you rig special effects knowing that Gareth’s going to roll for 20 minutes? Neil said they put a load of stuff out there – loads of mortars, loads of pots, loads of bangs, loads of fires and squibs, and they fired them off. They gave him something to work with, and Gareth reacted to that. Gareth wants to be in the real world, to react to what’s in front of him, and then capture the best version of that.”

That drive to find the best moments, to allow the actors to add their own essence to their characters, and to rarely say “cut,” extends to the visual effects realm as well, where Vickery found Edwards was every bit as open to allowing ILM to find those moments.

“He’s very open with his creative briefs and gives ILM a lot of creative flexibility in how to work,” says Vickery. “He doesn’t look at something and say, ‘What’s wrong with this?’ At ILM, we look at something and try to understand how we can make it better, and I think that’s why we stand out in the visual effects field. We’re trying to figure out how things can be made better, and Gareth does the same. He looks at something and in his head it’s a 7 out of 10, but what do we have to do to make it a 10 out of 10? What do we have to do to make it an 11 out of 10? He’s always going to wonder what would happen if we pushed it a little bit more. Does it break, or is it better? He doesn’t want to leave any creative opportunities on the table.”

Edwards’s naturalistic style formed the bedrock of the film, giving Vickery and the ILM team an opportunity to do things differently, opting for realistic substance over easier, stylistic options.

“Gareth said early on that he never wanted to get into a situation where the dinosaur walks into shot, strikes a pose, and roars. That feels staged,” Vickery says. “When you photograph animals in nature, they do whatever they want, so there were a few golden rules that he gave us. We should never animate a dinosaur unless we had reference of a real animal to use. It didn’t have to be doing exactly what the dinosaur was doing in that moment because you can’t find real dinosaur animation reference, but it could be something like a large animal looking scared or startled. Gareth said if we do that, he wouldn’t question whether the animation or the intent of the performance was correct, so he was very good at not micromanaging.”

(Credit: ILM & Universal).

Another advantage of Edwards’s understanding of the effects tool kit was that it gave Vickery and his team a framework to build the film around before the work began.

“He would construct an edit for us, but because he understands visual effects, he also understands the possibilities of what the shots can be,” notes Vickery. “We often found that the first time you watched them, it was difficult to understand how he wanted the performances of the creatures to play out, but you started to work on it and put it together, and suddenly we were like, ‘Oh yeah, that really works, the timing here’s really good.’ That’s how his visual effects background plays to his strengths because he can see the finish line much more clearly in his head than most other creatives.”

With all the advantages of a director understanding one of the key elements of the production, the process of building the film forged on.

“The first process we go through is laying the shots out and blocking in very basic key frame animation,” says Vickery. “That process takes a long time because it’s all about getting the composition of the image correct. In post-production, 60% or 70% of our time was spent on layout and animation, and then the rest – composites and lighting – was relatively quick. Gareth’s a great cinematographer in his own right, so he’s able to see when the composition of an image works well, and then ILM takes it from there.”

The presence of a T. rex is a Jurassic tradition, dating right back to the 1993 original and through every iteration since. While the Tyrannosaur isn’t always the “star” of the film – as in Joe Johnston’s Jurassic Park III [2001], which introduced Rebirth star, the Spinosaurus, or Jurassic World’s Indominus rex, Fallen Kingdom’s Indoraptor, and Dominion’s Giganotosaurus – the queen of the lizards remains ever-present. With Rexy, the original T. rex from the first six films, absent from Rebirth, her starring role went to a new, even more terrifying Tyrannosaur. The new star appears in a sequence inspired by a scene in Michael Crichton’s original 1990 Jurassic Park novel where Alan Grant and John Hammond’s grandchildren, Tim and Lex Murphy, raft their way back to the main complex on Isla Nublar. It presented more than a few challenges.

(Credit: ILM & Universal).

“The T. rex boat inflation scene was an idea Gareth had really early on,” Vickery says. “We looked at it thinking, ‘How do you inflate a raft, have it pop up, and then like a magic trick, make the T. rex vanish?’ On the day, we had a special effects raft that inflated, but it didn’t fit into the bag, so it was laid out and they popped it up, and it inflated and fell into the water, but it didn’t stand up on its end. It took much longer to inflate, so that was always designed to be a reference for us, and then our effects artists worked on that and created this wonderful piece of dynamic simulation that had to be choreographed as a piece of animation to feel naturalistic, slowly dropping into the water. We spent a long time on the simulation of the raft. As a supervisor, that was a thing of absolute awe-inspiring wonder for me. The artist that worked on that simulation did such an amazing job because it’s an incredibly complex piece of dynamics and timing. We ended up with a subtle piece of animation where, just as the raft is inflating, you start to see the T. rex moving and start to get up.”

Bringing this adaptation of a classic Jurassic scene from the pages of the novel to the big screen required some ingenious thinking, especially given the presence of a sleeping T. rex and a noisy inflating raft.

“We played around with the T. rex a number of times because we had to walk a very thin line,” notes Vickery. “If you thought the T. rex was awake, you’d wonder why it didn’t see the girl and eat her, so it had to look like it was asleep, but not so fast asleep that it wouldn’t have the ability to wake up and move off.” Timing was everything. This newer version of the T. rex wasn’t as simple as reskinning the existing T. rex asset; this required a completely new “build” taking into account the larger frame and bulk of this Tyrannosaur (to say nothing of its ability to swim). Vickery has nothing but praise for the team that worked on the project.

“The creature department has a brilliant understanding of anatomy; they could be biologists. We had a modeler on Jurassic World: Dominion who trained as a palaeontologist at university, but it’s more than just figuring out how its muscles should jiggle and how its skin should wrinkle. That gets you to the equivalent of a shop mannequin version of a dinosaur. The challenge is to imbue character into the creature, so it feels unique amongst its own species.” 

“Gareth would say, ‘I want to see 100 T. rexes perform and choose the best one,’” Vickery continues. “He wanted the Robert De Niro version, not the shop mannequin, so how do we imbue that kind of character into it? Part of it was to go back to the animation reference, so you really understood the creature’s intent, and part of it was making sure it’s performing in a naturalistic way.

“Gareth explained how, if you block a sequence with an actor and the brief is you come in the door and you sit down at the desk and you pick up the pen, then the person who’s the stand-in for the day will walk in the door and sit at the desk and pick up the pen,” Vickery adds. “But when the actor comes in, they’ll walk in the door and they’ll give him a mean look and they give it the De Niro treatment and you get a real performance. We always look for that level of performance, and that goes all the way back down to the anatomy of the creature. Do you know where its muscles are firing to give tension in the neck or in the legs? Gareth was interested in things that a creature would have that weren’t preserved in the fossil record, so that gave us creative licence to add extra fat layers or muscles, or a wattle under the neck or flaps of skin in different places that would help give it character, which the shop mannequin version wouldn’t have had.”

(Credit: ILM & Universal).

In Jurassic World Rebirth, there’s an extra layer to the story of the dinosaurs; alongside the “classics,” there are genetic mutations, creatures created while striving to find the perfect mix of DNA – both biologically ancient and contemporary – to create the attractions demanded in the parks. As Vickery explains, these creatures were never intended to be seen by the public.

“It’s not like Jurassic World, where they were trying to create attractions for the park. These are failed experiments to create truer genetic dinosaurs. Before they figured out the exact strands needed to get a Velociraptor, they didn’t get the combination right, so that’s how we got the Mutadon. You’re supposed to feel a bit sorry for these creatures. It’s like Sloth in The Goonies [1985], initially terrifying, but then you feel really sad for him by the end of the film, and he becomes a hero.” 

Vickery adds with a smile, “I’m not sure you feel that sad for the D. rex, but you do understand that it’s biologically limited. It’s got this huge encumbrance on its head. It’s heavy and weighty, and that means it can’t run really fast.”

In addition to the raft and the T. rex, there are plenty of other visual effects in the river sequence. “When visual effects are successful, people don’t notice them,” says Vickery. “The raft, the grass, the tree, and the land mass that the T. rex was on were entirely digital; it wasn’t shot on location. The thing is, no one’s going to look at it and go, ‘That was a visual effect,’ so it doesn’t get the credit it deserves, and that’s something I think about a lot. When you’re so close to a film and you’re working on every single component of it, you inherently know what’s a visual effect and what’s not. You hope to get to a place where people don’t realize what a visual effect is, but you’ll never fool them with a T. rex, right? It’s hard to know where to place your emphasis when you’re discussing or promoting work, and it’s hard to know where to draw people’s attention because I don’t know what you or the person sitting next to you understand to be visual effects or not.”

Another unseen visual effect is water, of which there is plenty in Jurassic World Rebirth. From the Mosasaurus attacking Duncan Kincaid’s boat to the T. rex attack on the Delgado family (Manuel Garcia-Rulfo as Reuben, Luna Blaise as Teresa, Audrina Miranda as Isabella, and David Iacono as Teresa’s boyfriend, Xavier Dobbs), water is a constant presence in the film. And the marriage between the real-world filming locations and the pixels of ILM required some heavy-duty work to succeed, as Vickery explains.

(Credit: ILM & Universal).

“We spent six weeks in Malta shooting the boat sequence, and the cast were on the water for two days of those six weeks. I took a drone out with the second unit and did aerial photography for another three or four days, so probably 85% of the 300 shots in that sequence were shot on dry land. There are very few shots which don’t have some element of digital water in them, even the shots that were filmed at sea. Perhaps it’s a little more obvious when there’s a huge dinosaur thrashing around in it. But the fact that only 10% to 15% of those shots have real water in them is another thing that audiences may take for granted when they’re watching the film.

“Our effects department, led by CG supervisor Miguel Perez Senent, started development work on the water simulations when we were still in pre-production,” Vickery continues, “so we had a good six-month run-up to it because there are 50 pages of script that take place on the ocean. So we always expected that to be the biggest technical challenge we had on the entire film.”

That massive undertaking required new solutions to work. “We built new water solvers in Houdini [3D visual effects software] to help with the white water, the spray, and the secondary and tertiary splashes as the creatures break out of the water, but it was a massive data management issue because the simulations were throwing huge amounts of data around. One of the sims had over 5 billion points of white water spray and splash, so Miguel developed some really clever techniques and tools to help us identify and break those simulations up into regions to make the caches and the sims more manageable.”

The technical aspects of the process are groundbreaking, exactly what audiences and followers of ILM have come to expect over half a century of innovation, but the glue that seals the effects to the physical action requires an artist’s touch.

“Beyond the technical side of it, there’s the visual artistry of being able to blend and match the look of water in Thailand, where it’s slightly greenish-tinted water where you can see through to the rocks and the coral beneath the surface, or the slightly deeper, bluer waters of Malta,” explains Vickery. “And then being able to make sure we’re matching all the different lighting conditions that we had throughout the time we were shooting in Thailand and Malta, on the tank, and on the stage, and all the while trying to live up to Gareth’s standards of cinematography and lighting.

“We had John Mathieson on the show, arguably one of the finest cinematographers alive today,” Vickery adds, “so we’re trying to match our work to the best in the world, whether it’s cinematography or special effects. The artists at ILM stand toe-to-toe with all of those departments.”

(Credit: ILM & Universal).

To evoke the look and feel of the 1993 original, Gareth Edwards chose to film Jurassic World Rebirth on 35mm film using Panaflex Millennium XL2 cameras and vintage C and E Series anamorphic lenses from Panavision, closely following the equipment used by Steven Spielberg over 30 years ago. In days past, such a decision could have caused issues, but decades into the digital age, Edwards’s choice was purely aesthetic.

“I’ve gone back and forth between digital and film with the projects that I’ve worked on at ILM and elsewhere,” says Vickery. “Jurassic World: Fallen Kingdom was fully digital, Avatar: The Way of Water [2022] was digital, Mission: Impossible – Rogue Nation [2015] was shot on film, and this Jurassic was shot on film, so I don’t really have any skin in the game as to which one I prefer because it’s the difference between painting on a wall or painting on a canvas. There’s a texture that’s unique to film, which I really enjoy. There’s the grain, the emulsion, the chromatic aberration, the distortion, the shallow depth of field. It helps you bed the visual effects into something that feels real. Gareth wanted the aesthetic of the original Jurassic Park, not the narrative or the characters, just the aesthetic. He wanted it to be as if Universal had gone into their film archives and found something they shot 30 years ago. It felt quite nostalgic at times.”

With a career spanning well over two decades and a role as creative director of ILM’s Mumbai studio, one could be forgiven for thinking David Vickery knows all there is to know in his field. But in an arena built on innovation and creativity, he continues to learn from others and add that knowledge to his own, including from Rebirth’s director.

“Gareth would say, ‘Don’t be afraid to try new things,’” the visual effects artist concludes. “When we started designing these creatures, his remit was to do little pencil sketches, so if you give him 13 ideas for a dinosaur, he’d be upset if seven of them weren’t so entirely stupid that we couldn’t use them because we hadn’t pushed the envelope far enough. He didn’t want 13 really safe ideas because we would look back and think, ‘What if we’d pushed it a bit harder?’ It’s much easier to dial back something crazy and make it truly excellent than it is to force something average to be ‘good enough.’ The flip side of that is oftentimes on a film you’ll come up with an idea and you push it harder and harder, and you try and try, but it doesn’t work, so it’s also about knowing when you should tear it up and start again. I feel like I learned a lot from Gareth.”

Vickery pauses for a moment. “He did an amazing job on Rogue One, so if he ever does another Star Wars, I’m in.”

(Credit: ILM & Universal).

Jurassic World Rebirth is available to stream on Peacock beginning October 30, 2025.

Mark Newbold has contributed to Star Wars Insider magazine since 2006, is a 4-time Star Wars Celebration stage host, avid podcaster, and the Editor-in-Chief of FanthaTracks.com. Online since 1996. You can find this Hoopy frood online @Prefect_Timing.

ILM.com is showcasing artwork specially chosen by members of the ILM Art Department. In this installment of a continuing series, three artists from the San Francisco and Vancouver studios share insights about their work on the 2025 Netflix production, The Electric State.

Supervising Art Director Fred Palacio

The primary goal for this piece was to expand the universe of the book by introducing a new character type of “humanoid drones.” A secondary objective was to give the drone a distinct identity — customizable much like how we personalize our phones. 

We started with some existing designs but needed to push further to keep the spirit of author Simon Stalenhag. Take Wolfe, for example: it features numerous subtle details and paint choices that lean into a biker aesthetic, ranging from purely decorative elements to practical tools. Its unique color scheme further pushes this visual language. In contrast, the Security drone is designed to feel more industrial and less customizable. One of the most challenging aspects was the head design, which needed to resonate with the book’s visual style. 

Designs should always speak to the client and audience alike, while staying true to a consistent visual language. The design must strike a balance between distinctiveness — so the character can be recognizable in any lighting situation — and coherence with the established universe. That struggle is a challenge we love. Our job is to embrace it and transform it into an achievement, where a concept piece becomes a living character on screen.

Senior Art Director Alex Jaeger

These were the outcast Scav bots. They were also an attempt to restore the creepier, more somber tone of Simon Stalenhag’s artwork. The goal was to design a series of bots that were “collectors.” Each had a theme, and that theme drove the design and personality of each one.

The process for creating these was to first block out some overall shapes for the proportions, so that they would read in the shadows since these would mostly be seen at night. Second was to block them out in 3D and begin the themes of each, adding more complexity with each pass. Then came the balance of tone. I created cleaner, brighter versions, then dirty and desaturated versions, then darker, moody versions. The end product was a mix, with the colors only coming through brighter around the heads to draw the eye, and the rest faded off. There were also several passes for the amount of cables and adjustments of scale. This final version used the midway level of cables and the adjusted scale.

When it came time to do the dirty textures, I found great inspiration in abandoned theme parks, seeing how some painted graphics faded while others almost got better with age, gathering great patinas even over brighter colors.

Senior Concept Artist Kouji Tajima

They are two robots from the mall, named Tacobot and Pianobot. They were probably robots that worked in the mall’s restaurants or music stores. The biggest challenge was figuring out how to get the style as close as possible to Simon Stalenhag’s. To do that, I studied the kinds of parts and cables he uses in his paintings.

When I was designing Tacobot, I studied the textures of a lot of advertising figures and signs from real taco and hamburger shops. Because of that, I gave the taco ingredients on its body — like the vegetables — a painted look instead of a realistic texture.

Normally, I don’t do any sketches. I go straight into 3D software and develop a few different ideas. In addition to this version, there was also one with a face and a mustache in the middle of the taco!

See the complete gallery of concept art from The Electric State here on ILM.com.

Learn more about the ILM Art Department.

Watch The Electric State on Netflix.

Drew Struzan’s art became part of filmmaking mythology, from Star Wars posters to ILM’s classic emblem.

Drew Struzan distilled movie magic into a single iconic image, often the one audiences saw first. The celebrated creator of hundreds of one-sheet movie posters that blended classic portraiture with cinematic montage passed away October 13 at age 78.

For decades, the renowned artist and Industrial Light & Magic moved in the same creative orbit — ILM conjuring the visual effects that brought cinematic worlds to life, and Struzan capturing their spirit in paint, in turn inspiring generations of ILM artists.

Struzan was instrumental in helping define ILM’s early identity. Working from a design by ILM matte painter Michael Pangrazio, Struzan hand-painted the company’s first logo, depicting a tuxedoed magician conjuring a spark of light, framed by a large gear bearing the letters “ILM”. Struzan’s painting was a perfect visual metaphor for what ILM represented: the fusion of artistry and technology, imagination, and precision.

The font style used in ILM’s current logo, revealed in 2023, was closely inspired by the typography in Struzan’s painting.

Struzan’s logo became shorthand for creative excellence. It appeared on letterhead, production slates, and crew gear, signaling that the work within carried ILM’s signature blend of craft and wonder.


His association with Lucasfilm began in 1978, when he collaborated with Charles White III on a special re-release poster for Star Wars: A New Hope. The “circus-style” artwork — with its layered texture and weathered look — became a favorite among fans and collectors, launching Struzan into a long association with Lucasfilm and, by extension, ILM.

Over the next three decades, Struzan’s brush defined the visual identity of films that spanned the Star Wars and Indiana Jones series, Back to the Future, E.T. the Extra-Terrestrial, and many more. His posters for Star Wars entries The Phantom Menace, Attack of the Clones, and Revenge of the Sith threaded prophecy and tragedy through glowing color and emotional expression. Each piece was created by hand: sketches on gessoed boards, acrylics and airbrush for texture, and Prismacolor pencils for final detail.

In the same way his movie posters gave films an emotional face, Struzan’s ILM logo gave the company one — a timeless emblem for the artists who turned imagination into illusion.

Read more at Lucasfilm.com.

In a special video, the senior compositor examines moments from a number of iconic films.

Industrial Light & Magic senior compositor Todd Vaziri recently joined Vanity Fair to break down classic visual effects shots from productions including Rogue One: A Star Wars Story (2016), Dungeons & Dragons: Honor Among Thieves (2023), Star Wars: Skeleton Crew (2024), Star Trek Into Darkness (2013), Transformers (2007), and Star Wars: The Force Awakens (2015).

“There’s a perception out there that digital effects are a black box, that it just gets shipped off and the directors are just handed this work,” Vaziri comments at one point in the video. “[That] couldn’t be further from the truth. We work directly with filmmakers to achieve their vision.”

Watch the full video at this link or above. And hear more from Todd Vaziri on Lighter Darker: The ILM Podcast.

Read more about Beyond Victory on StarWars.com.

ILM’s new Mixed Reality Playset is available on Meta Quest 3 and 3S headsets.

From the team that brought you Vader Immortal and Tales from the Galaxy’s Edge comes ILM’s next bold chapter in interactive Star Wars storytelling: Star Wars: Beyond Victory – A Mixed Reality Playset. Now, you can start this adventure yourself on the Meta Quest 3 and 3S headsets.

Set during the Reign of the Empire, Beyond Victory introduces players to an original story that blends the thrilling world of podracing, a stellar cast, powerful narrative and mixed reality play. To celebrate the launch, we sat down with Beyond Victory’s director, Jose Perez III, for an exclusive behind-the-scenes look at the creative vision, development journey and personal influences that shaped this experience.


Let’s start with the basics. For those who haven’t heard yet, what is Star Wars: Beyond Victory – A Mixed Reality Playset, and what makes it different from previous Star Wars experiences?

Star Wars: Beyond Victory is a mixed reality playset that gives you three ways to experience the fun of the Star Wars galaxy. We have Adventure mode, a short story about an up-and-coming Podracer who’s struggling with grief and the desire for fame. Then we have Arcade mode, which is a really fun, replayable experience that gives you a taste of old-school arcade games with a new mixed reality (MR) twist. And finally, we have Playset mode, where you can literally bring your favorite Star Wars toys to life, scale them up, and have them interact with each other. This is our first time experimenting with mixed reality at this scale, and we wanted to mix it up, no pun intended.

Can you tell us your role in bringing this project to life?

I am the director of Star Wars: Beyond Victory. My job was to work with all the talented artists, programmers, designers, writers, and actors to help bring this experience to life. I get to wear a lot of different hats over the course of development, which keeps me really excited. I came up with the original story, and then worked with our writers and the story team to help flesh it out and give it texture. In this role, I assisted with casting, and I directed the performance capture and voiceover. I was there every day working with the designers, production designers, and artists to help shape the look of the experience and how it feels to move around in mixed reality or drive your Podracer. It’s one of the best jobs in the world, and I’m a very lucky person to have it.

Concept art by Evan Whitefield (Credit: ILM & Lucasfilm).

Piggybacking on that, could you describe how your core team at ILM is organized and how you work together in developing a production like this?

Our team is highly cross-disciplinary. The key roles include designers, engineers, and programmers. We have the entire ILM Art Department and a fantastic production team that helps us pull all of this stuff together. We also have marketing folks who join us, especially as we get towards the end of the project to help show our work to the public. I think how we develop it is where things get most interesting. We like to maintain a culture of kindness, but we also like to be honest when something isn’t working and do our best to make it better. It’s a very iterative, humbling, and egoless process. It’s a lot of really smart, intelligent people working together to try and make the best thing they can on the hardware that we have. We try to keep it honest so everyone should be able to say what they need to say, and we really focus on what’s on the screen — what is the best experience for our guests.

Where did the original idea for a mixed reality Star Wars playset come from? Was this always envisioned as an MR experience?

The original idea was actually a virtual reality (VR) playset for filmmakers to help them visualize and compose scenes. It was a tool I created with a few friends in the Advanced Development Group at Lucasfilm and ILM, for directors working on big Hollywood movies. We found that it was a simple, fun tool for them, and when mixed reality became a viable technology, we knew it could be a cool experience that would easily translate to digital action figures.

Can you tell us about why you decided to make the podracing in this experience something that’s top-down on a holotable in third-person vs. a first-person POV?

It’s not truly ‘top-down’ as much as it is a 3D diorama when you’re standing next to it. It was definitely a conscious choice to make it third-person to fit into what we were trying to do here, which is pushing the boundaries of mixed reality. Early on, we knew we wanted to lean into the “Toy” vibe of the Playset and do something unique with the technology. This approach felt like the natural way to achieve that. I’m a big fan of 1980s retro games, so for me, this was about taking those classic arcade concepts and adding a whole new dimension.


How did you balance innovation with staying true to the Star Wars legacy and canon?

I’m a huge Star Wars fan, so I love working within the established canon. Our innovation came from the way we approached storytelling. This isn’t a galaxy-spanning event; it’s a smaller, personal story. Telling a story in mixed reality is hard, and we made some big choices, like letting the camera cut to express the narrative while you’re looking at miniatures. The key was to balance this innovation with ensuring all the characters and the world fit seamlessly within the broader Star Wars galaxy.

Were there any particular Star Wars films, shows or eras that inspired the tone and style of this experience?

The main inspiration was definitely The Phantom Menace [1999] and the podracing scene. Beyond that, we used the Star Wars galaxy as a palette to tell stories that were interesting to us and that would deepen the world.

Can you explain how you landed on the three distinct modes? How do you balance development for them all since they’re all so different?

We settled on three modes to offer players a variety of experiences. Adventure mode is for those who want a guided story, while Arcade is for replayability and pure fun. Playset is the ultimate sandbox for creativity. The three modes mirror how I experienced Star Wars as a kid: I’d go see the movie (story), hit up the arcade afterward to play the latest Star Wars game, and then go home to play with my toys. This is just a way of bringing a modern version of that nostalgic experience to people today. Balancing development was a challenge because they are so different, but we approached each one as its own mini-project while maintaining a consistent visual style and user interface across all three. This allowed our teams to focus on the unique requirements of each mode without starting from scratch every time.

Concept art by Chris Voy (Credit: ILM & Lucasfilm).

Mixed reality is still new to a lot of fans. How did you approach that, especially for younger players or those new to immersive tech?

Mixed reality is definitely new, and that’s actually really exciting for us. One of the things we love to do here at ILM is really push on new technologies, so it’s a joy to work with Meta and continue to push the boundaries of mixed reality, virtual reality, and, hopefully in the future, augmented reality. For this experience, we knew we had to make it intuitive and accessible. We treat every one of our experiences like it’s the first time someone has ever put on a headset, and this was no different. Making it accessible and user-friendly is something we always come back to; we want to politely walk you through the experience and ensure it’s enjoyable in the most delightful way possible.

Was there a moment during development that made you feel, “This is it. This is Star Wars.”?

Anytime you work on a Star Wars project with ILM and Lucasfilm, you’re going to have those moments. For this one, a couple of moments stand out, especially during the voice recording sessions. Hearing Greg Proops doing the voice for Fode, or Lewis MacLeod voicing Sebulba — it felt like we were right there, talking to those characters! Those performances and the incredible vibe they brought were instantly recognizable. The score was done by Joe Trapanese and Clark Rhee, and what was so awesome about them doing it is that they brought their own unique vibe to the music. We were also able to include some of composer John Williams’ music, and when you mix that in with the new score, you get a fresh, new version of Star Wars that is still very much Star Wars. It’s very exciting.

The audio in this experience is exceptional. Can you talk about what it was like working with Skywalker Sound on Beyond Victory?

Working with the team at Skywalker Sound is always an amazing experience — I’ve been collaborating with them for over a decade now, and they just always bring the heat. They are masters of their craft. It was a true collaboration; they didn’t just provide us with sounds — they worked with us to build a rich, immersive soundscape that elevates the entire experience. They have an incredible library of assets they can pull from across all the films and animated shows. The audio they created for the Podracing alone makes the experience so much more beefy and visceral. Additionally, Kevin Bolin, who is one of the main audio supervisors, provided a lot of great suggestions and even co-directed a couple of parts of this experience. They are truly the best.

(Credit: ILM & Lucasfilm).

Speaking of sound, let’s talk about the cast of Beyond Victory. What was it like working with such a stellar cast? Were they only involved in the voiceover or did any also do motion capture work?

We are always fortunate to get an amazing cast of voice actors, and this was no different. Between Greg Proops, Fin Argus, Lewis MacLeod, as well as Lilimar Hernandez and Bobby Moynihan — we just had such a great time. One of the things that was so fun is that we actually did a full performance capture for this. The audio that you hear was captured at the same time as they were doing the mocap, so it was super fun to see Greg Proops, Bobby Moynihan, Fin Argus, and all these people in the same room collaborating to bring this to life. They all did such an amazing job and they really elevated the entire experience. 

And that’s not even counting the loop group! At Skywalker Sound, we have a loop group of great voice actors who come together to help fill in the world, doing a bunch of background voices and stormtrooper voices. They always do such an amazing job and have worked on the cartoons and films, which brings an authentic Star Wars feel because you’re hearing voices familiar from other Star Wars stories as well. Yes, we had an amazing cast.

I felt like I was reliving my childhood while playing in Playset mode. Were you a fan of the toys growing up and was the intention of Playset to bring some of that nostalgia to life?

Oh yeah, I was a huge fan of Star Wars toys growing up and I’m a huge fan of Star Wars toys now. When I was a kid, I had a bunch of different ones — I had the Ewok Village, I had the Millennium Falcon for a little bit. One of my saddest memories is when I was heading into fourth grade and I gave all my Star Wars toys to Goodwill because I thought I was too old for them, and I immediately regretted it afterwards. It feels like I’ve spent my whole life trying to rebuild that collection! So, this is probably me just tapping into some childhood trauma and trying to bring some of that back [laughs]. Today, my office is full of Star Wars toys.

This must have been a massive cross-disciplinary effort. Can you talk a bit about the collaboration between designers, engineers, writers and the teams at Lucasfilm & Skywalker Sound?

It is a real undertaking. We have a lot of really smart people with a lot of opinions, and getting everybody onto the same page and making sure that we’re working on something we are all proud of is hard to pull off, but I think we did a really good job. Our production team is a big part of that, making sure that all the different disciplines are talking and coming together for the proper meetings. It is a massive cross-disciplinary effort, not just within the people working on Beyond Victory, but you have to remember that we need to fit into the entire Star Wars galaxy. So, we have to be cognizant of all the other projects going on and make sure we fit in that world, too, without breaking canon. A lot of work goes into pulling all of this together, and a great team and production process made it all happen.

(Credit: ILM & Lucasfilm).

We absolutely love seeing Star Wars comics character Grakkus the Hutt in this. Without getting too spoiler-y, is there a moment in this experience that you think fans are especially going to love? Something that will make them pause and smile?

Hopefully there are a couple of moments like that! Grakkus the Hutt is amazing. He was awesome in the comic books, and we just knew we had to bring him into this experience — he’s just too cool. I can’t wait for people to see him in all his glory when he’s standing above you; he’s literally like 12 feet tall! But I think for me, the real moment was getting to see Sebulba in person. Watching him walk around, just seeing the creature that Sebulba is — for me, as someone who loves The Phantom Menace and the prequels so much, it was really cool. It definitely brought me a lot of nostalgia.

ILM celebrates its 50th anniversary this year, and interactive experiences have continued to play an important role in ILM’s diverse range of storytelling. How does Beyond Victory help carry this interactive legacy forward?

It’s incredible to be celebrating our 50th anniversary this year. We’ve done a lot of interactive work through Lucasfilm and ILM, but Beyond Victory marks a new step for us. We’re breaking technological ground by pioneering at this scale in mixed reality. It also, in an unusual way, echoes our early film history, when we were working with miniatures and seeing the world through that lens. As far as carrying our interactive legacy forward, I hope that people see this project as a successful push into new territory. This is the first mixed reality Star Wars project with a full, integrated experience — a cohesive story, an arcade mode, and a customizable playset. At the heart of this, like all ILM projects, is really the story, and I hope people really appreciate it and that these characters can go forward into the galaxy. We’re always trying to do something different, and we hope the community appreciates this push.

Any final message you’d like to share with ILM.com’s readers?

Thank you to the entire Star Wars and ILM fan community. We’re thrilled by the love we’ve received as we explore new realms like mixed reality. We couldn’t do this without you!

Concept art by Stephen Zavala (Credit: ILM & Lucasfilm).

Play Star Wars: Beyond Victory – A Mixed Reality Playset now on Meta Quest 3 and 3S headsets.

Learn more about Beyond Victory’s unveiling at Star Wars Celebration 2025.

ILM.com is showcasing artwork specially chosen by members of the ILM Art Department. In this installment of a continuing series, eight artists from the San Francisco, Vancouver, London, and Sydney studios share insights about their work on the 2025 Disney production, Lilo & Stitch.

Art Director Cody Gramstad

Gramstad: This concept was tackling two key problems for this scene. What should our balance be between Chris Sanders’s original visual style and the limitations of live-action expectations? At the same time, we could use this image to give practical design guidance for the environment team on how to stage the podium space so that the window both frames and provides value contrast to the Grand Councilwoman’s head, our primary focal point.

The iteration process for this concept was an evolution of a pre-existing previs set. From the original film we knew key staging, camera placement, and expected lighting direction. The iterations came in adjusting the environmental elements around the figures, exploring different shape languages and materials, and experiments in color and saturation to find a balance that maintains the personality of the original animated film but could exist in the lighting and material context of a more dimensional rendering approach.

My favorite part of this piece is the simplified value shape language. When the composition is reduced to its most fundamental shapes and values, it creates a graphic language that feels in character with the original film while still allowing a clear read of the primary focal point.

Senior Visual Effects Art Director Alex Jaeger

Jaeger: The main brief for Lilo & Stitch was to try out some ideas to pull it away from looking like an animated feature and make it feel more realistic while keeping all the main structures from the original film the same. 

The work on this piece was done as part of a push to complete a set of shots early for the trailer. It was also an effort to bring a bit more realism to this sequence and to offer some new suggestions and options for detail and lighting. After looking at the existing sequence, I found that the textures were soft overall and that a few indications of hard reflections might help.

One of the challenges was not to alter any of the models, but rather to keep my alterations to lighting and texture. So I added more fall-off and texture to the spotlights, and added a metallic line element to the platform railing and floor. I also added a more metallic glint to the threads in the banners. The hardest part was finding spots to add metallic elements that would be most effective for the added realism the client requested, without altering major elements.

Concept Artist Mathilde Marion

Marion: After his trial, Stitch is sent to a lab room where he is experimented on, and from which he escapes. We needed to start from the client’s design of the room and, in the same spirit, expand a workstation into a DNA reading machine. We came up with a few variations of designs and how the machine would work, based on the client’s storyboards. This one is, in my opinion, the most successful.

This is a frame of the overall design, but it was actually designed and sent to the client as a series of close-up shots where we can see Stitch’s hair being processed and tested by the machine. There were primarily two challenges: designing a machine in the spirit of the original animated feature, all the while showing a sequence of mechanical events that are somewhat logical. Because the movie isn’t meant to be realistic, we had a bit of leeway, but it still needed to work within the chosen design and make sense story-wise.

I took inspiration from Chris Sanders’s original designs, and the original movie’s assets and weapons. I made it a goal to try to match another artist’s style, which is not the easiest thing to do. My style is usually not as cartoony, and it was important for the story that everything sat in the right visual universe. A stylized type of drawing is really tough, as it requires a perfect understanding of basic shapes, values, and color relationships. You can’t hide behind details or processing in your image. I found that very interesting and had to challenge myself.

Senior Concept Artist Brett Northcutt

Northcutt: I worked on this piece late in the schedule trying to help with lighting and reality cues to improve the look of the shot.

This shot was originally front lit against space and I thought it looked a bit flat. I reversed the lighting to make it back lit, which really helped the mother ship look more imposing. With the ship now pretty dark, adding a nebulous background really helped to make it pop and also added visual interest. Finally, adding a planet to the lower right justified adding some unusual light reflection to the dark side.

Supervising Art Director Fred Palacio

Palacio: The task was to make the character more appealing to a broader audience, while avoiding a design that may appear too frightening for some. The main challenge was the time constraint, as the character had already been modeled, textured, animated, and rendered, so any changes had to be made on the spot.

In situations like this, my approach is to assess where we are and iterate step by step through paintovers, gradually exploring the visual possibilities. For example, we might ask: what if we changed the shape of the pupil? Its size? Its color? What if the skin appeared softer, the color more uniform, or the hair density had more contrast? 

Each adjustment was aimed at subtly shifting the character toward a more stylized, graphical direction, while still preserving the realistic quality the team had already achieved. It felt almost like sending Jumba to our makeup and hairstyling department. We also explored enhancing the clothing by injecting more saturation and slightly shifting the hues to evoke the distinctive palette of the 2002 film.

Art Director Amy Beth Christenson Smith

I worked closely with senior animation supervisor Hal Hickel under a fast deadline to get final boards ready for these sequences. The location had been scouted, so I had to make sure to match the scale and layout of it all. The most challenging part was also the best part: making sure Stitch had a lot of over-the-top personality and that the comedy would shine through.

The client shared reference for the scouted location, as well as some rough sketches for a few frames. The biggest inspiration came from the characters in the original animated movie, trying to match their body language and personality. I also took inspiration from my pet rabbit – having Stitch turn only his ears in the direction of the sound when the shop doors are opening came from how my rabbit’s ears twitch and turn when she hears any sound.

Art Director Igor Staritsin

This was an art direction shot paintover that was meant to help the visual effects team establish the look of the final shot. The main challenge for this sort of task is to make the concept as close as possible to the final quality of the shot, as if it was seen in the movie. It usually requires quite a bit of research on the subject matter, as we want to make sure that the decisions made are based on reality. For example, a good design is usually achieved not only by establishing a pleasing aesthetic look, but also a logical function. The same goes for shot paintovers. We want to play up the most important elements in the frame that help to tell the story and play down the rest.

For this shot I knew what I was going to do after gathering enough information from my prior research on the task. However, there are certainly moments when one might struggle to find a solution. I think it is best to assess your design in the simplest way possible, meaning one shouldn’t go into details too soon and get lost there. It is important to make sure that big shapes read well, and that the proportions and distribution of shapes make for a pleasing arrangement. When the basics are in place, a mindful distribution of details on top will bring the design home.

I really enjoyed adding small details, a variety of materials, and break-ups that made it all look more realistic in the end. Tiny things like halation, bloom, vignetting, and suppressing details in secondary areas, as well as increasing the attention around the focal point really helps to bring it all to life while telling the story in the shot.

Concept Artist Evan Whitefield

The squid-piloted robot was designed as a supporting but visually memorable background character in the Grand Council chamber. While not a primary character, it helped reinforce the sci-fi tone, scale, and advanced tech of the Galactic Federation. The design retained the creature-in-a-tank-helmet concept, evolving through multiple iterations to balance the charm of the original animated version with a more grounded, high-tech look for the live-action world.

When I first started exploring the design, my goal was to go all out with the initial concepts to really push the creativity and explore extreme ideas without limits. This helped uncover unique shapes and personalities for the squid-piloted robot. Once I had a strong range of options, I focused on pulling things back to create a more grounded, believable design that would fit seamlessly into the live-action world. That balance between bold exploration and practical refinement was key.

One of my favorite details is how the tank-like helmet functions as both a life-support system and a clear window into the squid’s personality, letting its expressiveness come alive. I also love the contrast between the squid’s relatively small size and its massive, bear-like robotic frame. The functional, mechanical design of the robot pairs beautifully with the organic shape of the squid, creating a compelling balance between technology and creature.

See the complete gallery of concept art from Lilo & Stitch here on ILM.com.

Learn more about the ILM Art Department.

Watch Lilo & Stitch on Disney+.

Discover Industrial Light & Magic’s role in helping inspire one of the world’s most iconic pieces of imaging software.

ILM executive creative director John Knoll and his brother Thomas recently joined Adobe’s Russell Preston Brown for a live recording of The Photoshop Archives at Lucasfilm’s San Francisco headquarters. Together, they discussed the origins of Adobe Photoshop, first created by the Knoll brothers in 1987 and acquired by Adobe the following year.

John Knoll had been hired at ILM in 1986 and soon began working the night shift as a motion-control camera operator. He also pursued an interest in computer graphics, then a rapidly expanding field gaining traction in filmmaking and visual effects. Not long after he started at ILM, Knoll toured the ILM CG Department. The team had only recently been formed after the spin-off of the Lucasfilm Computer Division’s graphics group as “Pixar, Inc.” left a vacuum for active work in the field within the company. The ILM team retained a fabled Pixar Image Computer, a groundbreaking image processor that had already been used to create a memorable stained glass knight in the ILM production Young Sherlock Holmes (1985).

Knoll’s exposure to the Pixar machine yielded a glimpse of the future for visual effects, as he explains in The Photoshop Archives, and he quickly set about finding the means to create similar tools that could run on more accessible computers available to the average consumer. He soon recruited his brother, Thomas, already an experienced computer programmer and scientist, to partner with him in the venture. Working on their own time while John continued in his role at ILM, the brothers planted the roots of Photoshop.

Soon after its debut, Photoshop was employed by ILM artists on James Cameron’s The Abyss (1989), which featured the all-computer graphics pseudopod creature. The software would continue to play an important role in helping the ILM team to innovate CG characters and worlds for many years to come.

Don’t miss the full episode featuring the discussion with John and Thomas Knoll on The Photoshop Archives.

And watch the ILM.com Newsroom for all the latest news and features.

Former Industrial Light & Magic artists join ILM.com to reflect on bringing the pre-digital cinema classic to life.

By Clayton Sandell

ILM modelmakers at work on the Inferno. L to R: Chuck Wiley, Barbara Gallucci, Bill George, Randy Ottenberg (Credit: ILM).

During the summer of 1985, The Goonies hit movie screens and became an instant audience favorite. The timeless adventure tale follows a group of kids on a quest to discover One-Eyed Willy’s hidden pirate treasure, avoid a trio of ruthless family crooks, and save their homes (and way of life) in the “Goon Docks” of Astoria, Oregon.

While it’s not considered a massive visual effects film, part of the enduring charm of The Goonies is thanks to around 20 shots created by Industrial Light & Magic. Forty years later, four former ILM veterans share their memories about working on the celebrated classic.

ILM’s Michael McAlister was hired as the film’s visual effects supervisor, his first time in the role after working as an effects cameraman on projects including E.T. the Extra-Terrestrial (1982), Star Wars: Return of the Jedi (1983), and Indiana Jones and the Temple of Doom (1984).

Dave Carson brought extensive ILM experience to the role of visual effects art director on The Goonies, with credits including Star Wars: The Empire Strikes Back (1980), Dragonslayer (1981), and Star Trek III: The Search for Spock (1984).

The work of The Goonies matte painter and fine artist Caroleen “Jett” Green has appeared in dozens of films, including Willow (1988), Ghostbusters II (1989), and Star Wars: The Phantom Menace (1999).

Before a fruitful run as a visual effects supervisor, Bill George helped build a number of iconic models for films including Star Trek: The Motion Picture (1979), Blade Runner (1982), and Explorers (1985).

The Goonies was directed by Richard Donner (Superman [1978], Ladyhawke [1985]) from a story by Steven Spielberg and a screenplay by Chris Columbus. Frank Marshall and Kathleen Kennedy, now president of Lucasfilm, were among the producers.

The production team conducts a location scout on the Oregon coast (Credit: ILM).

MICHAEL McALISTER, VISUAL EFFECTS SUPERVISOR: Number one, Dick Donner was such a good man. His personality was so big, and he spoke with a booming voice, and he was just confident and gentle and kind. I was really impressed with him. It was a real joy to be around him. I also had good crews at ILM, and the experience of being on location in Astoria, Oregon, which is absolutely stunningly beautiful, was delightful.

DAVE CARSON, VISUAL EFFECTS ART DIRECTOR: It had so many effects shots in the first draft. I remember being in a meeting in Burbank early in the production. I don’t think Dick Donner was even there. And we were talking about the effects. And I said, “Well, I think eventually there’ll probably be like 80 shots.” The blood drained from everybody’s faces. I could see that was not where they were headed. It still was a great project, but the number of shots kept dwindling. The first draft had skeletons that came to life. It was full of effects and fantastic stuff.

I started just by drawing scenes from the script. Nobody asked me to, but you can’t read that script without wanting to draw some of the scenes in it. J. Michael Riva was the production designer, and he was cranking out beautiful stuff. [Art director] Rick Carter made beautiful blueprints. They were establishing the look of this film, and it was great. From that point on, my actual work for the production was pretty much taking established background plates and indicating where the effects would go. There wasn’t too much pie-in-the-sky stuff. I did a bunch of storyboarding of the sequence where the kids run into the cove, and they see some skeletons and they get on the ship.

Concept art by Dave Carson depicts the unfinished sequence when the Goonies are attacked by a giant octopus (Credit: ILM).

The ILM Model Shop built a highly detailed scale version of One-Eyed Willy’s sailing ship, the Inferno. Under the supervision of Barbara Gallucci, Bill George led a model-making team that included Randy Ottenberg and Chuck Wiley. ILM had plenty of previous experience with model spaceships, but building a wooden pirate galleon was something the crew had to learn from scratch.

BILL GEORGE, CHIEF MODELMAKER: I was really happy to be put on the project leading the construction of the miniature pirate ship. We wanted to do a good job and do something impressive that would get people talking. We put more into the model than we needed to. The production provided blueprints, which were amazing. We read books on building miniature ships and had the opportunity to do research and learn. We went to San Francisco Bay to study the Balclutha, which is a vintage wooden sailing ship. We studied all the details, the belaying pins, the rigging, the wood texture and wear. We wanted our model to look as authentic as possible.

We started with stanchions, very much the way you would build a boat. Those were covered in thin sheets of balsa wood. One of the big technical challenges on this was the rigging and the sails. Randy’s main focus was the sails. And, of course, there were no computer graphics that were advanced enough to do CG sails at that point. So the decision was made to make them out of a very, very fine silk, which would blow in the wind, and the silk was also great because it was transparent and pure white. Once again, we did some research. We found that we could use coffee and tea to stain the sails so they had a little bit of a warmer, aged color without stiffening the fabric.

At the time Goonies came along, ILM had established itself as the visual effects house of choice for very successful films. Then there were all these films that Spielberg was producing, including The Goonies and Explorers and Back to the Future [1985], and all of them kind of funneled through ILM. It was a really exciting time because there was a whole diversity of interesting projects coming in.

Chief modelmaker Bill George at work on the Inferno (Credit: ILM).

MICHAEL McALISTER: It was unbelievably beautiful. But by the time the model was in the process of getting made, they decided to just go ahead and build the entire set on the soundstage. Which then meant that we didn’t need as many shots using the model.

BILL GEORGE: I was a little disappointed because we didn’t get to showcase it as much in the film. It was very backlit, and it was very far away, and I knew that the model could hold up. So it was a little bit of a disappointment. But I’m super proud of the model we built.

On deck, there’s even a little R2-D2 Easter egg. It was actually a casting from Star Wars. In the model shop, we had molds of the castings that go with the plug at the top of the X-wing starfighter. That’s what that was.

In 2023, the Inferno model was donated to the Academy Museum of Motion Pictures by Richard Donner’s widow, producer Lauren Shuler Donner.

The hidden R2-D2 figure from Star Wars tucked away on the deck of the Inferno (Credit: ILM).
Modelmaker Randy Ottenberg at work on the Inferno’s masts (Credit: ILM).

Production designer J. Michael Riva had the Inferno and a water-filled cavern built as a full-size, practical set on Stage 16 at the Burbank Studios (now Warner Bros.) in Southern California.

MICHAEL McALISTER: I’ll never forget it. It was the most impressive thing I’ve ever seen in my entire movie career, hands down. The first time I walked on the stage, here’s this full-size pirate ship. And every little glorious detail was just striking.

The director of photography, Nick McLean, was going back and forth to another stage at the same time as he was trying to light this pirate ship, and it wasn’t working out very well. He just didn’t have all that much time to be on Stage 16.

So he just turned to me and said, “Michael, light it for me,” and walked away. I was like, “Oh my God, I don’t know how to light a set!” I was freaking out because I didn’t want to come up short. I didn’t want to disappoint him, didn’t want to embarrass myself. And I remember thinking, “How would you light it if it was a miniature, and just scale it up?” So that’s what I did.

You just got thrown into something, and you had to figure it out. So Nick came back, and he looked at my lighting, and he was pretty happy. Only changed one thing. I learned something about confidence, and I learned something about lighting. It doesn’t really matter how big a thing you’re going to light. It’s all the same idea.

The ILM team made visits to the Goonies sets in Burbank to capture reference photography. Here the Inferno and surrounding cave is under construction (Credit: ILM).

DAVE CARSON: It was an amazing thing to see. One morning on the set, there were probably a dozen of us all standing around drinking coffee, and Steven Spielberg walks in and he’s looking around. We’d met a few times, but he didn’t know me all that well. He says, “So what do you think?” I said, “There’s some great shots here,” and he says, “Oh yeah? Where?” I’m thinking, “Is he kidding me?” I was just trying to be conversational. But I decided I’d just follow through. So I walk over to the island with like twelve people following Steven, and I got down, just trying to find some interesting angles. I don’t know what he made of it all.

Visual effects supervisor Michael McAlister wades in the water tank on the Inferno set (Credit: ILM).

For wide shots of the Inferno, ILM artists Frank Ordaz and Caroleen “Jett” Green created matte paintings to help complete the illusion of a tall sailing ship rising beyond the limited height of the Burbank soundstage. Chris Evans served as matte painting supervisor.

CAROLEEN “JETT” GREEN, MATTE ARTIST: They had that big ship that they shot in a way that, at the last minute, they needed to extend the masts and add sails. We had to work quickly to make it all work perfectly.

The challenge was, we didn’t have much time, and the sails of a ship needed to have fluidity, an airy quality. Our matte painting extensions were static, so lucky for us the shots of the sails were only on for a couple of seconds.

I knew how to paint something realistically. What you also learn with matte painting is how to change lighting. You need to know what goes on with light, whether it’s indoors or outdoors, how it affects everything. If there’s a blue haze that’s moving in the shot, I might add some carefully mixed blue paint to match. It all got combined together.

I was an apprentice matte painter, learning the techniques and skills in order to become a great matte painter. I was working in a room with highly creative people, all excellent at what they do. I really wanted to keep up with these guys. And I told myself, well, “I’m just going to put in 150%.”

Another ILM contribution includes what might be considered an early example of a so-called “invisible effect.” Searching for their next clue, Mikey (Sean Astin) lines up a doubloon with cutouts to match rocks and a lighthouse in the distance. What appears to be a practical shot is actually a mix of multiple blue screen elements, background plates, and matte paintings. A complex rack focus helped complete the illusion.

DAVE CARSON: I remember the challenge at the time on the doubloon shot was they wanted the doubloon in focus and crisp up close. That means anything in the distance is going to be soft. So they had to pull off the rack focus in post-production.

MICHAEL McALISTER: One of the reasons that the shot was never attempted on set is because the rocks in the ocean didn’t exist. And they certainly didn’t exist to line up with the doubloon. So, based on that criteria, it automatically became a visual effect. And dealing with the rack focus was very challenging during that time because it was all optical printer composites, and you didn’t get good mattes out of blurry edges in the optical process. Today, it’s not an issue with all the CG capabilities and the compositing software, but it was a challenge at the time to get that right.

A storyboard by Dave Carson (Credit: ILM).

The organ chamber sequence – in which an incorrectly played musical note causes part of the floor to fall away and reveal a treacherous cavern below – was achieved using five different matte paintings and a 16-by-20-foot miniature set featuring stalactites, pools of water, and fog. The set was scaled down during pre-production, posing a challenge for creating the critical illusion.

MICHAEL McALISTER: The concept was supposed to be something that instantly communicated absolute death if you fell down there. That was one of the hardest things I’ve actually ever done in my career, creatively. And to this day, I’m not really happy with what that image communicates because it didn’t look like instant death to me. Richard [Donner] and [Steven] Spielberg didn’t ever complain to me about it, but I wasn’t really happy with that. It was supposed to be all misty and foggy, which made the lighting so diffuse that it was just really hard.

The ILM camera crew prepares to shoot the miniature from the ground up. A mirror was used for reference while standing (Credit: ILM).

Four decades later, The Goonies continues to be treasured by fans young and old. In 2017, the Library of Congress added the title to the National Film Registry, which honors movies with cultural, historical, or aesthetic significance.

DAVE CARSON: It’s so funny. Of all the films I’ve worked on, when people find out I worked on The Goonies, a lot of times that’s the one that they’re impressed by. “Oh, you worked on The Goonies? I love that movie!” Yeah, it’s still a very popular film.

BILL GEORGE: The story reminded me of when I was a kid with my buddies, and we were looking for adventure on the street, throwing dirt clods, that kind of stuff. It really captured the essence of that in a really magical way. And I think for kids that age, they’re like, “Hey, let’s make this happen. Let’s find the treasure.” Goonies have a special place in our hearts.

CAROLEEN “JETT” GREEN: We were all seriously into what we were doing: matte painting.

I considered many of the artists geniuses. Just a brilliant group of creatives. We would start painting at around 10 o’clock in the morning and go into the zone of silence for hours. Then we’d come up for air at the same time, lunchtime or later. At times, I would even stay until sunrise.

MICHAEL McALISTER: It is meaningful to me that there are a few films that I’ve worked on that are classics and will always be remembered. During The Goonies, I had a hunch about it because every kid dreams about finding a pirate ship and a pot of gold. I can’t take any credit for the fact that these movies have such legacies, but it’s nice to have been involved with a movie that made such a dent and endures.

When I first walked the halls of ILM, I realized I was walking among the best in the world at what they do. It was just such a privilege to be in that company, in the company of those artists, that level of creativity and expertise for so many years.

A doodle by an ILM crew member on the Inferno model during its construction (Credit: ILM).

Clayton Sandell is a Star Wars author and enthusiast, Celebration stage host, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell), Bluesky (@claytonsandell.com), or X (@Clayton_Sandell).

Industrial Light & Magic has unveiled a new trailer and key art for the podracing adventure that’s coming this fall.

Industrial Light & Magic and Lucasfilm announced today that Star Wars: Beyond Victory – A Mixed Reality Playset, the next groundbreaking entry in interactive Star Wars storytelling, will launch on October 7, 2025, exclusively for Meta Quest 3 & 3S headsets.

“This experience is designed to celebrate storytelling, action, imagination and everything we love about Star Wars,” said director Jose Perez III. “We wanted to give players a new way to step inside the galaxy and make it their own.”

Watch the new trailer below:

Visit StarWars.com to learn more about the exciting new voice cast and distinct gameplay modes for Star Wars: Beyond Victory.

Don’t miss all of the latest updates from Industrial Light & Magic on the ILM.com Newsroom.

ILM’s Mohen Leo and Scott Pritchard and Lucasfilm’s TJ Falls are among the winners for “Outstanding Special Visual Effects in a Season or a Movie.”

The team from Andor pose in the press room with the award for outstanding special visual effects in a season or a movie during night one of the Creative Arts Emmy Awards on Saturday, Sept. 6, 2025, in Los Angeles. (Credit: Richard Shotwell/Invision/AP)

The 2025 Creative Arts Emmy Awards took place on September 6, and Lucasfilm’s Andor series took home four wins, including “Outstanding Special Visual Effects in a Season or a Movie.” Industrial Light & Magic’s Mohen Leo – who served as Andor’s production visual effects supervisor – took home the award along with ILM visual effects supervisor Scott Pritchard and Lucasfilm’s visual effects producer TJ Falls.

The other Emmy recipients for “Outstanding Special Visual Effects in a Season or a Movie” include special effects supervisor Luke Murphy, special creature effects lead Neal Scanlan, Hybride visual effects supervisor Joseph Kasparian, Scanline visual effects supervisor Sue Rowe, In-House visual effects supervisor Paolo D’Arco, and digital colorist Jean-Clément Soret.

Congratulations to all of our Andor winners! Visit StarWars.com to see the full list of recipients.

Learn more about ILM’s work on Andor here on ILM.com:

“Like Eating an Elephant One Bite at a Time”: TJ Falls and Mohen Leo on the Visual Effects of ‘Andor’ Season 2

“Let the Experts Be the Experts”: TJ Falls and Mohen Leo on the Visual Effects of ‘Andor’ Season 2

Assembling a Starfighter: Exploring ILM’s Role in Creating the TIE Avenger from ‘Andor’

ILM’s Mohen Leo, production visual effects supervisor of Andor, attends the Governors Gala at the 2025 Creative Arts Emmy Awards (Credit: Invision/AP).

The ILM visual effects supervisor speaks on ILM’s contributions to the blockbuster film that brought Marvel’s First Family into the Marvel Cinematic Universe.

By Jay Stobie

(Credit: ILM & Marvel).

Marvel Studios’ The Fantastic Four: First Steps (2025) transports audiences to the Marvel Cinematic Universe’s Earth-828, where Reed Richards (Pedro Pascal), Sue Storm (Vanessa Kirby), Johnny Storm (Joseph Quinn), and Ben Grimm (Ebon Moss-Bachrach) must prevent Galactus (Ralph Ineson) and his herald Shalla-Bal (Julia Garner) from destroying their entire planet. Directed by Matt Shakman, whose acclaimed credits include helming episodes of the long-running comedy series It’s Always Sunny in Philadelphia (2005-Present) and the mystical Disney+ hit WandaVision (2021), The Fantastic Four leans into a retro-futuristic aesthetic that blends 1960s-inspired designs with out-of-this-world technologies.

With this innovative endeavor in mind, the filmmakers called upon Industrial Light & Magic and its accompanying half-century of visual effects expertise to help execute Shakman’s vision, with a particular focus on The Thing, Galactus, the climactic third act battle in New York City, and more. Daniele Bigi (Ready Player One [2018], Star Wars: The Rise of Skywalker [2019], Eternals [2021]), who served as the ILM visual effects supervisor on The Fantastic Four, sat down with ILM.com to discuss the company’s numerous contributions to the project, from devising a fresh approach for portraying The Thing’s rocky features to constructing Earth-828’s distinctive New York City skyline.

An ILM Overview

As the ILM visual effects supervisor on The Fantastic Four, Bigi spearheaded ILM’s involvement on the project from the company’s London studio, working closely with invaluable colleagues like ILM animation supervisor Kiel Figgins and ILM senior visual effects producer Claudia Lecaros. “In this case, ILM didn’t split the work between multiple ILM facilities, so my team ended up keeping all the asset and shot work in London. We were assigned the major task of handling the third act of the movie, which centered on the final battle between the Fantastic Four and Galactus,” Bigi tells ILM.com. “Although it’s divided into multiple sequences, the third act is a continuous narrative from Galactus’s arrival on Earth through the end of the film. It was a fascinating and important piece of work to deal with.”

ILM’s assignment included devising an innovative look for Ben Grimm’s iconic alter ego, The Thing. “We did all of the initial development with [production visual effects supervisor] Scott Stokdyk and [visual effects producer] Lisa Marra from Marvel, in collaboration with [head of visual development] Ryan Meinerding. Ryan provided us with the concept for The Thing, which is what we based our work on,” Bigi relays. As the leading vendor for The Thing, ILM developed the entire character and then distributed the asset to the film’s other visual effects vendors for their own sequences.

(Credit: ILM & Marvel).

“After the initial development of The Thing, we were assigned another prominent character to build. Since ILM had several shots in which Mister Fantastic stretched his body and used his ability in an extreme way during the final battle, ILM ended up leading the look development of Reed Richards, too,” Bigi explains. In January 2025, ILM’s success with these character creations prompted Matt Shakman to task Bigi’s team with crafting the Fantastic Four’s immense nemesis, Galactus.

“Another big component to ILM’s work was the development of New York City, an imaginary version of the city based on Marvel concept art,” Bigi continues. “Roughly 90% of the New York City shots were done in computer graphics by ILM. It’s a 1960s futuristic New York, and while certain aspects appear exactly like our New York, there are many buildings and stylistic elements that reflect both 1960s and futuristic designs. A large section of the city, including Times Square, was ingested from Sony Pictures Imageworks, with whom ILM collaborated closely to combine different city blocks into a unified layout with a matching style, color palette, and overall look.” Most of the city set-up was handled by environment supervisor Stacie Hawdon and CG supervisor Tobias Keip at ILM’s London studio. In total, Bigi estimates that ILM contributed between 350 and 380 shots to The Fantastic Four.

Thinking the Thing Through

“At ILM, we aimed to deliver on Matt Shakman’s vision by dramatically changing what had been done with The Thing in the past. We sought to create the most believable, realistic performance that would respect Jack Kirby’s original design, from the size of the rocks to the very specific rock formation of The Thing’s brow,” Bigi shares. Animating facial expressions for a character whose face is composed of rock proved to be a considerable challenge. “We explored different options, but I always wanted to keep the rocks as rigid as possible. If we started to squash and stretch them, The Thing would resemble what was done in the past with plastic material and foam prosthetics.”

(Credit: ILM & Marvel).

Leaning into The Thing’s bouldery frame, Bigi’s team created small, undefined gaps between the rocks. “Depending on the expression, we could move the rocks in these minuscule spaces. Additionally, we allowed the rocks to gently stretch in areas that were invisible to the camera, giving us larger gaps that let us keep the rest of the rocks completely rigid.” ILM employed another sophisticated technique for The Thing’s face and body, running an effects simulation on the rocks rather than dealing with geometric skinning. Bigi praises FX and creature technical director Maybrit Bulla, who used Houdini to create a custom setup to control the collision between the rocks. “We used our blend shape technology to move the underlying surface, but there are rocks on top of it that are actually colliding. They push each other and land in a natural position. In some shots, we had to guide the simulation in an artistic manner to avoid having rocks go into unwanted territory and seem weird or strange. The process is something new that we developed for this movie.”
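
Blend shapes themselves are a standard character-animation technique: a deformed mesh is the neutral pose plus a weighted sum of per-target vertex offsets. A minimal generic sketch in Python (illustrative only; the function name and array shapes are assumptions, not ILM’s proprietary rig or its Houdini collision setup):

```python
import numpy as np

# Generic blend shape (morph target) sketch. A deformed mesh is the
# neutral pose plus a weighted sum of per-target vertex offsets.
def blend_shapes(neutral, targets, weights):
    """neutral: (V, 3) vertex array; targets: list of (V, 3) arrays
    (one per expression); weights: list of floats, typically in [0, 1]."""
    deformed = neutral.astype(float).copy()
    for target, w in zip(targets, weights):
        deformed += w * (np.asarray(target, dtype=float) - neutral)
    return deformed

# A four-vertex "mesh" with one expression target that lifts vertex 0.
neutral = np.zeros((4, 3))
smile = np.zeros((4, 3))
smile[0] = [0.0, 1.0, 0.0]
result = blend_shapes(neutral, [smile], [0.5])  # vertex 0 moves halfway up
```

In the workflow Bigi describes, an underlying surface deformed this way would then drive a separate rigid-body simulation of the rocks sitting on top of it, so the rocks themselves never squash or stretch.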

Ebon Moss-Bachrach as The Thing (Credit: ILM & Marvel).

When it came to actor Ebon Moss-Bachrach’s performance capture for The Thing, ILM referenced the work-in-progress geometry data from Digital Domain (another effects vendor on the film). “The data was useful for the initial stages and the blocking animation, but when we started to go into the minutiae with Scott Stokdyk and Matt Shakman, we ultimately worked on our own system and reanimated the character for our final animation,” Bigi details, crediting CG supervisor Marco Carboni for developing a workflow to quickly ingest data from Digital Domain and transfer it to ILM’s proprietary facial rig.

Rules for Reed Richards

Alongside Shakman, ILM outlined clear guidelines for Reed Richards’s capabilities as Mister Fantastic. “Matt was keen to avoid creating what we called a ‘noodles’ or ‘spaghetti’ feeling. How we controlled the stretch was unique and based on Matt’s vision,” Bigi recalls. “Instead of developing the character for months and then realizing that it didn’t behave in the right way, I proposed exploring various 3D action poses with extreme body stretch from several angles. Matt was incredibly receptive to the notion of rendering these static frames before having a functional rig or muscle simulation for the animator to use.”

Setting rules for Mister Fantastic became essential to ILM’s process. “What can Reed do? Do we want to stretch the neck, or don’t we? We decided not to, so there’s not a single shot where you see the neck stretching a lot,” Bigi notes. “We established a rule that only Reed’s limbs would stretch, meaning his upper torso and shoulders would remain the same width as the actor’s. Another rule dealt with his bone structure. While stretching, his elbows and knees would be more defined, the idea being that the skin was getting thin and wrapping around the bone. This was all discussed with Matt and Scott and developed in the initial stage where we did our 3D maquette action poses.”

(Credit: ILM & Marvel).

Bigi took inspiration directly from Marvel’s comic books, as well. “Many comic book artists before us, in particular Alex Ross, maintained a very strong V-shape when portraying Reed’s upper body. So, in the ILM shots where Reed is stretching, we kept the lat muscles on his body fairly large, like an athlete or swimmer,” Bigi declares. “We also decided Reed would snap his limbs back to a natural pose relatively quickly. The thought was that it wasn’t easy for Reed to stretch, so he would only do so on important occasions. He doesn’t do it for fun, at least in this movie.”

While Reed’s arms and legs stretch extensively, Bigi points to another key decision ILM made when generating the look and feel of Mister Fantastic. “The stretch of his fingers is minimal, and the gloves you see are usually the normal size as established by the practical costume designer. The concept being that, unlike the fabric close to his body, the actual fabric of the gloves didn’t need to stretch at all.”

Seeing Sue Storm

As was the case with The Thing, ILM pursued a unique path to conveying Sue Storm’s abilities in the final battle. “Rather than relying on particle simulation, all of ILM’s Sue effects were based on optical elements,” Bigi reflects. “The Sue effects were meant to be analog, in a way. There are no effects simulations of any kind. Most of those shots were crafted by our compositing team, so it’s a 2D-based approach using references of how lenses naturally create refraction and color variation. You see that we enhanced and exaggerated the prismatic fringes that occur with specific types of lenses.

(Credit: ILM & Marvel).

“Although this route was simple in a technological sense, it was nevertheless quite effective visually, and blended well with the atmosphere of the movie,” Bigi concludes. “Going with the latest, state-of-the-art technology is not always the answer. In this case, it was the opposite. We wanted it to feel simple and analog, so we stayed with the real optical effects. It’s all about what the director wants and the feeling you wish to convey.”

Grappling with Galactus

Unlike the challenges that ILM tackled with The Thing’s rocky features, the surface of Galactus’s face resembled the actor to a much greater extent. “We were able to use Ralph Ineson’s performance through a normal blend shape technique for Galactus’s face. Matt wanted to infuse Galactus with a god-like aspect, so he had us downplay the realistic human aspect and micromovements of the actor’s face. We reduced the range of motion and kept the face a bit firmer,” Bigi states. “For the body, we received a scan of the beautifully-constructed costume, but at the end of the day, ILM replaced it with CG in all of our shots because of its need to appear metallic.”

(Credit: ILM & Marvel).

Representing Galactus’s true scale also came into play. “We determined a specific height for Galactus, so the camera had to conform to that size. There are several shots with plate photography, but the majority was done digitally, especially due to the interaction between Galactus and the city,” Bigi reports. “Galactus’s body had to be covered with thousands of tiny lights, which couldn’t be done realistically with prosthetics, and he’s so large that the amount of detail necessary to set the scale was tremendous. We scattered literally millions of tiny pipes, greeblies, and geometric objects to increase the sense of scale. At a distance, our Galactus was the same as the costume, yet it was much more elaborate in the extreme close-ups.”

(Credit: ILM & Marvel).

ILM held conversations with Matt Shakman and Scott Stokdyk about the bridge devices that serve as a centerpiece for the climactic conflict with Galactus. “We developed an effect that we called ‘bridge effects,’” Bigi notes. “The bridge is an amazing device that – spoiler alert – Reed conceived to transport Galactus to another location in space. Because of the 1960s style of the movie, we avoided a digital quality for the portal. We found references and simulated optical effects rather than calling upon inspiration from the digital world. It was a real brainstorm with Matt and Scott. All sorts of ideas, such as having Galactus’s body stream with particles inside the bridge effects, came up in our conversations with Matt.”

A “New” New York

In preparation for depicting Earth-828’s New York City, Bigi traveled to New York for a 10-day shoot with The Fantastic Four’s second unit. “It was an amazing experience,” Bigi beams. “Based on the previs, there were certain shots we knew would be CG, but we tried to film as much as possible. Before going to New York, I used a combination of Google Earth and other digital resources to virtually scout Manhattan and propose methods to capture it from specific locations in a thorough fashion. I spent days capturing 360 HDRI panoramic views, mostly along 42nd Street, to build a library of texture and material references. At the same time, a small team from Clear Angle Studios scanned the entire road using a LiDAR [Light Detection and Ranging] scan.”

The work continued upon Bigi’s return to London. “Initially, we took the images of New York and removed all the buildings constructed after the 1960s. It was essentially a filter that permitted us to show this version of the city to Matt and Scott,” Bigi remembers. “Then, in collaboration with [production designer] Kasra Farahani and Scott, we drew inspiration from futuristic-looking buildings elsewhere in America, such as Chicago. We selected preexisting real-world buildings that had rounded shapes and concrete bases. Another selection was done by concept artists at Marvel who had come up with original designs.

(Credit: ILM & Marvel).

“My team at ILM modeled those buildings, and we set their number and location along the street. We built several layouts and versions, gradually shaping the features of the street. That aesthetic relied on the props, as well,” Bigi asserts. “The cars and billboards resemble those from the 1960s, and we scattered spherical water tanks around the city. The phone booths aren’t based on their 1960s counterparts, as they were designed specifically for the movie. From the skyscrapers down to minute details like the color of the phone booths, everything is either a combination of real 1960s references or the artistically-driven futuristic elements that are now synonymous with the film.”

The time and talent that ILM invested in The Fantastic Four has paid off for both the artists involved in the project and audiences around the globe. Upon seeing the final cut, Bigi gravitated towards one of ILM’s shots when ranking his top stand-out moments from the project, declaring, “There are several moments that I love, but for me, Galactus emerging from the water and entering Battery Park from the river is my favorite. The water simulation and the composition combine to create a wonderful shot to begin that sequence.” Applauding the work of compositing supervisor Juan Espigares Enríquez and his compositing team, Bigi concludes, “I think it’s one of The Fantastic Four’s most exciting and spectacular moments.”

(Credit: ILM & Marvel).

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

Lucasfilm announces the full cast for the Star Wars feature directed by Shawn Levy and coming to theaters in May 2027.

Ryan Gosling (left) and Flynn Gray on set for Star Wars: Starfighter (Credit: Lucasfilm & Ed Miller).

Production of Star Wars: Starfighter is officially underway with a newly-revealed cast alongside Ryan Gosling that includes Flynn Gray, Matt Smith, Mia Goth, Aaron Pierre, Simon Bird, Jamael Westman, Daniel Ings, and Amy Adams.

“To join this storytelling galaxy with such brilliant collaborators onscreen and off, is the thrill of a lifetime,” director Shawn Levy told StarWars.com. Levy joins Lucasfilm president Kathleen Kennedy as producer on Starfighter, with a screenplay by Jonathan Tropper.

Visit StarWars.com to read the full announcement with all the latest news.

Learn the inspiring story of a group of dedicated ILM and Lucasfilm employees who turned their dreams into a reality….

By Lucas O. Seastrom

“It is a period of intrigue,” begins the opening text crawl of a little-known Star Wars story. “Imperial outposts scattered across the galaxy search for signs of the fledgling rebellion. Intrepid rebel agents have infiltrated the Empire’s Logistics, Deployment, and Allotment Center – LDAC – on planet Yerbana. From the heart of this Imperial base, the rebels have carved out a secret outpost to monitor enemy activity, contact other rebels, and recharge all the while avoiding the eyes of the Empire….”

No, it’s not an outline for a future Star Wars movie. It is, in fact, a playful backstory imagined by a passionate group of Industrial Light & Magic and Lucasfilm employees to contextualize a unique gathering place deep within the company’s San Francisco headquarters.

Not far from ILM’s performance capture stage and IT department, a massive, Imperial propaganda poster hangs on the wall.

Careful observers may discover that the poster is, in fact, a door. A gentle nudge to the left reveals a secret hideout straight from a galaxy far, far away…

(Credit: ILM).

The “Rebel Hideout,” as it is formally known, is presented as a makeshift refuge for hidden operatives, tucked away inside an Imperial base (its acronym, LDAC, “coincidentally” the same as the former name of Lucasfilm’s headquarters, Letterman Digital Arts Center). Pieces of furniture, equipment, and technology from across the galaxy have been requisitioned and adapted into the relatively cozy space. At times, sirens blare a warning, or the roar of a spaceship’s engines is heard overhead. Monitors allow the rebels to spy on Imperial activities, all the while enjoying a well-stocked snack bar and a most impressive music system (the room even sports a “disco mode”).

The result of more than a decade of committed planning and effort from a passionate group of employees, the Rebel Hideout is a homemade lounge designed specifically as a fun-filled gathering place. “The fact that this is employee-created is the most joyous part of the entire thing,” notes staff R&D engineer Mike Jutan, one of the project’s originators. 

“We’re not bragging about this room,” adds ILM’s head of CG, Michael DiComo, another project lead. “We are as amazed as anyone. How in the world did it get this good? We had dreams, and we’re so happy to be part of it, but we couldn’t have guessed it would be this great because everyone went so over the top. It’s the ILM spirit of everyone collaborating, doing more, and asking, ‘What do you need me to do?’”

The story of the Hideout’s creation has as many twists and turns, trials and triumphs, as any Star Wars movie….

The Early Concepts

Before joining ILM in 2007, Mike Jutan interned at Pixar Animation Studios, just across San Francisco Bay in Emeryville. The company was already well-known for its playful, collaborative culture in a space designed for casual interactions between employees. Its campus even featured “secret” lounge areas, often with elaborate theming, including one located inside a compact air duct only accessible via a crawl space. 

“At Pixar they talk about the central atrium in the building,” Jutan tells ILM.com, “and how Steve Jobs designed the space with the bathrooms in the center of the building so that everyone has to cross paths with each other, and when you do, you strike up random conversations. Just the idea of artists and engineers unintentionally crossing paths was really captivating to me.”

Arriving at ILM’s San Francisco office, an enthusiastic Jutan was eager to help introduce more of these like-minded spaces at the Letterman campus. It was a fitting vision, considering that Pixar had originally taken inspiration from ILM’s former home at the Kerner facility in nearby San Rafael. There, the Lucasfilm Computer Division (from which Pixar spun off in 1986) was once fully integrated within the visual effects company’s stimulating, creative work setting, where casual “hallway meetings,” employee band performances, and Friday afternoon “ergo parties” were regular occurrences. 

An early concept for an employee lounge by Chris Bonura took on a fully Imperial theme (Credit: ILM & Chris Bonura).

“It all started when we asked for $50.00 to retouch the pool cues,” Jutan reflects with a laugh. Pool tables were one amenity in what he describes as a “multipurpose room” that existed on the Letterman campus, where employees often played games, relaxed, or held art classes. Jutan wanted to find ways to improve the space with specific theming.

By 2013, a small group of ILMers, including Jutan and DiComo, had originated a concept for a Star Wars lounge designed to evoke the Mos Eisley Cantina. An early proposal detailed a space that “represents our company personality” and facilitated “casual, collaborative conversation.” They were soon meeting with Lynwen Brennan, then ILM’s general manager and today Lucasfilm’s president and general manager. “We explained a concept where we could scrounge together some different items to make a little lounge space,” recalls DiComo. “And she told us, ‘No, dream big. How great could this be?’”

The group broadened their ambitions. Landis Fields, today a real-time principal creative, produced a digital walkthrough of an elaborate space that felt like an immersive Star Wars movie set. Soon they organized a lunchtime event with over 100 employees to introduce the concept, exchange ideas, and recruit volunteers to help create it. “This was going to be a crowd-sourced, employee-driven project,” as DiComo notes. New contributors, like senior R&D engineer David Hirschfield, became involved at this stage.

Stay on Target

Over a period of more than five years, the cantina lounge concept was met with enthusiasm but was ultimately delayed. Availability of time, resources, and an adequate space shifted repeatedly due to business priorities. Then the global pandemic changed not only the prospects of building such a lounge, but the entire landscape of collaborative work in general. 

The concept might’ve been abandoned during such a transformative period. But in the wake of the pandemic, the small group that had envisioned the cantina became only more committed to finding a way to bring people back together in person, and ILM and Lucasfilm’s leadership took notice. With new plans to renovate the San Francisco workspace, the lounge team was given a dedicated room and a small budget to build something new. 

“Lynwen and the Lucasfilm leadership team were kind enough to help us do it,” Jutan explains. “We were prepared to rally the troops and have everyone there to make it happen. Our chance finally came, and we were going to take full advantage of it. We spun up again as if nothing had ever paused.”

With the new opportunity came a new concept for another type of Star Wars setting. A “hidden room” was discussed, and once again, Landis Fields created a digital walkthrough. From outside, a bookcase full of Imperial volumes and trinkets hid a small, sliding window where a Rebel inside ensured you were clear to enter. The case then moved across the wall to reveal a space that greatly resembled the ultimate creation. 


“Landis sort of kit-bashed all of these different Star Wars ideas into the design to see what might be possible,” DiComo recalls. The group shifted from an explicit speakeasy concept to something more inclusive. “This is a place where you can come get coffee, take a quick break, have a meeting, enjoy some music, read a book, or sit around and work on your laptop,” notes Jutan. “There’s no ‘one size fits all.’ People want different things out of a space like this, but the common thread is bringing people together.”

By 2023, production manager Julie Stallone had joined the team as project producer, handling daily logistical needs, coordinating a schedule, and ensuring the team stayed on budget. They proceeded with an initial plan to construct some furniture and design elements on their own and collect licensed prop replicas and toys to provide accents. They would also partner with an independent replica maker, Tom Spina, for additional features. But a fortunate surprise caused yet another change of course.

From the Set to the Hideout

Learning about the Hideout project, Lucasfilm archivist Portia Fontes contacted the team offering use of a number of screen-used props and set pieces from recent productions, including Rogue One: A Star Wars Story (2016), Solo: A Star Wars Story (2018), Star Wars: The Rise of Skywalker (2019), and Andor (2022-25). The group’s response was a collective “WHAT?!” “We could never have built anything as amazing as these set pieces with the time and budget available,” says DiComo. 

Fontes collaborated with Colin Merchant, a member of the Andor Props Department in England, to curate a selection of pieces. Soon after, senior producer John Harper and the Lucasfilm Online team offered additional, custom-made furniture from their livestream stage at a recent Star Wars Celebration, as well as additional screen-used props from Star Wars productions. “It was another spit-take moment,” as Jutan quips.

Mike Jutan (standing) and David Hirschfield inspect original props from a Star Wars production (Credit: ILM).

These developments would define the Rebel Hideout’s ultimate design and layout. Lucasfilm director of franchise content and strategy, Pablo Hidalgo, helped the group refine the room’s specific context within the Star Wars storyline, ultimately writing the Hideout’s “opening crawl” and related notes. 

“When we realized that we had all sorts of different props available to us, having a specific story structure helped limit what items we would use,” Hidalgo explains. “Some things fit the story, and some didn’t. From there, we built out the story that fit the context of this room and aligned with the characteristics of being rebels. It’s in the classic trilogy era, with the Rebellion and Empire. We also use some sequel trilogy props from the Resistance and First Order if they have a timeless character to them.”

A group, including Jutan, Hirschfield, and Hidalgo, reviewed prop lists and visited an off-site storage facility to consider their options. They chose everything from a Hammerhead Corvette pilot’s chair seen in Rogue One to an Imperial security camera from Andor. Other set pieces included Babu Frik’s work table and part of a Star Destroyer workstation, both from The Rise of Skywalker, a tactical screen divider from the Yavin 4 base in Rogue One, and wall panels from the Aldhani garrison in Andor.

A major discovery was the Millennium Falcon’s own tech station as seen in its original form when Lando Calrissian (Donald Glover) owned the ship in Solo. “As soon as we found the console, it just looked incredible, and it wasn’t even lit up yet,” recalls Hirschfield. “I was smitten. It was pretty big, but we had to use it.” Some balked at the size, wondering how it could fit into the already tight space, but Hirschfield insisted, suggesting it could be outfitted as a control for different room functions.

The tech station from the Millennium Falcon as it appeared in Solo: A Star Wars Story (2018) (Credit: ILM).

“An Integration Challenge”

The team created a top-down rendering of the room with accurately-scaled pieces for furniture and components, which Jutan and Hirschfield call “dollhousing.” Space was key, both in terms of packing in as many authentic details as possible and for maintaining appropriate room for access and flow. Some items, like a giant vaporator, were cut during this phase due to limitations. 

“We liked the basic layout, but it still looked like a conference room where we were stashing Star Wars stuff,” recalls DiComo. “How do we make it look like an actual Star Wars room? It became an integration challenge. We needed to make everything fit into the room and look as if it had always been there. We’d use tubes, hoses, weathering, grime, and greeblies on the wall to help blend it together. We moved the panels into place; the couch didn’t fit, so we cut the back off; we turned one of the radar screens on its side to help separate the rooms; and then that allowed room for the Millennium Falcon station.”

The couch was a former Star Wars Celebration prop, designed to look like a kyber crystal crate as seen in Rogue One. As they prepared to cut off the detailed back portion to allow it to sit flush with the wall, DiComo had an idea. “I thought, ‘We can’t get rid of the back part of this couch!!’” he explains. “It had giant hinges on it and an awesome shape and Star Wars greeblies all over it. So I suggested that we mount it on the wall in the kitchen, almost like it was this waste retrieval or energy storage system. All of the tubes and hoses go inside it. What would’ve been scrap became this amazing piece on the wall.”

By this stage, Kyle Johnson, a former workplace services specialist with Lucasfilm, had become an integral member of the team, whom they call their “secret weapon.” An experienced craftsman, designer, and set builder, Johnson took the lead in rebuilding, fabricating, and installing a number of details throughout the room, not least of which was a massive Imperial archway that forms a key focal point, helping to break up the standard 90-degree angles in the space. 

“We had to make everything actually work together in the room where people could sit on something, bump into something, or push the buttons on something,” Johnson says. “How do we cut everything down to make it fit well in the space? Doing traditional set work, you can position things all over the place so long as it looks good in the shot. With this, you have to make it work with a purpose that fits in with the rest of the room, and it has to look really good.”


Johnson also developed the Hideout’s entryway. The original bookcase concept was adapted into a sliding door concealed as an Imperial propaganda poster, which Johnson designed in collaboration with Hidalgo, concept artist Katarina Kushin, and the Lucasfilm Art Department. The poster slides open to reveal a custom-made Imperial doorway. “You get the feeling like you’re entering another place, not just going through a door,” Johnson notes. “It’s 18 inches deep with lights on the side. You’re moving through a narrow tunnel, almost.”

Bringing the Room to Life

Another critical step in creating the Hideout was building and installing an electronic system to run lighting, sound, and video. Members of the ILM Electronics Club would take the lead for a number of these tasks, including systems engineer Trent Bateman and layout supervisor Tim Dobbert. 

“My first project was helping to get the Millennium Falcon console online,” says Bateman. “Tim and I came in to figure out how to get the lights to simply work. We tried lots of different methods and couldn’t figure it out. Tim suggested that maybe these props were wired differently because they were made in the United Kingdom. So then we got it to light up, and we determined what the right voltages were. From there, we had to figure out how to make everything work. We developed several iterations of the circuit boards to find the best way to light everything without blowing the circuits or frying the micro-controllers. We had to build a network so that everything could talk to each other. That took about two months. Once that was in place, we replicated and grew the system across the entire room.”


Head of the Visual Effects Editorial Department for ILM’s San Francisco and Vancouver studios, Lorelei David, had joined the Electronics Club with no prior experience in the field. Mentored by her colleagues, she joined the Hideout team to help lead the hardware creation. 

“Trent had the micro-controllers, and he designed these beautiful, custom circuit boards,” David says. “We realized it needed a lot of soldering, more than we could do on our own. That’s when we decided to have a ‘soldering party’ in a big conference room at the office. We invited anyone in the company who wanted to learn. We provided instructions, references on a screen, and examples for them to look at. It was a great opportunity to meet a bunch of people from across the company.”

Employees gather for a “soldering party” to help create the many circuit boards needed throughout the Hideout (Credit: ILM).

Media systems engineer Paul DeBaun, R&D principal engineer and architect Nick Rasmussen, and former AV design engineer Greg St. Germain took the lead on the audio and video systems, equipping iPads throughout the room to run everything from graphic displays to location views to static (all authentically sourced from Star Wars films). In addition to ship flybys and music features, the Hideout also includes a standard ambient soundtrack with various technological effects. 

“We built a 10-channel speaker system out of old parts that we had on hand, including two subwoofers,” DeBaun notes. “We ran the wiring for the system throughout the ceiling and used QLab to run everything. We also worked on the dimmer packs for the lighting. There are about 36 channels of adjustable lighting patterns throughout the room.” Hirschfield adds, “We were perfectly willing to beg and borrow as much as we could. We had a budget, but we also kept finding these opportunities to adapt existing tools to make the room better.”

It’s All in the Details

Senior lighting artist Rebecca Forth had been leading set decoration since the beginning of the year, and now her sub-group worked to paint and detail props out in the hallway. “For me, this project felt similar to what it must’ve felt like to work in the ILM Model Shop back in the day,” Forth comments. “It felt like this was the closest I’d ever get to that tactile experience of putting your hands on everything, and making little mistakes that actually turn into a great feature. It was about having everyone together and trying to figure out how things would function.”

One of the last major steps for integrating details in the room involved a weekend visit from Robb De Nicola, Patrick Louie, and Max Frey from the Tom Spina Designs crew. They brought their expertise in custom design, painting, and weathering to help boost the team on their final push. 

The crew at work installing set pieces, electronics, and other features (Credit: ILM).

Among the dozens of minute tasks the Spina artists performed was taking a snowtrooper helmet (one of many replica props donated by Lucasfilm’s licensing team) and aging it. “They spent maybe 30 minutes and came back with this snowtrooper helmet and had a wampa slash across the face, it was painted and weathered with dirt,” Jutan says. “Their skill was nuts.” Throughout their visit, they assisted the Hideout team in mining their leftover set pieces for greeblies and details that could be adapted into features for the room. “The lesson from that is we can use anything,” says DiComo. “If you mix things together the right way, it has that Rebel, scavenged vibe.”

“For the most part, we wanted to restore each item to look as it did on screen, as well as give it a longer life,” Forth explains about the many screen-used props incorporated into the room. “Many of these were built to last for the length of the shoot. One of the tougher projects was the pilot’s chair from Rogue One. There were some wooden pieces that had shattered, which we had to carefully glue back together. The cushions had been adhered with glue tape that was disintegrating, so we put in new velcro strips that could last much longer. The headrest was missing some of the hardware needed to attach it to the base.

“We didn’t want to remove any intentional dirt or weathering from the items,” Forth continues. “This stuff had been in storage, so it had some real dirt and dust, but then it also had intentional dirt. You had to be really careful when you were cleaning so that you didn’t take that off. It was about restoring the piece and keeping intact the initial intention and craft of the artist who’d put it together.” 


Practically everything, save one Andor piece dubbed “Admiral Snackbar” by the team, required extensive reconstruction or modification. “It’s all MDF [medium-density fibreboard] and staples,” notes Johnson. “There were a lot of weird structural requirements involved to make things durable and functional. We had to tear stuff apart and reframe it to support more weight.”

An entirely original piece in the Hideout was an industrial-style fan in the ceiling, an idea that dated all the way back to Fields’s original safehouse concept. “I really wanted to make sure that something was moving,” says DeBaun. “Most everything in the room is still, except for the fan.” Using a fan acquired by Hirschfield and a cover made by Johnson, DeBaun mechanized the piece to spin gently, adding tubing and related detail inside the fan’s housing. As an accompaniment, he went to even greater lengths to create a self-described “impossible shadow” on the floor. 

“We can’t physically make that shadow in the space because of the height of the ceiling,” DeBaun explains. “So we project the shadow to spin in time with the fan’s rotation position so that it matches.” The result is a subtle but poignant accent that pulls the room’s many details together in a believable way. “It has a lot of capability to tell a story,” notes DeBaun, who hopes to incorporate a new passing shadow effect within the year. “No one talked about doing all of that,” adds DiComo. “Paul just dreamed it up and did it. A rudimentary sketch became this unbelievable thing.”


The Hideout team also knew that they wanted a unique insignia for their space, another project that Johnson took the lead on, working with associate producer Michelle Thieme, executive design director Doug Chiang, and other members of the Lucasfilm Art Department. Inspired by the shape and silhouette of the iconic Yoda Fountain at the entrance to the Letterman campus, Johnson describes it as “our signet,” in reference to the icons used by characters in The Mandalorian (2019-23). (Jennifer Foley, manager of Lucasfilm’s Company Store, even ensured that special crew shirts would be available for the team within a matter of days.)

The Rebel Hideout’s emblem (Credit: ILM).

The Circle is Now Complete

Over weekends, late nights, and even the morning of the Hideout’s debut, the team worked tirelessly to complete the room. After many unexpected challenges and a massive amount of work – all accomplished in addition to the team’s regular day jobs – the Hideout’s opening in May 2024 was a resounding success. Since then, it’s hosted not only casual meetings and hangouts for employees, but interviews, portrait shoots, family tours, and parties. “Sometimes parents are more excited to see the Hideout than kids are,” says David with a laugh. “On one tour, we had a group of kids, and they were running all over, pushing all of the buttons. It was a good way of testing out the room’s durability.”


“We don’t want this to feel like a museum where you can look but not touch,” Bateman adds, explaining various updates that continue to be incorporated into the room. “We want visitors to feel like they are actually in Star Wars, and that means allowing them to not only play with the props, but for the props to play back. To that end, we’ve worked together to come up with various Easter eggs throughout the room (like a ‘Red Alert’ siren and ‘Disco Mode’), as well as more involved additions like a ‘Rebel DJ’ activity where guests can use switches on the Millennium Falcon console to compose music.” Updates and revisions have continued, and have even inspired ideas for entirely new themed rooms.

Employees gather during the Hideout’s unveiling in May of 2024 (Credit: ILM).

Channeling the spirit of ILM’s fun-infused, creative culture that has endured for half a century, the Rebel Hideout is a full-circle achievement, maintaining the company’s storied identity in a brand-new way. “‘Ego-less collaboration’ is the phrase we use all the time,” DiComo adds. “It’s who we are. When you have something special like the Hideout, it amplifies the whole feeling. This is why I’ve been here for 30 years – the spirit of these people.”

“This has been like an old-fashioned barn-raising,” Jutan concludes. “Everyone pulls together to create something that’s much more amazing than its individual parts. It’s like every movie and visual effect that we make. One of the first times we toured [Lucasfilm senior vice president of creative innovation, digital production and technology] Rob Bredow through the room, he said that the project was done purely in ILM’s style: If a project is worth doing, it’s worth over-doing.”

(Credit: ILM.com).

Lucas O. Seastrom is the editor of ILM.com, Skysound.com, and a contributing writer and historian for Lucasfilm.

New details from ILM’s 50th anniversary book, written by Ian Failes, have been unveiled.


New page spreads from Industrial Light & Magic: 50 Years of Innovation were previewed at today’s Lucasfilm Publishing panel at San Diego Comic-Con. Written by Ian Failes of befores & afters, this book covers ILM’s 50-year story, from its establishment in 1975 to help create Star Wars: A New Hope (1977) to the latest stories and innovations from across the company’s five global studios.

Packed with hundreds of rare behind-the-scenes photographs and archival artwork, 50 Years of Innovation combines ILM’s distinct history of artistic and technical achievement with the inspiring stories of the people who’ve made it all possible. Dozens of both historic and newly-conducted interviews bring rich insight into ILM’s unique process that has shaped the visual effects art form and global filmmaking industry for half a century.


ILM’s story is one of equal parts change and consistency. Through constant evolutions in tools, techniques, and stories, the company’s artists and engineers have maintained their dedication to the highest standards in quality and innovation. 50 Years of Innovation sheds light on the characteristics that have empowered ILM to reach the half-century mark, and that will continue to guide the company into the next 50 years.

Update: Lucasfilm and Abrams Unveil New Spreads

Brand-new page spreads from 50 Years of Innovation have been shared by Lucasfilm and Abrams, providing an even more in-depth preview of the new book by Ian Failes, including sections covering beloved ILM productions like A New Hope, Dragonslayer (1981), E.T. the Extra-Terrestrial (1982), Die Hard 2 (1990), Terminator 2: Judgment Day (1991), Jurassic Park (1993), Mission: Impossible (1996), Pirates of the Caribbean: Dead Man’s Chest (2006), The Mandalorian (2019-23), Ant-Man and the Wasp: Quantumania (2023), and Indiana Jones and the Dial of Destiny (2023).


Industrial Light & Magic: 50 Years of Innovation arrives in early 2026, and is now available for pre-order from Abrams, Amazon, and Barnes & Noble.

To learn more about this new book directly from its author, check out this story on ILM.com.

To learn more about the Lucasfilm Publishing panel at San Diego Comic-Con, visit StarWars.com.

Watch the ILM.com Newsroom for all the latest news about Industrial Light & Magic: 50 Years of Innovation.

The ILM 50th anniversary logo alongside the cover of the new book, Industrial Light & Magic: 50 Years of Innovation by Ian Failes.

Celebrating Ten Years of Immersive Entertainment at ILM

By Amy Richau

“ILM Evolutions” is an ILM.com exclusive series exploring a range of visual effects disciplines and highlights from Industrial Light & Magic’s 50 years of innovative storytelling.

In immersive stories, the fan is the hero.

Industrial Light & Magic (ILM) has always been at the forefront of innovation, drawing audiences into new worlds by pushing technological boundaries. As part of our special series, “ILM Evolutions,” ILM.com talked with Vicki Dobbs Beck (vice president, immersive content innovation), Julie Peng (director of production), Tim Alexander (visual design director), Ben Snow (senior visual effects supervisor), and Shereif Fattouh (executive producer) about the past, present, and future of ILM’s immersive storytelling.

ILM’s LiveCGX team, including visual effects supervisor Mohen Leo (bottom).

A New Way to Tell Stories

“Let’s invite our fans to step inside our stories in ways that had never before been possible.”

– Vicki Dobbs Beck

While ILMxLAB was formally established in 2015 to explore the possibilities of immersive storytelling, the seeds of this endeavor actually began much earlier for Vicki Dobbs Beck. In the 1990s, Beck worked at Lucasfilm Learning, where a certain prototype caught her attention. “It was called Paul Parkranger and the Mystery of the Disappearing Ducks,” Beck tells ILM.com. “And what was really cool about all of the projects we were doing at that time is they really did sit at the intersection of storytelling, interactivity, and high-fidelity media – such that it was back then – all through an educational lens.”

Fast-forward to Beck’s time at ILM as head of strategic planning, when she started bringing together talent from ILM and LucasArts (now Lucasfilm Games). “It was kind of this little rebel unit that was doing some pioneering R&D [research and development] in high-fidelity, real-time graphics,” says Beck. “Their success gave us confidence that the foundation was in place to build an immersive storytelling studio, expanding on the R&D work done by teams like the Lucasfilm Advanced Development Group (ADG).”

In 2015, ILM and Lucasfilm announced the formation of ILM’s Experience Lab (ILMxLAB) – a new division that would combine the talents of Lucasfilm, ILM, and Skywalker Sound. Lynwen Brennan, then Lucasfilm executive vice president and ILM president, announced: “The combination of ILM, Skywalker Sound, and Lucasfilm’s story group is unique, and that creative collaboration will lead to captivating immersive experiences in the Star Wars universe and beyond. ILMxLAB brings together an incredible group of creatives and technologists to push the boundaries and explore new ways to tell stories. We have a long history of collaborating with the most visionary filmmakers and storytellers, and we look forward to continuing these partnerships in this exciting space.”

The Holocinema team.

From its inception, ILMxLAB’s mission of telling immersive stories on emerging technology platforms made the studio an appealing destination. Julie Peng, who worked in Lucasfilm Animation as a production manager for projects like Star Wars: The Clone Wars (2008-13) and Strange Magic (2015), was looking to break into the emerging interactive storytelling space when she received a call about a new ILM division that would focus on technology like augmented and virtual reality. “When we started, we were five people,” remembers Peng. “I did my best to take care of any need that arose, big and small, from developing the production infrastructure to writing job descriptions, to ordering pizza and running to the store for batteries. It was about doing whatever was needed to build a team and start exploring what we could bring to the immersive entertainment space.”

In 2016, the studio debuted its first VR experience, Star Wars: Trials on Tatooine, where the Millennium Falcon lands in front of players, and they help R2-D2 and Han Solo with repairs. This was an important step in the studio’s goal of creating a living world. “Everybody was just so blown away by the scale,” says Beck, “because that’s something that VR is so good at – delivering scope and scale.”

Looking to the future was also always a part of the plan. “Because we were so early in the whole immersive storytelling space, we really wanted to help drive the industry,” says Beck. “So we actually very consciously shared our prototypes in public. We spoke about them. We made them available to people because we wanted to actively inspire others to create in this space alongside us.”

Over time, the team evolved into a mix of creatives from the film industry and people with backgrounds in games and interactive development. While bringing in developers with both backgrounds was essential, it also brought challenges for Peng in her role as production manager. “Early on, I realized that they spoke two different dialects,” says Peng. “They used similar terminology, but their approaches to making a creative product were quite different in terms of process and priorities. I found myself becoming a bridge, translating concepts and driving the development of a common language so we could all communicate effectively.”

The team also had to be comfortable with fluidity, as the technology they were working with was constantly evolving. Peng noted that staying abreast of industry developments was key, as was the leadership team’s willingness to take risks. “I always call it ‘holding hands and jumping off the cliff together’.”

The First Big Leaps

“It [VR] really is like stepping into a different world, and it feels totally natural once you’re there.”

– Julie Peng

Visual effects supervisor Tim Alexander became involved with ILMxLAB after a history in traditional visual effects, including the 2015 blockbuster Jurassic World. He was also a lifelong gamer intrigued by the work ADG was pioneering at the time: bringing real-time, game engine-type techniques into visual effects. When director Alejandro G. Iñárritu approached ILM about a collaboration on a virtual reality project, Alexander came aboard as visual effects supervisor. The result was CARNE y ARENA, which debuted in 2017.

Arriving early in ILMxLAB’s history, CARNE was an ambitious project: a short VR piece bookended by physical experiential rooms that put the audience into a story of immigrants being detained while crossing the border from Mexico into the United States. At the beginning of the experience, participants are brought into a cold physical holding cell, where they must remove their shoes and items like backpacks. “There are ambient noises and real artifacts like abandoned shoes that have been found in the desert, from people crossing,” notes Alexander.

The key art for CARNE y ARENA.

Participants are fitted with a VR headset and led barefoot into a 50-foot-by-50-foot room full of sand. In the VR portion of the experience, they assume the role of a group of immigrants attempting to cross the U.S.-Mexico border at night when they are stopped by U.S. Border Patrol agents. After the VR story, participants exit and are led down a hallway where video monitors play interviews of the real people CARNE is based on. “He [Iñárritu] cast people that had crossed the border as the people within this experience and wove a story around that, so you actually see the real people and hear their experiences,” says Alexander.

CARNE was a challenge from both an artistic and an engineering standpoint. What Iñárritu and Alexander wanted to do was sometimes hindered by the technology of the time. Wanting the images to appear as photoreal as possible, the team realized the immersive film’s computing requirements outweighed what was possible in headsets at the time, so Lutz Latta, ADG graphics engineer, designed a supercomputer with four high-end GPUs (graphics processing units) to handle work such as calculating shadows in the film.

Other challenges included allowing participants to traverse and turn around in a 50-foot-by-50-foot room. “At the time, there was no way to really run a VR headset over more than 100 feet. You were lucky if you could get five feet away because of the HDMI cables and all kinds of things,” remembers Alexander. VR tracking abilities at the time were also far below where ILM’s engineers wanted them to be. “So then we started mixing in stuff that we know from visual effects of how to track cameras in large spaces. A motion capture stage was built to track the headset instead of what we would usually track the camera in. So it started becoming a mixture of different things that we knew how to do for different reasons, and kind of applying it to this situation.”

A final frame from CARNE y ARENA.

The newness of the technology and the goals the team wanted to achieve with CARNE meant everyone had to adapt and be ready for anything. “It was the first project in my career that I was actually concerned that I would not be able to deliver,” says Peng. “Because in all of my past projects, I would have a production plan, A, B and C, D and E in my back pocket. We were working with new technology and making something that had never been created before. There was no model of how to do that, which made me feel like I was operating without a parachute. That can be very nerve-wracking but also exhilarating when you actually finish the project. That sense of completion and accomplishment was huge.”

Audience reactions to the very visceral experience were all over the map. “We had people that really wanted to get into the middle of it, and they would look at every character and perhaps even jump behind a virtual bush to hide, while others might hang back to observe the scene, whether due to fear or other emotions that came up,” says Alexander. “The overall sense that I got was that people really understood what Alejandro was trying to say. They heard it, and they understood what he was trying to express through that story.”

The studio’s debut with The VOID, Star Wars: Secrets of the Empire, also took place in 2017. At The VOID, up to four fans would suit up with their gear: a VR headset connected to a backpack laptop and a haptic vest. From there, teams of fans were immersed in ILM’s digital world; in this case, Secrets connected to Rogue One: A Star Wars Story (2016), giving fans the adventure of a lifetime on Mustafar near Vader’s castle. While infiltrating an Imperial base, they would traverse the facility together and try to recover a key artifact.

Dropping Into the Story

“If we’re creating an experience, we want people to feel like they’re genuinely in a Star Wars project.”

– Ben Snow

Vader Immortal: A Star Wars VR Series

While working on Secrets of the Empire, Ben Snow (visual effects supervisor for Star Wars: Attack of the Clones [2002] and Iron Man [2008]) was recruited to work on a connected story that was in development for home use, Vader Immortal. In the project’s early days, a prototype was put together to see what it was like to be in the same (virtual) space as Darth Vader – spoiler: it’s terrifying. At the same time, Oculus was quietly working on the first Quest headset, revolutionary for its tetherless design, which ultimately influenced the amount of lightsaber play in the story. The stars and companies aligned, and Oculus Quest became the platform partner for release.

In Vader Immortal, fans take the role of an unnamed pilot who finds themself inside Darth Vader’s castle on Mustafar. The fan’s interactions with Vader were, of course, key to the success of the project. “Mustafar should be scary,” notes Snow. “The confrontation of meeting Vader should be scary. Because that’s what he is.” The Immortal team used scans of Vader’s costume from Rogue One (based on the original 1977 film, Star Wars: A New Hope) and built on them with new scans to push the realism even further.

Concept art from Vader Immortal: A Star Wars VR Series by Russell Story.

The team worked internally with Lucasfilm to develop story ideas for three episodes. David S. Goyer, screenwriter of The Dark Knight Rises (2012), wrote scripts around them, injecting his own characters. These were similar to traditional film scripts, which then had to be made more interactive by adding lines of dialogue prompting the fan to perform certain tasks. The production brought together the film and games worlds as they put it all together. “In film visual effects, you get a script, you break it down. These are the assets you have to build,” explains Snow. “Interactive entertainment is much more free form and evolutionary. It was an interesting blend between those two mediums.”

The goal with Immortal was always the same: create an experience unique to virtual reality that you couldn’t experience by watching a movie. “One of the things that excited us,” says Snow, “was this was a chance to eavesdrop on Vader a little bit. We had the moment where Vader takes off his helmet, and he’s looking at a memory, almost, of Padmé. You’ve been climbing around, find yourself in Vader’s chamber, and you’re peering through these walls at him. We felt that moment of actually being an interloper, and seeing a side of the character you hadn’t seen before was something that was unique to what we could do in VR.”

Another element distinct to Immortal was making Vader the fan’s own teacher during the experience. The Sith Lord’s introduction is fittingly terrifying for such an iconic character. Initially, when Vader stepped up to the fan, he had a few lines of dialogue. But those lines were eventually cut after some internal tests of the experience. “Vader’s in the distance, and he comes toward you, and you hear the heavy breathing and footfalls,” says Beck, “and he keeps walking toward you, and it becomes more and more intimidating. Almost no one heard the dialogue because you’re so overwhelmed by his presence that it’s all that you can absorb.” Adding to the power of that moment was the eye-tracking in the experience, so no matter the height of the fan, Vader was looking right at you. “And the fact that you’re being acknowledged by a character like Vader is just mind-blowing,” adds Beck.

Actor Maya Rudolph (left) performs the voice of ZOE-3 in Vader Immortal as director Ben Snow (middle) and writer and executive producer David S. Goyer look on.

What If…? – An Immersive Story

Shereif Fattouh came to ILM from an AAA games (high-budget, high-profile games from large studios) background, working on titles like Battlefield and Dead Space at Electronic Arts. Interested in story-driven projects, Fattouh worked on The VOID projects Ralph Breaks VR (2018) and Avengers: Damage Control (2019). The development of a new headset, the Apple Vision Pro, led to Fattouh’s involvement in What If…? – An Immersive Story (2024), an experience that uses both mixed reality and virtual reality in addition to hand and eye-tracking through Apple’s innovative headset technology.

Marvel Studios’ What If…? series gave the developers a great amount of freedom in one of the most popular story worlds on the planet – the Marvel Cinematic Universe. “What If…? is such a great vehicle from the comic books and then to the animated show, where you get to just play in a sandbox,” says Fattouh. “What if this happened, and it’s a completely different version of it, and that kind of creative freedom just allowed us to tell the story that we wanted.”

Similar to previous ILM projects, the What If…? team was working on a project without the tech they would need to bring the experience to fans, as the Apple Vision Pro was being created in parallel. “We started development really early on,” says Fattouh. “It was a great collaboration with Marvel Studios, Disney+, and Apple, but we were definitely doing early, early testing and kind of figuring it out as we went.”

A final frame from What If…? – An Immersive Story.

What If…? – An Immersive Story took about 18 months from the conception of this particular idea as an experience to arriving in fans’ hands. Getting there involved finding the balance between the fans watching the story unfold and directly engaging with the characters and environments. “It’s really subjective,” says Fattouh. “There’s no right answer. How much do we want the audience to really observe this amazing story that’s being told and being kind of talked at versus going in and doing things and impacting the narrative? So that was really one of the biggest challenges throughout the whole life cycle. Playtesting it and figuring out, ‘Okay, is it feeling right? Is this beat too long? Is it too short? Do we want to have people jump in and get into the action a little bit faster?’”

During What If…? – An Immersive Story, the Watcher enters the room where a fan is situated. Throughout the story, fans see and interact with versions of some of their favorite Marvel heroes and villains, including Wong, Thanos, Hela, and Wanda. Fans are active participants in the story and get to use iconic items from the Marvel universe, like the Time Stone, to move the story forward.

Fattouh also notes how What If…? gives fans a unique way to experience a familiar Marvel moment near the beginning of the experience. “You don’t really know what’s going on because it starts with a disembodied voice, and you’re in space,” says Fattouh, “and then we kind of kick off in a very Marvel way, where it has that iconic Marvel logo flip book entry. But we did a very 3D spatialized version, where it’s coming into your living room. Just getting to see the smile on people’s faces when they saw something they’ve seen a lot in the films, but to see it really coming out in your living room … it set the right tone of, ‘Oh, this is something different.’”

Marvel director Dave Bushore (center) confers with Immersive crew members during production of What If…?, including: Maya Ramsey, Patrick Conran, Marissa Martinez-Hoadley, Indira Guerrieri, and Joe Ching.

What the Future Holds

After ten years, the team remains small, retaining its nimbleness on a quest for innovative excellence. Working with multiple partner studios and collaborators, the immersive team staggers projects, typically with two in production at a time and a production timeline of 12 to 24 months. “I think over time, our goal will be to expand that capacity and capability,” says Beck. “It might mean expanding it in other studio locations – maybe in London or in Vancouver. The size of the team we have is really nice because everybody knows each other. We can iterate together, and that’s a really important part of interactive, immersive experience development.”

The immersive team has high hopes looking to the future as the technology reaches a wider group of people. “Venues like Star Wars Celebration are always amazing,” says Peng, “because the technology is still growing, and it gives us a chance to share our stories directly with fans. It’s also rewarding to see the accessibility of our experiences making it feel entirely organic and inclusive for everyone.”

Beck looks forward to hands-free AR glasses that can deliver a high-fidelity image with a wide field of view. “We are very excited about this idea of storyliving at city or world scale,” says Beck. “Geo-located content where you could be out in the world in your glasses and little story moments would unfold in the real world.” Beck also sees more people who don’t consider themselves gamers gravitate towards immersive stories. “And I think that’s really great for us because we’re interested in that intersection of story and interactivity and putting you at the center of that experience.”

ILM’s team on What If…? won an Emmy for Outstanding Innovation In Emerging Media Programming. From left: Elizabeth Walker, Ian Bowie, Lutz Latta, Marvel’s Dave Bushore, Vicki Dobbs Beck, Mark Miller, My-Linh Le, Julie Peng, Pat Conran.

Looking ahead, the future of immersive stories is limited only by the imaginations of writers, designers, and engineers devoted to bringing these experiences to audiences. “I think that there’s a huge opportunity for ILM in immersive entertainment broadly defined,” notes Beck. “When we first started, the word ‘immersive’ almost always meant virtual reality, then it included augmented reality, and eventually mixed reality. But now, it’s being used to include linear content or pre-rendered content, but that’s very immersive through screen technology, like Abba Voyage (2022), as an example. The opportunity is to take our talents across the global studios, which include the highest quality visuals and sound, and couple that with the real-time understanding and capability, bringing those things together. I think that we’re going to start to see an increasing desire for interaction, where you are actually in an experience, doing something meaningful that makes the overall experience even more personal. And beginning to understand what that is and taking steps toward a storyliving future. I think that’s the big opportunity for ILM.”

Currently in development in partnership with Meta Quest is Star Wars: Beyond Victory – A Mixed Reality Playset, which takes players into the fast-paced, high-stakes life of a podracer.

Read more “ILM Evolutions” stories here on ILM.com.


Amy Richau is a freelance writer and editor with a background in film preservation. She’s the author of several pop culture reference books including Star Wars Timelines, LEGO Marvel Visual Dictionary, and Star Wars: The Phantom Menace: A Visual Archive. She is also the founder of the 365 Star Wars Women Project, which includes over 90 interviews with women who have worked on Star Wars productions. Find her on Bluesky or Instagram.

ILM visual effects supervisors Mohen Leo and Scott Pritchard, along with members of their talented crew, discuss the process behind building the TIE Avenger as it journeyed from concept to screen.

By Jay Stobie

(Credit: ILM & Lucasfilm).

For many Star Wars enthusiasts, the word “avenger” conjures up images of Captain Needa’s Imperial Star Destroyer Avenger, the TIE Avenger starfighter featured in Lucasfilm Games’s Star Wars: TIE Fighter (1994) video game, or even Marvel’s prestigious superhero collective. The second season of Andor (2022-2025) has now pushed its own TIE Avenger to the forefront of that list, as the epic series chronicled Cassian Andor’s (Diego Luna) theft of the prototype craft from a Sienar Fleet Systems test facility. Outfitted with advanced armaments and a hyperdrive, the TIE Avenger transported Cassian to Yavin 4 before playing a key role in rescuing Bix Caleen (Adria Arjona) and Wilmon Paak (Muhannad Ben Amor) from Imperial forces on Mina-Rau.

Industrial Light & Magic’s Mohen Leo, whose resume boasts projects like Ant-Man (2015), The Martian (2015), and Rogue One: A Star Wars Story (2016), served as Andor’s production visual effects supervisor for both seasons of the series, while ILM visual effects supervisor Scott Pritchard (Star Wars: The Force Awakens [2015], Avengers: Infinity War [2018], Avengers: Endgame [2019]) oversaw the creative output of the work across ILM’s global studios in London, Vancouver, and Mumbai. Leo and Pritchard gathered alongside CG supervisor Laurent Hugueniot, modeler Owen Rachel, texture artist Emma Ellul, look development artist Renato Suetake, animation supervisor Mathieu Vig, and compositing supervisor Claudio Bassi to chart the TIE Avenger’s course from conceptualization to the completed sequences seen in season two.

Constructing the Concept

The TIE Avenger prototype is first unveiled at the beginning of season two, resting in its Sienar hangar bay before being commandeered by Cassian and embarking on a dramatic escape. “The idea for the opening sequence began with [showrunner] Tony Gilroy wanting to start season two off with a big, classic Star Wars action sequence,” Mohen Leo tells ILM.com. “That initially came out of an outline that Tony gave us in 2022. Early on, a big story point became that Cassian doesn’t know how to fly it, so the Avenger had to have completely unfamiliar controls, and the interior had to look different from any TIE fighter or ship that you’ve ever seen before.”

When it came to the Avenger’s look and layout, Leo worked closely with production designer Luke Hull. “Luke explored various prototype airplanes, and then we played around with the idea of what the ship needed to do in terms of the chase sequence. We wanted something that wasn’t just a dogfight. If he immediately jumps in and it’s just a chase, it’d be difficult to do something original with that,” adds Leo. “Luke had already decided to build a full-sized practical TIE Avenger. As far as its physical construction, the wings were inspired a bit by the TIE interceptor.” While Andor’s Avenger shares a name with the craft from the TIE Fighter game, it maintains its own design lineage. “I don’t think the previous ship was a strong influence,” Leo begins. “When we were designing our Avenger, how the ship functioned became something Luke and I reverse-engineered based on what we wanted the ship to do. That dictated the look.

(Credit: ILM & Lucasfilm).

“I put together a pitch deck for the Avenger before we got into previs when the directors weren’t even on yet,” Leo continues. “I wanted to make sure that the ship and the weaponry, in particular, were based on real weapons and felt both dangerous and aggressive.” The team based the TIE’s fold-out Gatling-like cannons on the United States military’s M61 Vulcan rotary cannon. Hull wanted the Sienar base itself to feel like a Skunk Works test facility or NASA’s Jet Propulsion Laboratory.

“It was a back-and-forth between previs and Luke Hull and the art department in terms of the ship’s design,” Leo notes. “We’d say, ‘We need guns that fold out,’ and then Luke would go, ‘Let me see where we can fit those in.’ It was almost as much driven by the necessity of the functionality as it was by the aesthetics.” Once the additional armaments, including external launchers and a powerful cannon below the cockpit, were set, the team pitched the action sequence to Tony Gilroy. “We blocked the whole sequence with a temp model, and Tony generally really liked it.”

Assembling the Avenger

“As the studio-side visual effects supervisor, I was involved in pre-production through to the end,” Leo shares. “I also guided the previs development with The Third Floor’s Jennifer Kitching and collaborated with Luke Hull on how we would make the practical build service what we needed to do in the visual effects sequence.” The creative dialogue between the ILM visual effects team and the production designer was vital in ensuring that the computer graphics (CG) starfighter built by ILM would be identical to the full-sized practical ship that was constructed to be used on the Sienar hangar, Yavin 4, and Mina-Rau sets.

“Because we had a practical version of the Avenger during the shoot, we were able to scan that and provide lots of references,” Leo details. As such, the full-scale Avenger proved beneficial for Owen Rachel, the ILM modeler responsible for building the CG Avenger. “My job was to take [the practical model] and replicate it as a digital version. There wasn’t much design work that we had to do other than replace some structural bits, like the Gatling guns as they come down,” Rachel outlines. “We did have to create the laser cannon that comes out from underneath the cockpit. It’s an intricate design because it’s both delicate and powerful at the same time. It felt a bit like the inside of a watch,” recalls CG supervisor Laurent Hugueniot, who was in charge of the team’s 3D output.

The practical TIE Avenger prop on set at Pinewood Studios (Credit: ILM & Lucasfilm).

Texture artist Emma Ellul tackled the job of texturing the TIE Avenger. “We had great reference images, which was super helpful. I tried to focus on real-life objects, too, such as stealth planes. They’re quite smooth and angular, and I’d see how specularity affects the metal. Not every sheet of metal is made the same, so it has a slightly different bend or warping to it. I incorporated that, especially on the paneling on the outside of the wings,” Ellul relays. “There were a lot of nooks and crannies to look at and a lot of small decals everywhere, which I had to match one-to-one with the practical model. It was an awesome asset to texture. Who doesn’t want to texture a TIE fighter? And then I had to destroy it and chuck laser blasts all over it [laughs].”

Look development on the Avenger was handled by Renato Suetake, who asserts, “As a look dev artist, I get the model and textures so I can put them together and make the shaders. I make sure the shaders and materials react precisely like the reference we have in any situation or lighting condition. Certain shots jumped between the prop filmed on set and the CG version, so the digital Avenger had to be identical. At the same time, because the prop wasn’t made of metal, we still had to make it believable as an actual spaceship that flies.” From the Avenger’s weapons to the hangar explosions to the collapsing ice arch, Leo also credits the effects artists who contributed to the sequence, revealing, “In general, all of those things are just massive, complex simulation work.”

Pairing Computer Graphics and Practical Effects

As Andor’s ILM visual effects supervisor, Scott Pritchard helmed his team at ILM’s London studio, while coordinating the work at ILM’s studios in Vancouver and Mumbai. When it came to the Avenger’s breakout from the Sienar hangar, Pritchard observes that the production sought to use the “best tool for the job,” often pairing ILM’s CG expertise alongside special effects supervisor Luke Murphy’s practical effects. Highlighting a shot where an Imperial range trooper takes aim at the Avenger, Pritchard beams, “There’s a huge staged explosion along the back wall that was done by hanging a line of charges. They explode in sequence, so they explode outwards from the center. That in itself is impressive because it gives us some great visual reference to go on and actual practical elements to incorporate into the final comp.

(Credit: ILM & Lucasfilm).

“There’s a significant amount of work involved in painting out the actual charges and all the little fragments that get blown off properly, as well. It gives you such a great base to work off when you’re putting together a shot like that,” continues Pritchard, who then shifts focus to the Avenger’s weapons blasting through the hangar. “A lot of these explosions are practical, but we’ve enhanced them by adding sparks and additional explosions in CG. Making all this work seamlessly is a great testament to the comp and effects team.”

Compositing supervisor Claudio Bassi concurs, believing that the practical effects supplied ILM with valuable reference for lighting purposes. Bassi also states that, despite the presence of the practical Avenger, the TIE’s wings during Niya’s (Rachelle Diedericks) inspection are actually CG. “The hangar set didn’t have a roof, so we often replaced the wings as it was easier to integrate the CG roof.” Although the hangar set was extremely large, Pritchard highlights the fact that ILM had to extend the hangar to an even greater width, and the entirety of the front section that opens toward the snowy exterior is also CG.

Maintaining a Match

Of course, having both a practical and digital Avenger presented its own challenges when it came to assuring that the details matched, particularly regarding how much damage the CG version sustained at Sienar in comparison to the practical model that was filmed on the Yavin 4 set. “Working with the art department, we knew that the Avenger had gone through the dogfight at Sienar base and should have scorch marks where lasers had hit it,” Leo remembers. “We counted the number of rockets it fired in the first sequence because there’s no way for him to restock in space. We took four specific missiles off the practical build, which we then reversed in digital effects to choose the four that he fires in the opening sequence.”

Hugueniot emphasizes that the same held true for the havoc wrought upon the Sienar hangar itself, commenting, “Not all shots are worked on one after another in story order. There’s a big job of keeping track of what’s going on in every shot. All the scorch marks on the walls, everything that’s been knocked down from the ceiling, and which lights are working in each shot. That was quite a job [laughs].” Bassi agrees, divulging, “We kept track of the marks where the Avenger scratched the floor and which lights broke, and we had a system to recognize them in shot order.”

(Credit: ILM & Lucasfilm).

That level of realism was reflected in animation supervisor Mathieu Vig’s mission to make the Avenger look “heavy and dangerous, and as if it’s made of metal, not just pixels.” Having the Avenger scrape along the deck helped achieve this. “We’re used to seeing them [TIE fighters] flying very gracefully,” Vig explains. “Usually, we don’t animate them bumping around, so setting the weight is harder than you might think. In a hangar setting, there are so many physical elements to consider, such as the actor in the cockpit doing specific movements that we have to take into account. All of this is a carefully interlocking puzzle.”

Diving Into the Details

While analyzing the projectiles under the practical Avenger’s wings to model them for its digital counterpart, Owen Rachel recognized an intriguing connection. “When we were trying to work out how the missiles fire, we realized the design on the set was similar to those seen in Star Wars: A New Hope [1977], as they go into the exhaust port on the Death Star,” conveys Rachel. As it turns out, the design was a one-to-one match, so Rachel subsequently modeled the Avenger’s projectiles after the proton torpedoes that Luke Skywalker (Mark Hamill) fired at the first Death Star. Effort was also invested in preventing the Avenger’s Gatling-style spray of laser fire from appearing as though it simply hovered in mid-air. “We gave them a bit of an offset and some randomness in both their position in the stream and in their x- and y-axis, so we could get more chaos into that stream,” Pritchard elaborates.

Even the red light that flashes along with the hangar’s klaxon alarm was not nearly as simple as one might assume. ILM had to maintain a perfect rhythm between the flashing lights and klaxon, occasionally analyzing where the visual effects team needed to extend or shift the red light so it all remained in sync. “Compositing is basically the last step in the visual effects pipeline that ensures that all elements are integrated and the CG matches with the plate that has been shot on set,” describes Bassi, who worked alongside Pritchard to successfully pitch the idea that the overall light energy of the hangar would get progressively darker and moodier as the Avenger knocked lights off the ceiling.

A live-action plate (top) was captured on the set with practical explosion elements, which were later integrated with ILM’s work (Credit: ILM & Lucasfilm).

Matching the interior views of the practical and CG Avenger cockpits proved to be another challenge. “ILM’s Vancouver studio did a hologram of Cassian in season one that was absolutely fantastic and so lifelike. We reused that for Cassian’s head,” Hugueniot recounts. Filming Diego Luna in the practical cockpit occurred toward the end of the shoot, as Leo clarifies, “That was the very last thing we shot on the whole project—Diego in a motion base cockpit that we could move around and rattle. I think Diego had a really good time shooting those bits [laughs].”

In Awe of the Avenger

With season two now streaming on Disney+, the visual effects team has been able to view the completed episodes and finally share their work on the TIE Avenger prototype with the world. “I couldn’t even tell which Avenger was CG and which wasn’t,” laughs Emma Ellul, referring to the Sienar sequence. “The blend between the hangar, the ship taking off, and the chaos unfolding is so seamless. It was such an exciting bit to watch.”

The audience’s overwhelmingly positive response to the opening scenes has been equally uplifting for ILM, as the sequence fulfilled what the team set out to do. “When we were talking in previs, part of the idea was that it should feel breathless. Every time Cassian has solved one problem, the next problem comes up. There should never be a moment for him to relax until it’s over,” says Mohen Leo, who praises the sound design provided by Skywalker Sound. “One of the things that always happens after we’re done but makes such an impact is the sound. The sound that Skywalker put to the engines, weapons, and all of that makes such a huge difference.”

The TIE Avenger’s action-packed escape consists of a relatively small amount of screen time—but as Leo and his team have outlined, ILM imbued each facet of the Avenger and its accompanying environments with an extraordinary amount of time, energy, and expertise. So, the next time you rewatch Andor, don’t be afraid to press pause amidst the thrilling moments and soak up the wondrous work that allowed ILM’s Avenger to ascend to the stars.

Discover more about the visual effects of Andor on ILM.com:

“Like Eating an Elephant One Bite at a Time”: TJ Falls and Mohen Leo on the Visual Effects of ‘Andor’ Season 2

“Let the Experts Be the Experts”: TJ Falls and Mohen Leo on the Visual Effects of ‘Andor’ Season 2

Read more about the making of Andor on StarWars.com.

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

The final part of ILM.com’s discussion with Andor‘s overall visual effects producer and production visual effects supervisor covers the influence of Rogue One, collaborating with Tony Gilroy, their favorite moments from season two, and more.

By Mark Newbold

In part one of our conversation, TJ Falls (vice president of visual effects at Lucasfilm and Andor’s visual effects producer) and Mohen Leo (Andor’s production visual effects supervisor) discussed location shooting and the logistics of bringing Andor (2022-25) to audiences worldwide. Now, we continue our dive into the Emmy-nominated second season and the teamwork required to shepherd the story from the page to the screen.

It takes an army to bring a film or TV series from the imagination of the writers to screens around the world, and that means teamwork is key, as Mohen Leo explains.

“This project was somewhat unique in terms of how collaborative people were. You have certain projects where the director’s attitude is, ‘This is what I want, I don’t care how you do it, just figure it out.’ This was a case of everyone collectively understanding that we were trying to get as much value on screen as possible. That meant I could go to [editor] John Gilroy and say, ‘Hey, that choice you made will cost a lot of money; is it really worth it? It doesn’t feel like this is where we want to put all the effort.’ There were specific instances where he would say, ‘Okay, give me some time, I’ll have a look. If there’s a different way to cut this, I’ll let you know.’ Sometimes he came back and said, ‘Yep, I’ve managed to get rid of the shot and it feels just as good.’ That allows us to take those funds and put them elsewhere to make something else bigger and more exciting.”

Alan Tudyk (top) performs as K-2SO with motion capture (Credit: ILM & Lucasfilm).

Flexibility, trust, and an understanding of team dynamics meant that the Andor team could make required adjustments and pivot, making the most of the skills at hand and sharing the load across departments, something that started at the very beginning of the series.

“That comes from the partnership we had through season one to season two,” TJ Falls notes. “It was an intentionally designed paradigm between [showrunner] Tony Gilroy and our producer, Sanne Wohlenberg. Our brain trust [Tony Gilroy, John Gilroy, Wohlenberg, Falls, Leo, and production designer Luke Hull] was involved in every key decision. As the show moved from start to finish, we were involved in those conversations, so it wasn’t the top brass dictating what the need was; it was a collaboration of ideas to make sure that it was the best version possible for Tony.”

Leo shares an example. “The Yavinian doodar, the creature at the end of episode two [‘Sagrona Teema,’ directed by Ariel Kleiman], that came through the trees, snatched the two rebels, and carried them off into the jungle. There was a lot of handwringing at the beginning because, from a visual effects perspective, you question whether we really want to build a fully computer graphics creature just for one shot. That’s a big ask.

“It’s also in the back of your mind that it’s going to turn into something much bigger,” Leo continues. “Then you try to cover yourself to make sure that it works for all these other things, but throughout, Tony kept saying, ‘I just need the one shot.’ Tony wants this, but we can’t spend too much money on this creature, so how do we make it possible? Ultimately, everyone worked together with the director [Ariel Kleiman], the director of photography [Christophe Nuyens], and the editor [Craig Ferreira] to make it possible to have this creature in there for one shot, and it worked out great. On many other projects, you would have abandoned it because there would have been this fear that it spirals out of control.”

Green screen was utilized at an exterior location in the United Kingdom to portray a view on Coruscant (Credit: ILM & Lucasfilm).

Within the ILM team, there are numerous creatives who have worked on Star Wars projects. Falls previously worked on Star Wars: The Force Awakens (2015), Rogue One: A Star Wars Story (2016), Solo: A Star Wars Story (2018), and Star Wars: The Rise of Skywalker (2019), while Leo worked on Star Wars: The Phantom Menace (1999) and Rogue One (as well as diving into immersive experiences with 2019’s Vader Immortal: A Star Wars VR Series) before arriving at Andor. That familiarity with the galaxy far, far away was a huge boon for the production and the consistency of its tone, as Leo explains.

“In a number of situations, we were the first stop in terms of Star Wars lore, where Tony would ask, ‘How do I do this kind of thing in Star Wars? How does that work in Star Wars?’ We’re able to help there. Same with Luke Hull. However, knowledge of our own world was equally as important for a show as grounded as Andor. For me, a big part of it was using things that were not just from Star Wars, but from other films, documentaries, and news clips.

“There’s a shot in episode three [‘Harvest,’ directed by Kleiman] where we see the troop transport on Mina-Rau and the TIE fighter appears low behind it,” Leo continues. “There’s a flyby as they’re all looking at it. Watching reference of Apache helicopters was one of our inspirations, and I found this incredible shot of a troop transport driving through the desert, and out of the dust cloud came this helicopter, which goes right by them. I showed it to the director and I said, ‘Can we do this shot?’ And Ariel was like, ‘Oh yeah, absolutely. Let’s do it.’”

Sometimes, as in this case, previsualization from reference material is a huge part of the process, giving form to the action and allowing the production to have a rough version of the episode to build from.

“Tony and the director would quite often call on the visual effects team to pitch ideas for shots,” says Leo, “so Jennifer Kitching, our previs supervisor at The Third Floor, would dig around for reference and try things out. Even on big establishing shots, we were always trying to find something real, so [ILM visual effects supervisor] Scott Pritchard and I identified shots of Manhattan, Tokyo, and Hong Kong, very specific shots. Then we could determine the feel of the shots, but we’re ultimately doing it on Coruscant rather than New York or Hong Kong.”

Palmo City’s central plaza on Ghorman was shot as a backlot set at Pinewood Studios with digital extensions (Credit: ILM & Lucasfilm).

Famously, in the 1970s and 80s, ILM used traditional matte paintings to establish new locations, the principle of which is still at the core of creating an establishing shot. The technique, however, now resides in the digital realm, often with the aid of live-action background plates. Andor treats viewers to a number of establishing shots on Ghorman and Coruscant, a process that takes a considerable amount of time and effort, depending on the requirements of the shot.

“If you’re travelling through the digital location and have a bunch of different angles on it, we will build a full 360-degree environment,” explains Leo. “But if it’s for a single shot, we may do it as a bespoke shot. What worked really well on season one and into season two was that we based things on real cities. You can find open-source 3D street maps of Tokyo or New York, and we would basically fly around and find an angle and think, ‘Okay, that’s a cool angle; this feels organic. Now take that but replace all of the buildings with Coruscant buildings.’ You end up with something that feels organic and real.”

It’s an approach Leo picked up from the director of Rogue One.

“I have to give credit for having learned this approach to Gareth Edwards,” he explains. “When we built the city of Jedha, we had blocks of neighborhoods based on layouts from parts of Morocco. Then, Gareth asked us to do something which seemed really strange at the time. He said, ‘I want you to drop 300 random cameras into the CG model of the city, anywhere you like, and show me the pictures.’ We wrote a script that dropped a camera at every intersection and then rendered a view in every direction. We sent those to Gareth, and he picked from those, the idea being that rather than artificially building a view to the camera, you would scout the artificial city just as you would scout a real location and go, ‘I found this really cool angle here.’ That way, you ended up with compositions that felt much more interesting than if you simply asked someone to put some buildings in the background.”

Elements of the City of Arts and Sciences in Valencia, Spain were utilized for Coruscant (Credit: ILM & Lucasfilm).

Coming from a visual effects background as Edwards does – as most recently evidenced by his work with ILM on 2025’s Jurassic World Rebirth – his fluency in the language of visual effects gave the crew a tremendous advantage on Rogue One.

“What I really appreciate about Gareth is that he has this really disciplined approach to thinking about whether you could have done a visual effects shot in the real world, and would it have felt the same way?” Leo continues. “I took that forward into our approach to Andor. Quite often, if we did a layout of visual effects and everything fit neatly into frame, Gareth would say, ‘That feels artificial. Make it so that something uncomfortably sticks out of frame, and as the shot progresses, I have to pan and tilt from one thing to the other, because both of them won’t fit in the frame at the same time.’ That makes it feel real and organic. I always appreciate how much I learned from Gareth about shot design.”

“That ethos fits really well with the aesthetic that Tony wanted for Andor,” adds Falls. “It was a constant conversation that we would have with our directors and DPs to make sure things weren’t too pretty. Mohen would often say, ‘It’s too clean, how do we make it look not as good, so that it looks even better?’ That was a lot of fun, and it really maintained that beautiful look from Rogue One as part of a storytelling thread that was done visually.”

With Andor now delighting viewers both old and new, the look and feel of the show has become one of its most celebrated talking points. Given that, could – or should – that aesthetic be carried over to the next Star Wars project or not? Mohen Leo has his own thoughts on that.

“Andor took inspiration from Rogue One, but Rogue One is primarily a war film, whereas Andor is a spy drama, so Luke Hull made the aesthetic even more grounded. What does this world look like from the perspective of an ordinary person who lives in it? I certainly hope that as we move forward, every project develops its own look. I wouldn’t want everything to look like Andor because that would be boring. The joy that I got out of working on this series is that it proves that you can make something that looks very different but still feels like it belongs in the same galaxy, so I hope that future projects will strike out in different directions and try different things.”

Cassian’s first encounter with the deadly Imperial security droid on Ghorman, later reprogrammed as his companion, K-2SO (Credit: ILM & Lucasfilm).

Understanding that each project needs and deserves its own visual identity within the broader Star Wars galaxy has changed the people who worked on Andor. As he moves on to new projects, TJ Falls reflects on one of the most important lessons he learned from the show.

“It was Tony and Sanne who said, ‘Let the experts be the experts.’ Because of that, every artist felt incredibly valued, and their contribution was fully appreciated. That message was constantly sent down from Tony: Let the best idea win. If an artist had something cool to contribute, it made its way up and it was credited and talked about.”

“The thing that I hope we can take forward is collaboration across departments,” says Leo. “Andor was completely generous in that everyone would happily let someone else do something if it made the result better, so there were no fiefdoms. We were able to put so much value on the screen because every problem was solved together, and that’s something that comes from the top down, in this case, from Tony.

“When there isn’t a sense of shared ownership and a clear creative direction, sometimes the frustration can trickle all the way down through the process,” Leo continues, “not just through the shoot and on the client side, but into the visual effects work and with the artists. Someone can change their mind at any minute and tell you to do something differently. For me, that was the most positive difference about Andor. It’s a culture that says there are no egos; it’s not about anyone standing out and making a name for themselves; it’s all about the collaboration. So I hope that’s something I encounter again on future projects.”

Tudyk performs as K-2SO alongside a final frame created by ILM (Credit: ILM & Lucasfilm).

When pondering each of their favorite moments in season two, Leo is quick to answer, “Cassian stealing the TIE Avenger and escaping in episode one [‘One Year Later,’ directed by Ariel Kleiman] was certainly the one I was involved in the longest, all the way from the beginning pitching storyboards for the action, right down to it being the last thing we shot with Diego.

“What I like about it is that on the one hand it’s a very classic Star Wars sequence with the spaceship and a dogfight,” Leo continues, “but we found a way to still make it fit within Andor by designing it in a way that it starts very practical in a real hangar with a real ship and stunts, and bit by bit we transition into something that’s pure computer graphics. But it all fit into the style of the show, so I’m really pleased with that.”

Falls is equally quick to respond. “My favorite is the opening shot from episode eight [‘Who Are You?’ directed by Janus Metz], which is a long lens establishing shot of Ghorman, orbiting around the city. We had Hybride [Ubisoft’s visual effects branch] working on the look of the plaza, and ILM took on the high establishing shots of Ghorman from the air. The shot went through a number of rounds of honing it into what Mohen was thinking. Pitching the idea without the visual was tricky, trying to get everyone to understand what it was that was being described. Once we started to get the visuals into motion with previs, it started to click with everybody. Then it came alive in shot production. I think it’s absolutely gorgeous.”

Mark Newbold has contributed to Star Wars Insider magazine since 2006, is a 4-time Star Wars Celebration stage host, avid podcaster, and the Editor-in-Chief of FanthaTracks.com. Online since 1996. You can find this Hoopy frood online @Prefect_Timing.

Read part one of our conversation with TJ Falls and Mohen Leo about Andor on ILM.com.

Read more about Andor on StarWars.com.

In part one of a two-part story, the production’s visual effects producer and visual effects supervisor discuss the effort to create over 4,000 effects shots for the Emmy-nominated Lucasfilm series.

By Mark Newbold

“It was a good opportunity to expand our horizons,” says TJ Falls, vice president of visual effects at Lucasfilm, about the team’s work to create a grounded aesthetic for both seasons of Andor (2022-25). After Rogue One: A Star Wars Story (2016) established the tone for the adventures of Cassian Andor (Diego Luna), the Andor production opted to utilize a number of existing locations for filming in the United Kingdom and around the world. It was a tactic previous Star Wars productions also chose (for example, 1999’s Star Wars: The Phantom Menace traveled to Italy and the Caserta Palace for the interior of the Theed Palace on Naboo), but integrating these locations to such a degree was something new for Industrial Light & Magic, a choice Falls appreciates.

“It allowed us to go out in the world and find a real base reference,” explains Falls, who was also the overall visual effects producer for Andor. “That was something the team worked hard to capture. We’re actually there in the city or in the mountains, so it was wonderful to be able to tie real-world locations into our digital work.”

The debut season of Andor leaned heavily into this physical integration. But, with a very real-world, global pandemic happening around the production, season one had its international travel wings clipped, as Falls explains.

“We couldn’t travel, but we still managed to gather reference material, including some for the ship-breaking yards on Ferrix. For season two, we were fortunate enough to finally be able to travel, so we flew to Lake Como and the Italian Alps to capture plates for Ghorman, among other locations.”

The Mothma estate on Chandrila utilized aerial plates shot in Spain (Credit: ILM & Lucasfilm).

Joining Falls, production visual effects supervisor Mohen Leo picks up the conversation.

“Being able to travel to Spain for a variety of locations on season two allowed [production designer and executive producer] Luke Hull to rely much more heavily on the look of existing locations that were compatible, particularly the Senate building. Once we did the first location scout at the City of Arts and Sciences in Valencia, we were looking around, thinking, ‘Wow, it looks like Coruscant already.’ That made a huge difference, having that basis, both for interior and exterior spaces, so we could then use visual effects to build on and make it feel like Star Wars.”

The practicalities of having a ready-built set in the form of an existing building clearly had their benefits. Still, the broader task of adding visual effects presented its own challenges, as Leo explains.

“One thing I took away from the project is to push as much as possible for real locations,” he says. “Using an existing building during a shoot allows people to make informed decisions that stick, because if you have something that already looks 50%, 60%, or 70% the way you want it to, everyone has the confidence to say, ‘Okay, this is the frame that we want, and we understand that we’re going to put this building in the background.’ Also, you have the composition of the lighting and the weight of the architecture, which makes it much easier, rather than having a blank canvas in post-production and then debating what it should look like.

“For example,” Leo continues, “there were the mountains around Ghorman. A couple of people from the production team and I went to Italy and did a two-day helicopter shoot. We felt strongly that even those locations where we would never actually shoot with a full crew or with actors should be based very specifically on real landscapes. That allowed us to put the Star Wars architecture in there and have that foundation.”

With the tremendous amount of work required to bring these locations to life, the balance between real locations and visual effects is a delicate one, based on story requirements, budget, and time.

“When we go location scouting, I always ask the director of photography [for season two, Damián García, Christophe Nuyens, and Mark Patten], ‘What are we keeping from the location?’” says Leo. “Because there has to be value in us being there. We were on location in Spain, and a Coruscant scene was discussed, which involved two people standing by a railing, looking out across the fictional cityscape. If we’re going to replace the whole city, then we don’t need to shoot that in Spain. If you want that view, we can shoot that back in London on a green screen set because it’s easier, and we’ll have more control over the lighting. That, for me, is the main thing: having a clear idea when you go on location of what we keep from the location, and why we are there.”

The original location plate (top) shot at the City of Arts and Sciences in Valencia, Spain opposite the final shot (bottom) with the Coruscant skyline (Credit: ILM & Lucasfilm).

The use of natural light throughout the series is even more impressive when considering the balance between physical structures and digital extensions. Bathing the action in brightness or shadow, regardless of where and how it was shot, Leo explains, is how this integration was managed.

“We work very closely with the DP on that,” says Leo. “There are scenes where people walk directly from a stage set in London onto something that’s on location in Valencia. In the context of the story, it feels like one continuous location, even though they were shot months apart in two different countries. Obviously, we take lots of photographic reference. We have the plates of the one side at hand when we’re doing the other, and we’re constantly checking to make sure things fit together. On this project, we had a plan for each of those things before we went on location and shot it. We’re not trying to force things together in post; they’re meant to go together.”

“That’s exactly it,” adds Falls. “It’s the collaboration with the DP and lighting team, but also with previs, with techvis, and knowing that we’re going from studio space to location space. We had the opportunity to plan that out very specifically, each step of the way. And what helped us succeed is that we had a plan, and we were able to push it through to the best of each department’s abilities to deliver on it.”

Having a plan is essential to any well-run production, and on a visual effects-heavy series like Andor, it’s even more vital. Managing the process requires unique skills and systems to marshal all the information and elements into one place, as Falls explains.

“You’ve got to manage all these people and figure out who’s doing what, breaking it down to what the responsibilities of each person are. You start with something that’s massive, and we start to split things up between our teams and vendors. Ghorman is primarily a Hybride sequence; we’ve got Scanline VFX dealing with Mina-Rau, and we work with [ILM visual effects supervisor] Scott Pritchard to ask how we’re going to slice up this pie.

“It’s like eating an elephant one bite at a time,” Falls adds with a smile. “That translates from the production side into post and dealing with our vendors, and it’s all about clear communication, having people that you can build a shorthand with and have trust with, and then let them do what they do and not overmanage it.”

Actor Joplin Sibtain (Brasso) atop the speeder prop rigged to a camera vehicle (top) with a final frame from Mina-Rau (below) (Credit: ILM & Lucasfilm).

Truly a mammoth task, but that’s just the start of it. “Then, each individual team brings their expertise to build it right back up the mountain,” Falls continues, “so that Mohen has the opportunity to have that creative outlook over everything, I make sure it’s moving at the pace that it’s supposed to and that we’re hitting our schedule and staying on budget while making sure that [creator and showrunner] Tony Gilroy is getting what he wants for his vision of the show.”

There are many unsung heroes on any production, and amongst those are the production managers (including Frédérique Dupuis and Alyssa Cabaltera from ILM and Anina Walas from Lucasfilm, among others), who, on the visual effects team, juggle countless shots and give structure to the process for both the production and the partner studios. In its completed form, Andor might appear to be a graceful swan, but under the water’s surface, its legs are furiously kicking to propel it forward, as Mohen Leo elaborates.

“The visual effects production team has to keep track of over 4,000 shots, and each one of those shots has dozens and dozens of assets, be it art and reference or photography and scans, and they have to funnel all of that to where it needs to land and then send any questions back to me in a manageable way. I answer the creative questions. The logistical and organizational work is done by a team of incredibly diligent people without whom none of this would be possible.”

Along with this beehive of activity tracking all the elements, a database system, unique to each production, needs to be put in place.

“We find on each show that you have similar tool sets and similar ways of databasing things,” Falls says, “but you have to build it around the specific challenge of the show and the personalities involved. It’s about what Mohen likes and the types of data that we’re getting in.

“You have people like [on-set visual effects supervisor] Marcus Dryden, who was on set managing that side of things. His role was specific to season two, and it worked really well, that marriage of supervision responsibilities between me and our Lucasfilm production team and our production manager, and the coordinators building the database. That worked well for Mohen to get the notes in and out and track the scans and the data, but presenting it in forms that fit the specific way we were working with our vendors on this show. It wasn’t groundbreaking, but it was specific to what we needed.”

Palmo City’s central plaza on Ghorman utilized the massive backlot at Pinewood Studios (top), and was later completed with visual effects (bottom) (Credit: ILM & Lucasfilm).

The database is set up, a system is in place, production managers have a process, and the elements are tracked as they come in. “It’s absolutely critical because it gives me the luxury to say, ‘Hey, where’s that scan from that location that we shot in that scene six months ago in Valencia?’” explains Falls. “And within 10 seconds, somebody will go, ‘Here it is.’ That shouldn’t be taken for granted because I’ve been on many shows where that can turn into an archaeological dig that can take days, or sometimes you don’t find it at all.”

With this bespoke Andor structure in place for season one, Leo could then take that and refine it even further for season two, a huge advantage, especially considering episodic television wasn’t a familiar environment for him.

“Season one was a big learning experience,” explains Leo. “I’d never done episodic television before; I’d only done movies, so dealing with that much content in such a compressed time was challenging. Also, the interaction with editorial is slightly different on episodic television. With every project, there’s an element of adjustment, but, there’s also an element of learning.”

“We had the luxury of a number of production staff carrying over from season one to season two,” says Falls. “So we learned in real time and adjusted things to fit. You could port it, but it wouldn’t necessarily work as succinctly as it does when it’s crafted around the group, and for season two in particular, I felt that we ended up crafting a really great system. The team was unbelievably adept in making sure that every person got exactly what they needed as quickly as humanly possible.”

The script is the tramline for everything that ends up on-screen, but in the realm of visual effects and working with the rest of the crew, there needs to be a clear understanding of what’s required and how to do it, something that comes from the top, as Leo explains.

“When we’re planning a shoot, we sit down with the director, the cinematographer, and the assistant director and ask, ‘What are you trying to achieve, what do we need to contribute in terms of the visual effects, and how do we make sure we get what we need during the shoot?’ Then we take meticulous notes.” 

However, it doesn’t always go as smoothly as planned. “We’re staring at the monitor as they’re shooting, but then somebody drops the microphone into frame, so that’s something we have to paint out,” Leo continues. “Maybe we have to do a set extension that we didn’t expect. Then there’s a step in post-production where, along with editorial, we’re looking at the early versions of the cuts, and that’s where we do something called the statement of work, where we look at each individual shot and go, ‘Okay, here’s all of the things we need to do for this particular shot across the various disciplines in order to complete it.’”

An aerial view of the Ghorman set on the backlot (top) and final frame (bottom) (Credit: ILM & Lucasfilm).

Like all aspects of a production, visual effects come at a cost, with so many highly skilled experts putting their time and craft into a project. The team is responsible for both managing costs and ensuring that additional required effects can be covered within the allotted budget.

“There’s a constant ebb and flow of evaluation, so we work closely with editorial, seeing the working cuts,” Falls notes. “We go in with [editor] John Gilroy and they show us little pieces, and that allows the opportunity for some give and take as we evaluate things and look at shots and go, ‘Well, this is more than we had planned, or maybe there’s another sequence where they’re using less than what we had planned,’ and so there’s a little bit of horse-trading that happens.

“What we strive for,” Falls continues, “is to not say we can’t do something because it wasn’t planned. If there are 10 additional seconds needed in the show, how can we do it? Can we find a way that still delivers everything that’s needed, but also in line with the number of resources we allotted? Then, we’re back on budget, or I have to figure out how to take care of it, but we always start with what is the creative desire for the scene. How is it furthering the story? We don’t want anything that’s egregious or over the top just for the sake of being something flashy, so we have to make sure that everybody is in agreement that ‘Okay, it’s more than expected but it serves the story, it does what Tony needs, and now it’s our job to figure out how can we make it work.’ I think we did a pretty good job of that.”

Join us as we continue our conversation with TJ Falls and Mohen Leo to delve into the logistics of making Andor, the teamwork required to bring Cassian’s world to the screen, and their favorite moments from the second season.

Mark Newbold has contributed to Star Wars Insider magazine since 2006, is a 4-time Star Wars Celebration stage host, avid podcaster, and the Editor-in-Chief of FanthaTracks.com. Online since 1996. You can find this Hoopy frood online @Prefect_Timing.

SDCC’s hottest ticket brought the first-time Comic-Con guest and friends to preview the Lucas Museum of Narrative Art, and ILM.com was there in the room.

By Clayton Sandell

George Lucas takes the stage at San Diego Comic-Con (Credit: Lucasfilm).

Star Wars creator and Industrial Light & Magic founder George Lucas recently made his San Diego Comic-Con debut, but the Force has been strong at the show for decades.

Inside the convention center’s massive Hall H, a record Sunday crowd of 6,500 screaming and cheering fans greeted Lucas as he walked onstage to give the first public preview of the Lucas Museum of Narrative Art.

Co-founded by Lucas and his spouse, Mellody Hobson, the museum is set to open in Los Angeles in 2026. Lucas describes the building as a “temple to the people’s art.”

“This museum is dedicated to the idea that stories and mythology are extremely important to society in creating community,” Lucas told the crowd. “Art illustrates that story.

“It’s mythology,” he continued. “People believe it, and it binds them together with a common belief system. And what we’re doing here with the museum is to try to make people aware of the mythology that we live by. And at the same time, let them have an emotional experience looking at art.”

Lucas was joined by two Academy Award-winning filmmakers: director Guillermo del Toro and Lucasfilm’s senior vice president and executive design director Doug Chiang. Actor and artist Queen Latifah moderated the panel.

“What is amazing about this collection is that it will give you a step-by-step look at how a form of expression came to inform what we are today,” said del Toro, a Lucas Museum board member and longtime ILM collaborator on films including Pacific Rim (2013) and the upcoming Frankenstein (2025).

Director Guillermo del Toro (left) at ILM’s San Francisco studio during work on Pacific Rim (2013) with visual effects art director Alex Jaeger (center) and visual effects supervisor John Knoll (Credit: ILM & Greg Grusby).

The Lucas Museum’s renowned collection includes items from both the original and prequel trilogy eras of Star Wars, including filming miniatures created by the ILM Model Shop, concept art, creature maquettes, costumes, a full-scale version of Anakin Skywalker’s N-1 starfighter from Star Wars: The Phantom Menace (1999), speeder bikes from Star Wars: Return of the Jedi (1983), and Luke Skywalker’s X-34 landspeeder from Star Wars: A New Hope (1977).

The pieces will share exhibit space with an eclectic mix of visual art: paintings by artists including Norman Rockwell, Frida Kahlo, and Maxfield Parrish; original art created for Iron Man’s first comic cover in 1968; the first-ever 1934 Flash Gordon comic strip drawing; and Peanuts illustrations drawn by Charles M. Schulz.

“These are all very emotional pieces,” said del Toro. “This is celebrating things that speak to all of us, collectively or individually.”

Lucas says his art collecting began in college when he bought his first comic illustrations. His multifaceted collection today has grown to around 40,000 items.

“I’ve been doing this for 50 years now. And then it occurred to me: ‘What am I going to do with it all? Because I refuse to sell it,’” Lucas explained. “I said I could never do that. It’s just it’s not what I think art is. I think it’s more about an emotional connection with the work.”

Lucas’s first appearance at San Diego Comic-Con brings an association that began a long time ago full circle. In 1976 – ten months before his space fantasy adventure Star Wars hit theaters – a few dozen lucky attendees got a preview of the comic book adaptation of the film led by Marvel’s Roy Thomas and Howard Chaykin. They were joined by Charles Lippincott, Lucasfilm’s vice president of advertising, publicity, promotion, and merchandising. During a panel that didn’t start until 8 p.m. on Thursday, July 22, the trio also revealed a few still images from the upcoming movie to a room that had plenty of empty seats.

Visual effects art director Doug Chiang at work on Terminator 2: Judgment Day (1991) (Credit: ILM).

Chiang, a celebrated artist himself who first joined Lucasfilm as creative director at ILM in 1991, described growing up loving comic books at a time when comic book art didn’t get much respect.

“I think what’s remarkable about George is that he leads from the heart, and this museum is him. It’s his gift to help celebrate this,” said Chiang. “Narrative art is a way to educate kids and say, ‘It’s okay to draw your fantasy, draw things from your mind, embrace comic books.’ It shouldn’t be left out of art. What’s fantastic is that I think the museum will inspire the next Norman Rockwell or Frank Frazetta.”

Clayton Sandell is a Star Wars author and enthusiast, TV storyteller, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell) Bluesky (@claytonsandell.com) or X (@Clayton_Sandell).

Continuing a new series celebrating ILM’s 50-year legacy, featuring new interviews with ILM animation supervisors Rob Coleman, Mathieu Vig, and Stephen King.

By Jamie Benning

Ultraman and Nemi (Credit: Tsuburaya Productions & Netflix).

“ILM Evolutions” is an ILM.com exclusive series exploring a range of visual effects disciplines and examples from Industrial Light & Magic’s 50 years of innovative storytelling. Read part one of this story here.

After Rango (2011), ILM continued to focus on photoreal visual effects work, but the idea of returning to feature animation remained alive in the background. The ambition had been there for some time.

“Jim Morris [former ILM president] was always pushing for ILM to do more feature animation,” explains Rob Coleman, creative director and animation supervisor at ILM’s Sydney studio. “I remember going on senior staff retreats for years, and every year he brought it up that that was a goal for him.”

At one stage during the early 2000s, an animated Frankenstein film was in development, though it never reached production. Despite that momentum, feature animation remained secondary to ILM’s core live-action visual effects business.

When Disney acquired Lucasfilm in 2012, ILM found itself part of a larger family including not just Lucasfilm Animation, but also two giants of feature animation – Walt Disney Animation Studios and Pixar, the latter an outgrowth of a former Lucasfilm division. With such formidable in-house animation studios under the same corporate umbrella, the idea of ILM producing its own fully animated features inevitably became more complex. For the time being, ILM leaned into its core strength: pioneering visual effects work that has long been integral to live-action storytelling.

But then… “People weren’t shooting movies,” Coleman recalls. “The pandemic opened a door.” That led to renewed interest in feature animation from partner film studios. Soon, both Ultraman: Rising (2024) and Transformers One (2024) were underway.

A Return to Feature Animation with ‘Ultraman: Rising’

For decades, ILM had been at the forefront of visual-effects-based animation, but Ultraman: Rising marked a shift – embracing stylization while maintaining strong, character-driven storytelling.

Animation supervisor Mathieu Vig notes the challenge of moving from photorealistic creatures to a more expressive, feature animation style. “That was a very interesting challenge,” he tells ILM.com. “First of all, because many were eager to go back to feature animation. But a lot of people had never worked in feature animation, me included. So that was definitely a bit of a scary enterprise after all of these photoreal creatures and characters.”

Many of the animators came from big, effects-heavy projects and initially expected Ultraman to follow suit. “I think we were all expecting the movie to be about that. And we were ready for it. Then we realized it was not about that at all,” says Vig.

Meeting directors Shannon Tindle and John Aoshima helped align the team with the film’s more emotional and grounded tone. “They put me at ease very quickly,” notes Vig. “Because I realized how caring and how clear they were about what they wanted from me as an animation supervisor. They wanted to meet everybody. To talk to the team. They were both so clear and detailed. That way, we could focus on – does the animation feel true? Does it feel rehearsed or active?”

The directors emphasized performance-based animation first and foremost, even referencing unexpected inspirations like Kramer vs. Kramer (1979) to highlight the film’s emotional depth. “Despite the kaiju-sized spectacle, Ultraman: Rising wasn’t just about action,” Vig explains. “It was a story about family, identity, and connection. We wanted and needed to have believable characters, quite subtle acting. We wanted an interesting mix of something that looks stylized but at the same time has so much heart and groundedness. The animation reviews were always about character development. There was great trust on both sides.”

Ultraman does battle with Gigantron (Credit: Tsuburaya Productions & Netflix).

One of the defining aspects of the animation ethos is attention to imperfection – the small hesitations, twitches, and unplanned gestures that make performances feel real. “We always wanted to sneak in as much as possible. A little dirt, little accidents, a little hesitation when you grab something, scratching yourself when you’re confused,” Vig says. “Sometimes it was just a little bit too clean, a little bit too perfect. And we said, ‘Here you can add some very fine little moments where you can break the perfect choreography.’” Even quiet, dialogue-driven moments are given space to breathe.

“There’s one shot in particular that I really love,” he continues, “which is when Ken and Ami are talking in the restaurant and eating the curry. One-minute shots of Ken, explaining his life to Ami, and Ami listening. And again, nothing happens, but I remember seeing the first blocking of this shot. I was kind of mesmerized by how beautifully ‘nothing happening’ was done. Obviously, it’s not ‘nothing.’ There was a story behind it, but to make that moment engrossing and entertaining was quite something.”

This drive for grounded performance often meant starting from realism, then dialing it back into a stylized world. It became a creative muscle that benefited both the film and the artists.

“We always started with realistic acting and then tried to bring it back down to a feature animation, Ultraman style,” adds Vig. “If the whole team were a classically trained feature animation team, we would have probably worked in the opposite way. I think it’s a very good exercise, and it totally benefits us for future work in visual effects realism because we all went through this process of filtering the shot back to its essence, rather than saying, ‘I’m just going to fill it up with animation.’ We’ve been spoiled. I hope we can be spoiled again. Whether it’s robots, giant kaijus, whatever else, if you have these living, breathing characters, we can do them at ILM. And we’d all love to do more.”

Ultraman: Rising wasn’t just a return to feature animation for ILM – it was a chance to apply decades of performance-focused visual effects expertise to a new kind of storytelling, and to remind themselves, and audiences, what’s possible when stylization and sincerity meet on screen.

Building an Animated Cybertron: ‘Transformers One’

For Rob Coleman, Transformers One marked both a creative opportunity and a personal return. Having previously worked as animation director on Happy Feet Two (2011) and as head of animation at Animal Logic for The LEGO Movie (2014), he was drawn back to ILM by a renewed promise: that the studio would once again pursue full-length animated storytelling alongside its groundbreaking visual effects work. “ILM was going to be doing animated features as well as visual effects,” he explains to ILM.com. “That’s what enticed me back.”

Unlike the live-action Transformers films, which blended human characters with visual effects, Transformers One is set entirely on Cybertron. The film focuses on the emotional backstory of two iconic characters, in a world without any human frame of reference.

“Director Josh Cooley made it clear from the beginning – this wasn’t part of the Michael Bay universe,” Coleman said. “It was an origin story about two friends, basically brothers, who, because of life decisions, end up on very different paths.”

A group of Autobots (Credit: Paramount).

This character-driven approach brought performance to the forefront of the animation process. ILM animation supervisor Stephen King emphasizes the importance of expressing emotional depth without relying solely on dialogue. “It was essential to Josh that the subtlety and the nonverbal acting was just as important as what they were talking about in the dialogue,” King tells ILM.com. “In order for an audience to connect to an animated character, you have to bring them to life and make the audience believe that they’re thinking.”

That philosophy extended to every aspect of the film’s design and animation style. For Coleman, making the robots believable also meant starting with their inner life, not just their external mechanics. “It was key that the audience think they were looking at sentient robots,” he notes. “We always thought about the life spark inside – the character’s soul.”

To support this, ILM developed new tools and techniques. Their facial animation system was rebuilt from the ground up, allowing animators more expressive control while maintaining the precision required for robotic characters. “We really tried to get the facial performance to be as emotional and realistic as possible,” King says, “but then going, well, how can we make it robotic? We added these little robotic movements into the eyes and treated them like camera apertures and shutters.”

“By rebuilding the facial system, it gave animators a lot more freedom to move things around,” he adds. “Transformers One was all keyframe animated. For character performances, that’s where I want to be.”

Cooley’s background at Pixar helped shape the film’s animation language, particularly in its reliance on visual storytelling and expressive silence. “Very quickly we talked about non-verbal performances, the importance of eye animation, and his desire to play the whole third act, at least in test screenings, with no sound, completely in pantomime,” Coleman recalls. “I was like, yes, yes, and yes. Okay, you and I are going to get along just fine.”

The choice to exclude human characters offered an unexpected advantage: Without the need to establish scale or interaction with live-action actors, the animators were free to define their own physical rules for the world of Cybertron.

Optimus Prime (Credit: Paramount).

“Not having humans in our movie actually was a great plus for us,” King says. “The Transformers being 24 feet tall doesn’t mean anything to the characters, because that’s just how tall they are. That’s the world that they live in.”

To make the robotic characters feel nuanced and alive, the animation team relied heavily on physical reference. The animators themselves brought an additional layer of ownership to each shot.

“One of the great things about the movie is that all the reference was done by the animators themselves,” King explains. “Every shot, animators would act themselves or they’d get someone else to act out for them – and they would be able to put those performances into the character.”

Even the mechanics of transformation – an iconic feature of the franchise – were reimagined through the lens of character logic and day-to-day function. “It was important to the director that this is like breathing for them – this is part of their day-to-day life,” says King. “So, we don’t need a five-second transformation every time. It’s what’s efficient for them, like getting on with their day.”

The result is a film that struck a chord with both critics and fans. Reviewers praised Transformers One for its emotional depth, strong character focus, and thoughtful storytelling, a refreshing change of pace for the franchise. Audiences responded just as warmly, celebrating its mix of high-octane action, humor, and heart. It is a reminder that even in a universe of sentient robots and shifting metal, the most powerful transformations happen within.

The Future of ILM Animation

With Transformers One and Ultraman: Rising showcasing ILM’s renewed investment in feature animation, the studio is now well-positioned to explore new creative territory. “There’s great interest,” Coleman says. “We’re just waiting for the right projects to land and get green-lit, but there’s certainly an appetite.”

“What this year [2024] has done with Ultraman and Transformers has really put ILM at the forefront of people’s minds,” King adds. “They’re calling cards to creators to say, ‘We can do whatever you want.’”

For Vig, the excitement lies in ILM’s ability to blend visual effects expertise with expressive storytelling. “Whether these guys are robots, giant kaiju, or something else, at the heart, if you have well-rounded, breathing characters, we can do them. And we’d all love to do more of it.”

From stop-motion animated creatures to fully animated features, Industrial Light & Magic’s journey has been one of constant reinvention and evolution. With its expanding tool kit and growing focus on animated storytelling, the studio’s influence is set to shape the next era of animation and visual effects.

ILM’s legacy in animation is secure, built on decades of innovation, artistry, and risk-taking. But the next chapter in animated storytelling is already underway, evolving frame by frame.

Learn more about the creation of Ultraman: Rising and Transformers One on Lighter Darker: The ILM Podcast.

Read more about Ultraman: Rising on ILM.com.

Check out Transformers One concept art from the ILM Art Department on ILM.com.

Read more stories from our 50th anniversary series, “ILM Evolutions”:

ILM Evolutions: Animation, From Rotoscoping to ‘Rango’

ILM Evolutions: Pushing the Boundaries of Interactive Experiences

Jamie Benning is a filmmaker, author, and podcaster with a lifelong passion for sci-fi and fantasy cinema. He hosts The Filmumentaries Podcast, featuring twice-monthly interviews with behind-the-scenes artists. Visit Filmumentaries.com or find him on X (@jamieswb) and @filmumentaries on Threads, Instagram, and Facebook.

The beloved Disney character can now interact with fans in a whole new way.

By Patrick Doyle

When Lilo & Stitch (2025) returned to the spotlight with its recent live-action release, fans were treated to more than just a nostalgic trip to Hawaii; they got to interact with Stitch himself in real life.

In a groundbreaking collaboration between Industrial Light & Magic, the Walt Disney Studios, and Skywalker Sound, audiences around the world experienced a completely new way to interact with Stitch. Fans can now go behind the scenes of this unforgettable moment with a newly released making-of video that showcases the magic behind the real-time Stitch activation.

Making Magic in Real Time

“This wasn’t just about showcasing technology,” said Alyssa Finley, executive producer of the real-time Stitch experience at ILM. “It was about deepening the connection between fans and a character they love. Seeing people dance with Stitch and ask him questions live was pure joy.”

From the moment real-time Stitch hit Disney Studios’ TikTok and Instagram accounts, it was clear something special was happening. In the days leading up to the premiere, Stitch engaged with fans in real time, offering shoutouts, surprise cameos, and plenty of chaotic dance-offs that made waves across social media.

But the fun didn’t stop there. Stitch also made an appearance at the film’s press junket, chatting (yes, chatting!) with reporters from around the globe and generating viral clips that quickly spread online.

A Blue Carpet Experience to Remember

At the Lilo & Stitch premiere in Los Angeles, fans and celebrities alike had the chance to interact with Stitch live. Whether it was asking him questions, dancing together, or witnessing his signature mischief, the experience felt spontaneous, playful and, most importantly, real.

All of this was powered by ILM’s cutting-edge performance capture and real-time animation pipeline, seamlessly integrated with support from Disney Studios and the brilliant audio minds at Skywalker Sound.

“Stitch has always held a special place in the hearts of fans around the world,” said Jason Eskin, vice president of marketing at Disney Studios. “Seeing fans light up when Stitch talked to them in real life was a reminder of why we create these moments. It was truly Disney magic, made possible through ILM’s incredible innovation.”

Bring Stitch Home

Lilo & Stitch is now available on digital and will be released on Blu-ray on Aug. 26. Whether you’re revisiting the story or experiencing it for the first time, there’s never been a better moment to fall in love with Stitch all over again.

Patrick Doyle is a senior publicity manager at Industrial Light & Magic.

Teams from across ILM’s global studios are recognized for their innovative work for television this past year.

The 77th Primetime Emmy Award nominations were announced this week, and Industrial Light & Magic artists have earned three of them. 

Among the 14 nominations for the second season of Lucasfilm’s Andor series on Disney+, ILM and their collaborators have earned one for “Special Visual Effects in a Season or a Movie.” The nominees include production visual effects supervisor Mohen Leo, visual effects producer TJ Falls, special effects supervisor Luke Murphy, creature effects and droid supervisor Neal Scanlan, ILM visual effects supervisor Scott Pritchard, Hybride visual effects supervisor Joseph Kasparian, Scanline visual effects supervisor Sue Rowe, MidasVFX visual effects supervisor Paolo D’Arco, and digital colorist Jean-Clément Soret.

Alongside the Andor nomination for “Special Visual Effects in a Season or a Movie” is season two of The Lord of the Rings: The Rings of Power on Amazon Prime. The nominees among ILM artists and their partners include production visual effects supervisor Jason Smith, visual effects producer Tim Keene, visual effects producer Ann Podlozny, visual effects co-producer James Yeoman, ILM visual effects supervisor Daniele Bigi, DNEG visual effects supervisor Greg Butler, Rodeo FX visual effects supervisor Ara Khanikian, The Yard visual effects supervisor Laurens Ehrmann, and special effects supervisor Ryan Conder.

The Balrog in Rings of Power Season 2.

Earning a nomination for “Special Visual Effects in a Single Episode” is the premiere entry from season two of the Apple TV+ series Severance, “Hello, Ms. Cobel.” The nominated ILM artists and their collaborators include production visual effects supervisor Eric Leven, production visual effects producer Sean Findley, ILM visual effects associate supervisor Shawn Hillier, ILM visual effects associate supervisor Radost Ridlen, ILM environments lead Martin Kolejak, ILM producer Brian Holligan, ESE visual effects supervisor Alex Lemke, ESE visual effects supervisor Michael Huber, and on-set visual effects supervisor Djuna Wahlrab.

Congratulations to all of our ILM nominees!

The 77th Emmy Awards air on September 14, 2025 at 5 PM PT on CBS and Paramount+.

New series exploring ILM’s 50-year legacy kicks off with new interviews featuring original Star Wars animator Chris Casady and current ILM animation supervisors Rob Coleman and Hal Hickel.

By Jamie Benning

From left: animator Peter Kuran, production coordinator Rose Duignan, director George Lucas, animation and rotoscope supervisor Adam Beckett during production of Star Wars: A New Hope (1977) (Credit: ILM & Lucasfilm).

“ILM Evolutions” is an ILM.com exclusive series exploring a range of visual effects disciplines and highlights from Industrial Light & Magic’s first 50 years of innovative storytelling.

Animation has been woven into the DNA of Industrial Light & Magic’s story since its earliest days. From utilizing legacy techniques in Star Wars: A New Hope (1977) to the groundbreaking blend of live-action and animation in Who Framed Roger Rabbit (1988), ILM has continually redefined the possibilities of visual storytelling.

In this two-part article, we explore ILM’s journey from early work with rotoscoping, stop-motion, and go-motion to the development of sophisticated digital character animation in Jurassic Park (1993), the Star Wars prequel trilogy, and beyond. Part one focuses on the key innovations that culminated in Rango (2011), ILM’s first fully animated feature film. Part two examines how the studio expanded on these foundations in Transformers One and Ultraman: Rising (both 2024), solidifying its role as a leader not only in visual effects but also in feature animation.

Early Innovations and Handcrafted Beginnings

In 1975, as Star Wars, later retitled Star Wars: A New Hope, entered production, Industrial Light & Magic was a fledgling outfit assembled to help realize George Lucas’s ambitious vision. Animation quickly proved essential to the storytelling – Lucas’s needs were varied, including spaceship models firing laser bolts, glowing lightsaber blades, a holographic chess game, and stylized targeting displays.

To create the signature blaster bolts, California Institute of the Arts graduate Adam Beckett was hired in July 1975 to lead a small team in creating the animation and rotoscoping – including a young Peter Kuran. “I was initially shooting wedges and different colors for the laser beams and stuff like that. I was learning to use the equipment. We all were,” Kuran told The Filmumentaries Podcast.

“I actually did the first perspective beams,” said Kuran. “What was being tested was just kind of like back and forth – no perspective on it. I had suggested that we try that, and I actually got a very chilly response. So I decided to stay late one night and do a test and took it to the lab myself. It ran as a daily the next day, and [visual effects supervisor] John Dykstra liked it, so I wound up being the chief of that, at least for the time being.”

The iconic lightsaber effects were outsourced to Van Der Veer Photo Effects for the first film but later brought in-house at ILM. The process began by generating mattes from the live-action prop blades. Early experiments with retroreflective material and spinning poles proved too complex and were eventually streamlined. The mattes were rephotographed and colored frame by frame, with hues used to help audiences distinguish between each character’s weapon – blue for Obi-Wan Kenobi, red for Darth Vader – setting the look for the Star Wars saga for decades to come.

Lightsabers were created with hand-drawn animation in the original Star Wars trilogy, as seen here with Obi-Wan Kenobi (right, Alec Guinness) and Darth Vader (Bob Anderson/James Earl Jones) in Star Wars: A New Hope (Credit: ILM & Lucasfilm).

“At first, ILM didn’t have the resources to do all the opticals themselves,” animator Chris Casady tells ILM.com. “They sent shots out to Van Der Veer, Cinema Research, and Modern Film Effects. Those places were the old guard – they’d done work on Logan’s Run (1976), Soylent Green (1973), that kind of thing.

“But the goal was always to bring everything in-house,” Casady adds. “And once ILM got the optical department up and running in Van Nuys, the quality jumped. We had more control, and it just looked better.”

Beckett, as described by Casady, “was without a doubt a genius. Adam was extremely brilliant. He wanted to be able to put some of his psychedelic style into Star Wars. He thought it was almost an obligation to one-up 2001: A Space Odyssey [1968]. But Lucas wanted something more realistic.”

Casady noted Beckett’s work on the Death Star superlaser charging sequence, explaining that “Adam did a tremendous amount of work putting together that Death Star laser tunnel shot – all those rings and green things flashing down the middle. It’s built up of multiple passes, multiple exposures, multiple pieces of artwork.” The platform on which the live-action actors were standing was completely hand-drawn by Peter Kuran.

Casady added that “Adam’s signature work is the electrocution of R2-D2,” an entirely hand-drawn effect requiring precision to make the electricity feel convincing on screen.

“I really was brought in at a grunt level to make garbage mattes on the animation stand at night to free up the VistaVision cameras in the daytime,” Casady explained. “Every time they filmed the spaceship on stage … everything outside the blue is considered garbage; it’s got to be masked out. So, my job was to make this matte and block out the garbage.

“On film, my mattes fell below the threshold of black, so it became black,” Casady continues. “Famously, when the film was first released on VHS … my mattes were visible in the negative. … The audience saw my garbage mattes as irregular shapes that jumped every six or eight frames. So that’s the only time people got to see my work on the film!”

The animation team also solved another subtle but crucial challenge: making the miniature spaceship models feel more plausible in their scenes.

“There was a shot of a TIE Fighter flying past the camera, and they were concerned it looked too flat,” said Casady. “So they asked if we could paint in some reflections – highlights that would suggest the ship was catching light from the environment. It wasn’t baked into the model photography, so we had to add those glints manually, frame by frame, right onto the animation cels. Just little touches of light to make it feel like the ship belonged in that space.”

Animation and rotoscope supervisor Peter Kuran works with an animation camera during production of Star Wars: The Empire Strikes Back (1980) (Credit: Terry Chostner & ILM).

Kuran told The Filmumentaries Podcast, “I just thought that that was something that was needed.”

By the time Star Wars: Return of the Jedi (1983) came around, ILM was called on to create yet another iconic animated visual effect: Emperor Palpatine’s Force lightning. Composed of hand-drawn electrical arcs, the effect required animator Terry Windell to conjure a sense of living, dangerous energy – a visual shorthand for the raw power of the dark side. During his career, Windell brought his animation skills to Poltergeist (1982) and Ghostbusters (1984), among many others.

Though Peter Kuran had since left ILM, his company, Visual Concept Engineering, took on the painstaking task of rotoscoping each frame of the lightsaber combat between Luke and Vader. In total, 102 lightsaber shots were completed for the final film in the trilogy.

While rotoscoping and hand-drawn animation effects remained essential throughout the early 1980s, ILM was already looking ahead, seeking ways to evolve another time-honored technique: stop-motion animation.

As with the lightsabers and blaster bolts, the Emperor’s “Force lightning” in Star Wars: Return of the Jedi (1983) was also created with hand-drawn animation (Credit: ILM & Lucasfilm).

The Rise of Go-Motion

Before work began on Return of the Jedi, “Go-Motion” – a breakthrough in dimensional animation pioneered by ILM’s Dennis Muren, Phil Tippett, Stuart Ziff, and Tom St. Amand – offered a major refinement to traditional stop-motion by introducing motion blur, an effect crucial to achieving realistic movement. Unlike standard stop-motion, where models remain static during each frame’s exposure, go-motion employs stepper motors driven by a motion-control system. These motors subtly shift the puppet during the open-shutter phase, simulating the kind of motion blur found in live-action 24fps cinematography.
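The difference between the two exposure models described above can be sketched numerically. The toy Python below is illustrative only – it is not ILM’s motion-control software – and simply treats a frame’s exposure as the list of puppet positions sampled while the shutter is open: a stop-motion frame holds a single static position, while a go-motion frame integrates intermediate positions as the stepper motors move the puppet.

```python
# Toy illustration of why go-motion frames show motion blur while
# conventional stop-motion frames do not. A "frame" here is just the
# set of puppet positions sampled during the open-shutter phase.

def stop_motion_frame(position):
    """Puppet is static during the exposure: one sharp sample."""
    return [position]

def go_motion_frame(start, end, samples=8):
    """Stepper motors shift the puppet while the shutter is open,
    so the exposure integrates many intermediate positions."""
    step = (end - start) / samples
    return [start + step * i for i in range(samples + 1)]

# One frame of a puppet limb moving from x=0.0 to x=1.0:
sharp = stop_motion_frame(0.0)          # one sample: the staccato look
blurred = go_motion_frame(0.0, 1.0, 4)  # five samples blended into blur
```

Averaging the blurred samples into one image is what produces the streaked motion blur of live-action 24 fps photography, which is the effect the Dragonslayer team was after.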

“The significance is that we got it working,” Ziff told Cinefex, downplaying the complexity of a system that required months of development before the first usable shot could be captured.

First explored during production on Star Wars: The Empire Strikes Back (1980) and fully realized on Dragonslayer (1981), the process eliminated the telltale staccato of conventional stop-motion. 

Ziff’s engineering expertise led to the development of a modular rig dubbed the “Dragon Mover,” which connected to the model’s limbs via rods and enabled precise, repeatable motion sequences. Tippett, St. Amand, and Ken Ralston meticulously animated both walking and flying versions of the puppet, blending mechanical precision with handcrafted nuance.

“We started off with some of the more complicated shots,” Tippett told Cinefex, recalling the weeks spent programming movement cycles before finally achieving a fluid, natural gait. Tackling the hardest shots first meant the process grew easier as production continued – a testament to the artists’ dual role as animators and problem solvers. The result was a new level of fluidity and realism, particularly evident in the scenes featuring the film’s dragon, Vermithrax Pejorative.

The Vermithrax Pejorative in Dragonslayer (1981) (Credit: ILM & Paramount).

Blending Animation with Live-Action: A New Frontier

ILM’s reputation for innovation took a significant leap forward with Who Framed Roger Rabbit. Directed by Robert Zemeckis, the film demanded the seamless integration of hand-drawn, cel-animated characters with live-action performances and practical on-set effects. ILM’s task was to anchor the animated characters convincingly in the real world.

Visual effects supervisor Ken Ralston oversaw the technical and creative challenges of making cartoon characters interact believably with real environments. “The animation had to exist in a real world, with real lighting, perspective, and interaction. That had never been done before at this level,” Ralston told Cinefex.

“It was great for me because I am a huge fan of those early cartoons by the great Warner Brothers directors, Tex Avery and Chuck Jones. And when that showed up with the intent that Bob [Zemeckis] wanted for it, man, that was a match made in heaven. And it was brutal, but it was great at the same time. It keeps you going. And when you see results on something that’s finally coming together, it’s a blast,” Ralston explained to The Filmumentaries Podcast.

Marking a turning point in hybrid filmmaking, Zemeckis and ILM also decided to discard the traditional locked-off camera in favor of dynamic movement. To support this, ILM developed new methods to track live-action camera motion and translate it into data that animators could use to maintain consistent character positioning and perspective. “The opening camera crane shot proved to be historic. … No one had ever done a crane drop with a live-action camera and planted an animated character firmly on the ground,” Zemeckis recalled to Cinefex.

An animation cel from Who Framed Roger Rabbit (1988), created by the team supervised by Richard Williams. ILM was then responsible for compositing the animated characters with the live-action footage (Credit: ILM & Disney).

ILM and the special effects team constructed practical rigs to simulate interactions between live-action props and invisible cartoon characters. In one sequence, when Roger Rabbit turns a water faucet, a hidden mechanism releases a perfectly timed spray – a practical effect used to sell the interaction.

To match the shifting light within live-action environments, ILM tracked moving shadows and highlights, ensuring the animated characters were illuminated just like the actors. “If a light in the scene was swinging, … then the Toon characters would have to be lit in exactly the same way,” said Ralston. Animators relied on detailed lighting references to maintain visual consistency frame by frame.

Performance presented its own challenges. Bob Hoskins, cast as Eddie Valiant, was required to act opposite characters that weren’t physically present. “What I had to do was spend hours developing a technique to actually see, hallucinate, virtually to conjure these characters up,” he told Cinefex. To assist, Charles Fleischer, the voice of Roger Rabbit, wore a full Roger costume off-camera and delivered his lines live. “Although he was on the other side of the camera, I was able to talk to him as if he were right next to me. We could even ad-lib together,” Hoskins said.

After principal photography wrapped, ILM tackled the complex process of optical compositing while Richard Williams’s animation team in London produced the character animation. ILM integrated those elements into the live-action footage. “Every frame had to go through multiple passes to create tone mattes, shadow mattes, and interactive lighting effects. It wasn’t just a matter of drawing the character,” explained optical supervisor Edward Jones. “Every single frame had to be drawn, rechecked, and composited with multiple elements to make sure the animation fit seamlessly into the live-action,” Zemeckis recalled.

The result was a groundbreaking fusion of animation and visual effects that redefined the possibilities in cinematic storytelling. It was a winning combination of traditional techniques and innovation that was widely praised. The film won Best Visual Effects and a Special Achievement Award at the 1989 Academy Awards. Many saw the film as the zenith of the photochemical era, even to the extent that it was perceived as too complex to repeat.

In fact, it wasn’t until a decade later that ILM revisited this hybrid format with The Adventures of Rocky and Bullwinkle (2000), applying many of the same techniques with enhanced digital compositing tools to a new generation of animated characters.

Actor Bob Hoskins (Eddie Valiant) is suspended before a blue screen on ILM’s main stage. In this sequence, his character interacts with animated co-stars Bugs Bunny and Mickey Mouse (Credit: ILM).

When Dinosaurs Ruled the Visual Effects World

While go-motion had proven a valuable innovation throughout the 1980s, it was the advent of computer graphics (CG) character animation that truly revolutionized ILM’s approach in the 1990s. ILM laid the groundwork at the close of the 1980s on James Cameron’s The Abyss (1989), animating the fully CG pseudopod – a water-based, tentacle-like entity. For Cameron’s next film, Terminator 2: Judgment Day (1991), ILM once again raised the bar with the liquid metal T-1000.

It was the digital dinosaurs in Jurassic Park that marked a true turning point – not just in terms of spectacle, but as a clear signal that traditional methods like stop-motion and go-motion were being eclipsed by a new era of photorealistic CG. ILM animator Steve Williams, who had previously worked with Mark Dippé on The Abyss and Terminator 2, pushed the idea of fully computer-rendered dinosaurs further. The results were astonishing: Steven Spielberg’s action-horror hybrid delivered creatures that felt real – animals that moved and breathed, with skin that stretched and muscles that flexed.

As a veteran stop-motion animator, Phil Tippett famously quipped at the time: “I’ve just become extinct.” The line – part joke, part reality – captured the profound shift unfolding across visual effects departments, and it was ultimately given to the film’s Dr. Ian Malcolm, played by Jeff Goldblum.

A computer-graphics Brachiosaurus seen with live-action actors in the foreground in Jurassic Park (1993) (Credit: ILM & Universal).

By the time Jurassic Park hit screens, the industry had begun pivoting decisively toward digital techniques, a shift witnessed firsthand by animator Rob Coleman.

“There were only 6 animators at ILM for Jurassic Park,” he tells ILM.com. “It was the film that inspired me to cut my reel and send it in. … And I came in as ILM’s animator number 9 in October of ‘93 (4 months after the film’s release) when it was still very early days for computer graphics.”

To bridge the gap between stop-motion and computer animation, the team developed a hybrid technique known as the Dinosaur Input Device or D.I.D. This setup used a dinosaur armature fitted with sensors and encoders, allowing animators to physically manipulate the model while their movements were captured and translated into digital data. The goal was to combine the skill and experience of the traditional animators and strengths of the computer artists and technicians. While the results weren’t always ideal – much of the animation still had to be keyframed in the computer – it marked a pivotal step. The future of filmmaking was taking shape, frame by frame.
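The capture loop the D.I.D. implies – sample the encoders on the armature each frame, convert raw ticks to joint angles, and record the result as animation curves – can be sketched in miniature. Everything below is a hypothetical illustration of that idea, not the actual Jurassic Park software, and the joint names and tick resolution are invented for the example.

```python
# Sketch of the Dinosaur Input Device concept: rotary encoders on a
# physical armature are read once per frame, and the readings become
# keyframed rotation channels on the matching digital joints.
# (Hypothetical names/values; not ILM's production code.)

def encoder_to_degrees(ticks, ticks_per_rev=4096):
    """Convert a raw encoder reading into a joint angle in degrees."""
    return (ticks / ticks_per_rev) * 360.0

def capture_keyframes(encoder_samples):
    """encoder_samples: one {joint_name: ticks} dict per frame.
    Returns {joint_name: [angle per frame]} ready for keyframing."""
    channels = {}
    for frame in encoder_samples:
        for joint, ticks in frame.items():
            channels.setdefault(joint, []).append(encoder_to_degrees(ticks))
    return channels

# Two frames of an animator posing the armature by hand:
samples = [{"neck": 0, "tail": 1024}, {"neck": 512, "tail": 2048}]
curves = capture_keyframes(samples)
```

In practice the captured curves were only a starting point – as the article notes, much of the final motion still had to be keyframed directly in the computer.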

Animator Tom St. Amand (left) and lead animator Randy Dutra of the Tippett Studio pose with the Dinosaur Input Device (D.I.D.) used on Jurassic Park (Credit: ILM & Tippett Studio).

The Challenge of Digital Characters: The Star Wars Prequels

Following ILM’s work in the 1990s on films like The Flintstones (1994), Casper (1995), Forrest Gump (1994), and Jumanji (1995), George Lucas was getting ready to revisit the galaxy far, far away. This time, with a vision that demanded unprecedented integration of digital characters and live-action performances. The Star Wars prequels would become a proving ground for ILM’s rapidly expanding digital animation capabilities.

Leading that charge was Rob Coleman, by then an animation supervisor at ILM. He found himself tasked with something the company had never fully tackled before: nuanced, verbal performances from fully digital characters who needed to share the screen – and emotional space – with real actors.

“It was all those things, plus we didn’t have a staff that actually had spent their time learning how to do nuanced performances,” Coleman recalls. He would tell director Joe Johnston for Light & Magic Season 2 that it was Dragonheart (1996) that really laid the groundwork. “That was a huge leap for us. George was watching, and when he saw Dragonheart, he said, … ‘We are ready to go.’

Draco the dragon (voiced by Sean Connery) flies towards Bowen (Dennis Quaid) in Dragonheart (1996) (Credit: ILM & Universal).

“Most of the people at ILM had been flying spaceships and doing robots and maybe having dinosaurs smash around,” Coleman adds, “but they weren’t doing verbal performances where they were to hold their own with Natalie Portman and Liam Neeson and Ewan McGregor.” And to bring multiple CG characters like Jar Jar Binks, Watto, and Sebulba to life in Star Wars: The Phantom Menace (1999), Coleman had to shift the team’s mindset. His growing team of 65 animators needed to think less like technicians and more like performers.

“We videotaped our actors so we had what they were doing physically, and we could look at them speaking to work out the lip sync. But pretty early on in Phantom Menace, I knew that I wanted to get into the subtext, not just the text. What’s going on inside the heads of the characters. If we could achieve that, we were gonna have believable performances, and the audience would have a connection with Watto, Jar Jar, Sebulba, and Boss Nass in that first film.”

The next major test came with Star Wars: Attack of the Clones (2002) and the digital resurrection of a beloved character: Yoda. Unlike Jar Jar or Watto, Yoda had already been established in the original trilogy as a practical puppet, sculpted by Stuart Freeborn and brought to life by puppeteer Frank Oz. Coleman’s team needed to preserve that legacy while updating the character with a broader range of expression.

“I went back and looked at Empire and it was nothing like I remembered because I’d grown up. It had changed what we expected,” Coleman says. “So what I was trying to achieve is what I remembered Yoda doing in terms of expressiveness and honoring how Frank moved him. Frank actually came by ILM, held up his hand, showed me the position of his fingers inside Yoda’s head. I had him pantomime some Yoda with me so I could see what he was doing.”

To ensure authenticity, Coleman and his team rigorously tested Yoda’s new digital incarnation. He recalls the moment he shared the first test with George Lucas. “There is footage of me presenting the first digital Yoda on the From Puppets to Pixels [2002] documentary. That is the real footage of me doing that, even though I asked the documentary not to shoot it. I’m so happy they did. I was really nervous, and I presented three speaking shots and three non-speaking shots on purpose because I was trying to show them that we could maintain performance without the crutch of dialogue. That was a focused decision because I knew from watching countless movies and TV, editors routinely cut to their action shots – the non-verbal reaction shot. I wanted to earn one of those, and we did.”

Jar Jar Binks (right, Ahmed Best) performs opposite Queen Amidala (Natalie Portman) in Star Wars: The Phantom Menace (1999) (Credit: ILM & Lucasfilm).

That approach paid off. One of Yoda’s most effective digital moments came not during a battle or speech but in a quiet reaction. “There’s a shot of Yoda in Palpatine’s office where Palpatine says something, Yoda’s leaving, and he turns, and he looks over his shoulder, and you can tell he doesn’t trust him,” Coleman notes. “And that’s all in facial performance, all keyframe, frame-by-frame animation. It ended up on the movie poster.”

Coleman’s work continued into Star Wars: Revenge of the Sith (2005), by which point ILM had solidified its reputation as a pioneer in digital character animation. The scope of the prequel work, in retrospect, still feels enormous to the animation director. “I kind of got swept up in it all. Jim Morris [ILM’s general manager from 1993 to 2005] had put me forward for the role. Jim had taken me aside, and he said, ‘I think you’ve got the right temperament to work with George.’ So he sent me over … and dropped me off in London for a two-week interview with George Lucas, which I passed.”

Decades later, Coleman is reflective about the experience. Even as ILM continued to push forward in its ability to mimic life, it was paradoxically the artists themselves who felt like imposters. “Twenty-five years on, it’s kind of surreal to think back that I actually did that. I know that’s me. There are pictures of a younger me doing it. And I have all the memories, but sometimes it feels like it was someone else.”

Animation director Rob Coleman at work on The Phantom Menace (Credit: ILM).

Cursed Flesh and Living Tentacles: The Pirates Breakthrough

When Pirates of the Caribbean: The Curse of the Black Pearl (2003) set sail, ILM faced a major challenge. Bringing the cursed crew of the Black Pearl to life wasn’t just about creating convincing skeletons – it was about making them believable next to live-action characters.

Animation supervisor Hal Hickel explains to ILM.com that “it was a really complicated problem because the idea was that under moonlight these guys are skeletons, but in shadow, they’re flesh and blood.” Each shot became a complex blend of live-action photography and animation, requiring seamless transitions between the two. “You couldn’t just cut to them and show them in full skeletal form under neutral lighting,” he said. “It all had to be motivated by the lighting in the scene.”

The work paid off, but it was only the beginning. For the sequel, Pirates of the Caribbean: Dead Man’s Chest (2006), director Gore Verbinski raised the bar with Davy Jones and his crew. These characters were fully digital – and fully expected to carry the emotional weight of their scenes.

Speaking about Bill Nighy’s portrayal of Davy Jones, Hickel notes that “Bill gave such a brilliant performance. We didn’t want to lose any of the little stuff. The slight squint of an eye, the tiny sneer.” Rather than relying solely on motion capture, the team blended Nighy’s reference footage with keyframe animation, ensuring that none of his subtle acting choices were lost. “We wanted the tentacles to feel alive but they had to support the emotion in his face, not steal focus.”

Davy Jones (Bill Nighy) in Pirates of the Caribbean: Dead Man’s Chest (2006) (Credit: ILM & Disney).

Animating Davy Jones’s tentacle beard posed its own technical challenges. “It was a mix of hand animation and simulation,” Hickel explains. “We animated parts of it for performance reasons, but we also let physics take over for the secondary motion, so it didn’t look fake or overly choreographed.” This approach required close collaboration between animators, rigging artists, and the simulation team to keep everything feeling realistic and responsive.

The complexity of Davy Jones and his crew pushed ILM to overhaul their pipeline. “We had to rethink a lot of how we built and rendered these characters,” Hickel says. Advances made for Pirates laid the foundation for ILM’s later work on projects like Transformers (2007) and The Avengers (2012).

Beyond the technical achievements, Pirates also marked a shift in how digital characters were treated on screen. As Hickel puts it, “It wasn’t just about creating spectacle. Gore trusted us to handle real character beats with these CG characters. It was an amazing opportunity.” Through a mix of performance, artistry, and cutting-edge technology, ILM helped create one of cinema’s most memorable digital villains. They had steered animation into entirely new waters.

The Leap to Full-Length Animation: Rango

After working with Industrial Light & Magic on three Pirates of the Caribbean films, director Gore Verbinski approached the studio with an ambitious proposal: to produce a fully animated feature. He had been particularly impressed by ILM’s work on Davy Jones and believed the studio could bring that same level of sophistication to Rango – a surreal Western populated by anthropomorphic desert creatures.

“We approached Rango the way we approach live-action visual effects,” visual effects supervisor John Knoll told Cinefex, “building out environments with a cinematic mindset rather than adhering to the rigid, modular workflow of conventional animated features.”

A defining innovation was the film’s approach to lighting and cinematography. Renowned director of photography Roger Deakins consulted on the project, bringing principles of real-world filmmaking into the animated space. “We lit Rango the way we’d light a live-action film, with practical principles of cinematography in mind,” Deakins told Cinefex.

Rango‘s (2011) namesake, as voiced by Johnny Depp (Credit: ILM & Paramount).

ILM’s animation director, Hal Hickel, emphasized that they wanted the characters to inhabit their world with mass and texture. “We didn’t want our characters to feel overly polished or weightless,” he told Cinefex. “Gore wanted them to move with a slight awkwardness as if they truly existed in this dusty, unpredictable world.”

“He didn’t want to go head to head with Pixar or Disney or DreamWorks or Illumination. If they’re all over here, he wanted to go over there, aesthetically, in every way,” Hickel tells ILM.com. “Gore understood that the look of the film that he wanted to do was what we ended up calling ‘photographic.’ So not photoreal, but definitely not cartoony – the shot glass with whiskey in it, those kinds of things all had this patina of realism. So that seemed like a really good fit with us at ILM.”

Rather than using motion capture, Verbinski shot sessions with the actors performing together in a theatrical setting simply to inspire the animation. “It wasn’t about mapping motion one-to-one,” says Hickel. “It was about understanding the rhythm, the beats, the subtle mannerisms that would inform the final animated characters.” The result was a film that felt authored – visually distinct and emotionally resonant. For ILM, Rango marked another turning point.

“We knew this was an experiment,” said Knoll, “but we also knew it was an opportunity to redefine what ILM could do. Looking back, I think we did just that.”

Lead animator Maia Kayser at work on Rango (Credit: ILM).

Having left ILM before production on the film, Rob Coleman is still captivated by Rango. “It came about because John Knoll and Hal Hickel built a fantastic relationship with Gore Verbinski,” he says, “and they demonstrated to him through Pirates of the Caribbean that ILM had acting animators, and Gore is an actor’s director. They needed the right director with the right focus and the right mixture of talents and just bravado to say, ‘Yeah, we’re going to do this.’ And to hit ILM at the right time to make it, I think it’s still a marvel. I went back and watched it a couple years ago. It’s incredible what they did and what they achieved.”

“Every animator I know who worked on Rango had a ball and tells me continuously, ‘Gosh. Let’s get another Gore film going,’” says Hickel. “Yeah, they ate it up. He just really wanted people to feel like we were all filmmakers. You’re not the visual effects people up there, and I’m the filmmaker down here. We’re all filmmakers. We’re making this movie together.” That sense of collaboration is an ethos ILM has carried since its founding in 1975.

Rango went on to win the Academy Award for Best Animated Feature in 2012.

Follow ILM’s continued journey in animated feature filmmaking in part two of this installment of ILM Evolutions.

Read more stories from our 50th anniversary series, “ILM Evolutions”:

ILM Evolutions: Pushing the Boundaries of Interactive Experiences

Jamie Benning is a filmmaker, author, and podcaster with a lifelong passion for sci-fi and fantasy cinema. He hosts The Filmumentaries Podcast, featuring twice-monthly interviews with behind-the-scenes artists. Visit Filmumentaries.com or find him on X (@jamieswb) and @filmumentaries on Threads, Instagram, and Facebook.

The visual effects supervisor from ILM’s Vancouver studio shares insights about helping create new characters and bringing the streets of New York City to life.

By Mark Newbold

(Credit: ILM & Marvel).

Proudly displaying the most famous typographical symbol since George Lucas placed an acute accent over the “e” in Padmé Amidala, Thunderbolts* arrived in cinemas on May 2, 2025, to a fanfare of critical praise, bringing together a gaggle of questionably motivated heroes, including Florence Pugh as Yelena Belova, Sebastian Stan as Bucky Barnes, David Harbour as Alexei Shostakov, Wyatt Russell as John Walker, Hannah John-Kamen as Ava Starr, Lewis Pullman as Bob Reynolds, Olga Kurylenko as Antonia Dreykov, and Julia Louis-Dreyfus as Valentina Allegra de Fontaine. Director Jake Schreier (who also helmed Star Wars: Skeleton Crew’s fifth episode) led the effort to create an adventure that thrills, engages, delights, and amuses in equal measures.

Thunderbolts* is a story that not only details the rise of a motley crew of rogues into the heroes of Manhattan but also the war Bob Reynolds fights internally as he battles to free himself from his dark alter ego, Void, with the help of his newfound friends. Industrial Light & Magic’s visual effects supervisor Chad Wiebe (Captain America: Brave New World [2025], Obi-Wan Kenobi [2022], Thor: Ragnarok [2017]), joins ILM.com to discuss the challenges of not only bringing a tentpole release to the big screen but also creating striking new effects for fans of the Marvel Cinematic Universe (MCU).

“ILM’s work started back in May 2023,” Wiebe tells ILM.com, “when development work began with Jay Cooper (visual effects supervisor on Eternals [2021] and The Creator [2023]) and a small team of artists, primarily to develop the look of Void. Then the WGA [Writers Guild of America] strike happened, and production went on hiatus for a while. ILM’s involvement picked up again in February of 2024. That’s when I got involved.”

(Credit: ILM & Marvel).

There can be any number of elements that bring an experienced supervisor onto a show. Wiebe explains how appropriate skill sets, personal interests, and timing align when taking on a new show.

“The production visual effects supervisor Jake Morrison and I have worked together a number of times and it’s always been a very collaborative experience, so I jumped at the opportunity to work together again,” Wiebe says. “On top of that, Sentry is a powerful new character in the MCU with a strong comic book legacy, and I really enjoy developing ideas for new characters.

“This was a very different film to the ones you typically see within the MCU,” Wiebe continues. “Jake Schreier’s vision was that it had to be grounded and based on physicality, not magic and energy and all the things you typically associate with superhero films. He didn’t want it to feel like any movie that we’ve seen before, so that instantly attracted me.”

As with any Marvel project set within the five New York City boroughs of Manhattan, Brooklyn, Queens, The Bronx, and Staten Island, the city is essentially a character in its own right. Whether it’s Spider-Man in Queens, Captain America in Brooklyn, or the Baxter Building in Manhattan, each location must feel authentic. In Thunderbolts*, we return to the former Avengers Tower, which ILM had to place within real-world Manhattan.

“There are two ways to look at it,” explains Wiebe. “One aspect is the kind of data acquisition you need in order to make these very tangible environments look realistic as if you’re standing there yourself, and the other is augmenting it with some very iconic structures such as the Watchtower, which needs to sit seamlessly within that environment. It’s a tricky thing to do when it’s a city like New York that people are very familiar with. When you’re building locations and areas that have a real-world counterpart, you need to do your homework. You need to make sure you get all the reference material to make sure you’re depicting it in the most accurate way because people will instantly spot things if you’re trying to cheat or fudge the facts, and New York holds a very special place in people’s hearts, so doing it justice was very important.

“The fact that they based Avengers Tower around the MetLife Building in New York was a great starting point,” Wiebe continues. “The Avengers Tower we’ve seen in previous films retains the base of the MetLife building, but the departure that we took on Thunderbolts* was that we redesigned the base of the tower so it no longer utilized any of the MetLife Building. We use the same city block and footprint, but we replaced it from the ground up. Beyond Avengers Tower, we also had to build vast sections of the surrounding area. The key is in the details and making sure you collect enough reference material such as digital photography, aerial plates, LIDAR scans – the whole nine yards to get as much data as possible so we can build out this environment to be a true depiction of New York City.”


One of the most striking elements of Thunderbolts* is a new character in the MCU, Bob, and his dark alter ego, Void. Both thematically and visually, his soul-sucking powers are a powerful addition to the film, taking inspiration from both the comics and the film’s director, Jake Schreier. The task fell to the artists at ILM to bring these concepts to life.

“It was a unique challenge to visualize Void’s powers without leaning into anything too typical or too magical. The way Void turned people into shadowed silhouettes is a prime example. It needed to feel like a subtle but impactful event,” says Wiebe. “There were a surprising number of iterations that we went through to figure out that look. We spanned the full spectrum of ideas, going from something that felt like a single frame flash, to longer, drawn-out versions showing detailed shadows projected onto surfaces in a variety of different ways. We tried different aesthetics before we arrived at a quick but impactful effect that had a complexity to its simplicity, which also relied on the audio design to sell it as this somber but impactful moment.”

The process from concept to completion required numerous iterations and refinements.

“We started shadow dev with Jay Cooper all the way back in May of 2023, and that wasn’t too dissimilar from what we continued to do all the way up to the final months of the show,” Wiebe explains. “With a pivotal character such as Void, getting it to a 90% or 95% point of completion is the easy part, relatively speaking. It’s dialing those nuances in the last 5% or 10% that’s a very iterative and collaborative process.

“There were some key shots that went through dozens of iterations,” Wiebe continues. “How much of Lewis’s performance are we preserving? How much are we shrouding him in shadows? How much specularity do we want to retain from his costume? It’s a fine line. You want to ensure you’re staying true to the actor’s performance because it’s so well done, but you have this character that you also need to convey as a mysterious, shadowy void, so you want to add that mystery and aura surrounding him without going too far. There was a lot of back and forth to determine what that balance should be. Once you crack the code, then you’re good to start propagating that through your other shots, and then the dominoes fall a lot quicker. It’s an important part of the process that we need to go through to land on that final look.”

With Lewis Pullman’s performance at the heart of the sequence, Void required a mix of disciplines to bring the character to life.

“A lot of what you see of Void relied heavily on a 2D composite treatment, mixed with our CG asset when we needed to add specific details to certain areas… so it’s a hybrid approach,” notes Wiebe. “We utilized as much plate material of Lewis as we could. We also augmented it to get some of the details that you may not have had in the plate. If there are areas that we want to expose, say a little bit of costume detail or parts of his cheek where we want to bring out a bit of lighting information, we would utilize our digital asset to help with that. For some of the wider shots where he’s further away or doing things that you couldn’t necessarily do while filming, those would be our digital versions.”


Work on projects like Thunderbolts*, with bespoke visual effects crafted for specific characters and powers, can lead to processes that are useful in future projects, something Wiebe is grateful for.

“Every project adds new tools to your tool belt that you can take from show to show. That’s what you build on, and that’s what you can offer up as things that you’ve already tried and have experience with. I’ve done a number of Marvel films, and there’s always a carryover of techniques, setups, and lessons that you learn from doing things a certain way that you try to improve the next time.

“In regard to Sentry (before he turns into Void), we really don’t know everything that he’s capable of yet, and I don’t think he does either,” Wiebe continues, “so a big part of Thunderbolts* was him figuring out what he was able to do and learning the extent of his powers. One of the key moments in the film was when he said to Valentina that he doesn’t need to take orders from her; why would a god take orders from a human? There were a lot of conversations about Sentry’s level of confidence and his attitude when he started realizing that he’s got these incredible powers. There was some exploration about how confident he should feel. Jake didn’t want him to necessarily come across as overly confident in his powers because he’s learning them from scratch, but he also wanted to play into Bob’s character, too, where he wasn’t a very assertive person. He obviously fell on hard times before becoming Sentry, so navigating through that was a bit of a challenge for him.”

Thunderbolts* doesn’t just feature Sentry. There’s also a burgeoning team of would-be superheroes to contend with. “Obviously, here you’re putting him up against a number of characters that have their own unique powers, and there are other superheroes that he shares attributes with,” Wiebe explains. “That was a consideration in this film, making sure we don’t mimic what’s already been seen with other characters. Sentry has unlimited powers; he can do a bit of what most of the other superheroes can do, so making sure that we didn’t share too much space with other distinct effects was key.”

Creating visual effects requires intense attention to detail and the necessity of watching a scene again and again and again, being as granular as possible to get everything exactly where it needs to be. Given that, Wiebe notes that because he already knows the story, “it can make it a bit more difficult to sit back and enjoy a project that you worked on at the movie theater,” as he puts it. “But the beauty of Thunderbolts* is that everything was so seamless; I was able to let it visually play out without any moments of scrutiny or second-guessing the decisions we made. It was very rewarding to finally see it on the big screen in all its glory with the final audio in a theater full of people who were very excited to see it. There were people cheering and applauding at the end of the screening, which was super, super rewarding.”


The film is planned, shot, edited, the visual effects completed, the sound layered on, and the music scored, but looking back on Thunderbolts*, there’s a key scene that stands out: the fight in the former Avengers Tower between Sentry and the Thunderbolts, which plays out as one continuous shot.

“We shot the Penthouse fight in three sections and spent months doing previs to map out where the cameras needed to be and determine our capacity to shoot within a confined set environment,” Wiebe explains. “When Sentry ‘Force pushes’ Red Guardian through the window and back, it’s one continuous 45-second shot up until the moment he throws both Ghost and Walker out of frame. That was months and months of pre-production followed by months and months of post-production work. It really was a labor of love between a number of different departments within ILM and everyone who was on set making it happen. In terms of things that we’re the proudest of, that oner is definitely right up there and something we’re promoting in order to help pull back the curtain and let people see all the work that went into it.”

(Credit: ILM & Marvel).

Mark Newbold has contributed to Star Wars Insider magazine since 2006, is a 4-time Star Wars Celebration stage host, avid podcaster, and the Editor-in-Chief of FanthaTracks.com. Online since 1996. You can find this Hoopy frood online @Prefect_Timing.

Guided by visual effects supervisor John Knoll, ILM embraced continually evolving methodologies to craft breathtaking visual effects for the iconic space battles in First Contact and Rogue One.

By Jay Stobie

Visual effects supervisor John Knoll (right) confers with modelmakers Kim Smith and John Goodson with the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact (Credit: ILM).

Bolstered by visual effects from Industrial Light & Magic, Star Trek: First Contact (1996) and Rogue One: A Star Wars Story (2016) propelled their respective franchises to new heights. While Star Trek Generations (1994) welcomed Captain Jean-Luc Picard’s (Patrick Stewart) crew to the big screen, First Contact stood as the first Star Trek feature that did not focus on its original captain, the legendary James T. Kirk (William Shatner). Similarly, though Rogue One immediately preceded the events of Star Wars: A New Hope (1977), it was set apart from the episodic Star Wars films and launched an era of storytelling outside of the main Skywalker saga that has gone on to include Solo: A Star Wars Story (2018), The Mandalorian (2019-23), Andor (2022-25), Ahsoka (2023), The Acolyte (2024), and more.

The two films also shared a key ILM contributor, John Knoll, who served as visual effects supervisor on both projects, as well as an executive producer on Rogue One. Now ILM’s executive creative director and senior visual effects supervisor, Knoll – who also conceived the initial framework for Rogue One’s story – guided ILM as it brought its talents to bear on these sci-fi and fantasy epics. The work involved crafting two spectacular starship-packed space clashes – First Contact’s Battle of Sector 001 and Rogue One’s Battle of Scarif. Although these iconic installments were released roughly two decades apart, they represent a captivating case study of how ILM’s approach to visual effects has evolved over time. With this in mind, let’s examine the films’ unforgettable space battles through the lens of fascinating in-universe parallels and the ILM-produced fleets that face off near Earth and Scarif.

A final frame from the Battle of Scarif in Rogue One: A Star Wars Story (Credit: ILM & Lucasfilm).

A Context for Conflict

In First Contact, the United Federation of Planets – a 200-year-old interstellar government consisting of more than 150 member worlds – braces itself for an invasion by the Borg – an overwhelmingly powerful collective composed of cybernetic beings who devastate entire planets by assimilating their biological populations and technological innovations. The Borg only send a single vessel, a massive cube containing thousands of hive-minded drones and their queen, pushing the Federation’s Starfleet defenders to Earth’s doorstep. Conversely, in Rogue One, the Rebel Alliance – a fledgling coalition of freedom fighters – seeks to undermine and overthrow the stalwart Galactic Empire – a totalitarian regime preparing to tighten its grip on the galaxy by revealing a horrifying superweapon. A rebel team infiltrates a top-secret vault on Scarif in a bid to steal plans to that battle station, the dreaded Death Star, with hopes of exploiting a vulnerability in its design.

On the surface, the situations could not seem to be more disparate, particularly in terms of the Federation’s well-established prestige and the Rebel Alliance’s haphazardly organized factions. Yet, upon closer inspection, the spaceborne conflicts at Earth and Scarif are linked by a vital commonality. The threat posed by the Borg is well-known to the Federation, but the sudden intrusion upon their space takes its defenses by surprise. Starfleet assembles any vessel within range – including antiquated Oberth-class science ships – to intercept the Borg cube in the Typhon Sector, only to be forced back to Earth on the edge of defeat. The unsanctioned mission to Scarif with Jyn Erso (Felicity Jones) and Cassian Andor (Diego Luna) and the sudden need to take down the planet’s shield gate propel the Rebel Alliance fleet into rushing to the team’s rescue with everything from its flagship Profundity to GR-75 medium transports. Whether Federation or Rebel Alliance, these fleets gather in last-ditch efforts to oppose enemies who would embrace their eradication – the Battles of Sector 001 and Scarif are fights for survival.

From Physical to Digital

By the time Jonathan Frakes was selected to direct First Contact, Star Trek’s reliance on constructing traditional physical models (many of which were built by ILM) for its features was gradually giving way to innovative computer graphics (CG) models, resulting in the film’s use of both techniques. “If one of the ships was to be seen full-screen and at length,” associate visual effects supervisor George Murphy told Cinefex’s Kevin H. Martin, “we knew it would be done as a stage model. Ships that would be doing a lot of elaborate maneuvers in space battle scenes would be created digitally.” In fact, physical and CG versions of the U.S.S. Enterprise-E appear in the film, with the latter being harnessed in shots involving the vessel’s entry into a temporal vortex at the conclusion of the Battle of Sector 001.

Despite the technological leaps that ILM pioneered in the decades between First Contact and Rogue One, the studio still considered filming physical miniatures for certain ship-related shots in the latter film. The feature’s fleets were ultimately created digitally to allow for changes throughout post-production. “If it’s a photographed miniature element, it’s not possible to go back and make adjustments. So it’s the additional flexibility that comes with the computer graphics models that’s very attractive to many people,” John Knoll relayed to writer Jon Witmer at American Cinematographer’s TheASC.com.

However, Knoll aimed to develop computer graphics that retained the same high-quality details as their physical counterparts, leading ILM to employ a modern approach to a time-honored modelmaking tactic. “I also wanted to emulate the kit-bashing aesthetic that had been part of Star Wars from the very beginning, where a lot of mechanical detail had been added onto the ships by using little pieces from plastic model kits,” explained Knoll in his chat with TheASC.com. For Rogue One, ILM replicated the process by obtaining such kits, scanning their parts, building a computer graphics library, and applying the CG parts to digitally modeled ships. “I’m very happy to say it was super-successful,” concluded Knoll. “I think a lot of our digital models look like they are motion-control models.”

John Knoll (second from left) confers with Kim Smith and John Goodson with the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact (Credit: ILM).

Legendary Lineages

In First Contact, Captain Picard commanded a brand-new vessel, the Sovereign-class U.S.S. Enterprise-E, continuing the celebrated starship’s legacy in terms of its famous name and design aesthetic. Designed by John Eaves and developed into blueprints by Rick Sternbach, the Enterprise-E was built into a 10-foot physical model by ILM model project supervisor John Goodson and his shop’s talented team. ILM infused the ship with extraordinary detail, including viewports equipped with backlit set images from the craft’s predecessor, the U.S.S. Enterprise-D. For the vessel’s larger windows, namely those associated with the observation lounge and arboretum, ILM took a painstakingly practical approach to match the interiors shown with the real-world set pieces. “We filled that area of the model with tiny, micro-scale furniture,” Goodson informed Cinefex, “including tables and chairs.”

Rogue One’s rebel team initially traversed the galaxy in a U-wing transport/gunship, which, much like the Enterprise-E, was a unique vessel that nonetheless channeled a certain degree of inspiration from a classic design. Lucasfilm’s Doug Chiang, a co-production designer for Rogue One, referred to the U-wing as the film’s “Huey helicopter version of an X-wing” in the Designing Rogue One bonus featurette on Disney+ before revealing that, “Towards the end of the design cycle, we actually decided that maybe we should put in more X-wing features. And so we took the X-wing engines and literally mounted them onto the configuration that we had going.” Modeled by ILM digital artist Colie Wertz, the U-wing’s final computer graphics design subtly incorporated these X-wing influences to give the transport a distinctive feel without making the craft seem out of place within the rebel fleet.

While ILM’s work on the Enterprise-E’s viewports offered a compelling view toward the ship’s interior, a breakthrough LED setup for Rogue One permitted ILM to obtain realistic lighting on actors as they looked out from their ships and into the space around them. “All of our major spaceship cockpit scenes were done that way, with the gimbal in this giant horseshoe of LED panels we got from [equipment vendor] VER, and we prepared graphics that went on the screens,” John Knoll shared with American Cinematographer’s Benjamin B and Jon D. Witmer. Furthermore, in Disney+’s Rogue One: Digital Storytelling bonus featurette, visual effects producer Janet Lewin noted, “For the actors, I think, in the space battle cockpits, for them to be able to see what was happening in the battle brought a higher level of accuracy to their performance.”

The U.S.S. Enterprise-E in Star Trek: First Contact (Credit: Paramount).

Familiar Foes

To transport First Contact’s Borg invaders, John Goodson’s team at ILM resurrected the Borg cube design previously seen in Star Trek: The Next Generation (1987) and Star Trek: Deep Space Nine (1993), creating a nearly three-foot physical model to replace the one from the series. Art consultant and ILM veteran Bill George proposed that the cube’s seemingly straightforward layout be augmented with a complex network of photo-etched brass, a suggestion which produced a jagged surface and offered a visual that was both intricate and menacing. ILM also developed a two-foot motion-control model for a Borg sphere, a brand-new auxiliary vessel that emerged from the cube. “We vacuformed about 15 different patterns that conformed to this spherical curve and covered those with a lot of molded and cast pieces. Then we added tons of acid-etched brass over it, just like we had on the cube,” Goodson outlined to Cinefex’s Kevin H. Martin.

As for Rogue One’s villainous fleet, reproducing the original trilogy’s Death Star and Imperial Star Destroyers centered upon translating physical models into digital assets. Although ILM no longer possessed A New Hope’s three-foot Death Star shooting model, John Knoll recreated the station’s surface paneling by gathering archival images, and as he spelled out to writer Joe Fordham in Cinefex, “I pieced all the images together. I unwrapped them into texture space and projected them onto a sphere with a trench. By doing that with enough pictures, I got pretty complete coverage of the original model, and that became a template upon which to redraw very high-resolution texture maps. Every panel, every vertical striped line, I matched from a photograph. It was as accurate as it was possible to be as a reproduction of the original model.”

Knoll’s investigative eye continued to pay dividends when analyzing the three-foot and eight-foot Star Destroyer motion-control models, which had been built for A New Hope and Star Wars: The Empire Strikes Back (1980), respectively. “Our general mantra was, ‘Match your memory of it more than the reality,’ because sometimes you go look at the actual prop in the archive building or you look back at the actual shot from the movie, and you go, ‘Oh, I remember it being a little better than that,’” Knoll conveyed to TheASC.com. This philosophy motivated ILM to combine elements from those two physical models into a single digital design. “Generally, we copied the three-footer for details like the superstructure on the top of the bridge, but then we copied the internal lighting plan from the eight-footer,” Knoll explained. “And then the upper surface of the three-footer was relatively undetailed because there were no shots that saw it closely, so we took a lot of the high-detail upper surface from the eight-footer. So it’s this amalgam of the two models, but the goal was to try to make it look like you remember it from A New Hope.”

A final frame from Rogue One: A Star Wars Story (Credit: ILM & Lucasfilm).

Forming Up the Fleets

In addition to the U.S.S. Enterprise-E, the Battle of Sector 001 debuted numerous vessels representing four new Starfleet ship classes – the Akira, Steamrunner, Saber, and Norway – all designed by ILM visual effects art director Alex Jaeger. “Since we figured a lot of the background action in the space battle would be done with computer graphics ships that needed to be built from scratch anyway, I realized that there was no reason not to do some new designs,” John Knoll told American Cinematographer writer Ron Magid. Used in previous Star Trek projects, older physical models for the Oberth and Nebula classes were mixed into the fleet for good measure, though the vast majority of the armada originated as computer graphics.

Over at Scarif, ILM portrayed the Rebel Alliance forces with computer graphics models of fresh designs (the MC75 cruiser Profundity and U-wings), live-action versions of Star Wars Rebels’ VCX-100 light freighter Ghost and Hammerhead corvettes, and Star Wars staples (Nebulon-B frigates, X-wings, Y-wings, and more). These ships face off against two Imperial Star Destroyers and squadrons of TIE fighters, and – upon their late arrival to the battle – Darth Vader’s Star Destroyer and the Death Star. The Tantive IV, a CR90 corvette more popularly referred to as a blockade runner, made its own special cameo at the tail end of the fight. As Princess Leia Organa’s (Carrie Fisher and Ingvild Deila) personal ship, the Tantive IV received the Death Star plans and fled the scene, destined to be captured by Vader’s Star Destroyer at the beginning of A New Hope. And, while we’re on the subject of intricate starship maneuvers and space-based choreography…

Although the First Contact team could plan visual effects shots with animated storyboards, ILM supplied Gareth Edwards with a next-level virtual viewfinder that allowed the director to select his shots by immersing himself among Rogue One’s ships in real time. “What we wanted to do is give Gareth the opportunity to shoot his space battles and other all-digital scenes the same way he shoots his live-action. Then he could go in with this sort of virtual viewfinder and view the space battle going on, and figure out what the best angle was to shoot those ships from,” senior animation supervisor Hal Hickel described in the Rogue One: Digital Storytelling featurette. Hickel divulged that the sequence involving the dish array docking with the Death Star was an example of the “spontaneous discovery of great angles,” as the scene was never storyboarded or previsualized.

Visual effects supervisor John Knoll with director Gareth Edwards during production of Rogue One: A Star Wars Story (Credit: ILM & Lucasfilm).

Tough Little Ships

The Federation and Rebel Alliance each deployed “tough little ships” (an endearing description Commander William T. Riker [Jonathan Frakes] bestowed upon the U.S.S. Defiant in First Contact) in their respective conflicts, namely the U.S.S. Defiant from Deep Space Nine and the Tantive IV from A New Hope. VisionArt had already built a CG Defiant for the Deep Space Nine series, but ILM upgraded the model with images gathered from the ship’s three-foot physical model. A similar tactic was taken to bring the Tantive IV into the digital realm for Rogue One. “This was the Blockade Runner. This was the most accurate 1:1 reproduction we could possibly have made,” model supervisor Russell Paul declared to Cinefex’s Joe Fordham. “We did an extensive photo reference shoot and photogrammetry re-creation of the miniature. From there, we built it out as accurately as possible.” Speaking of sturdy ships, if you look very closely, you can spot a model of the Millennium Falcon flashing across the background as the U.S.S. Defiant makes an attack run on the Borg cube at the Battle of Sector 001!

Exploration and Hope

The in-universe ramifications that materialize from the Battles of Sector 001 and Scarif are monumental. The destruction of the Borg cube compels the Borg Queen to travel back in time in an attempt to vanquish Earth before the Federation can even be formed, but Captain Picard and the Enterprise-E foil the plot and end up helping their 21st century ancestors make “first contact” with another species, the logic-revering Vulcans. The post-Scarif benefits take longer to play out for the Rebel Alliance, but the theft of the Death Star plans eventually leads to the superweapon’s destruction. The Galactic Civil War is far from over, but Scarif is a significant step in the Alliance’s effort to overthrow the Empire.

The visual effects ILM provided for First Contact and Rogue One contributed significantly to the critical and commercial acclaim both pictures enjoyed, a victory reflecting the relentless dedication, tireless work ethic, and innovative spirit embodied by visual effects supervisor John Knoll and ILM’s entire staff. While being interviewed for The Making of Star Trek: First Contact, actor Patrick Stewart praised ILM’s invaluable influence, emphasizing, “ILM was with us, on this movie, almost every day on set. There is so much that they are involved in.” And, regardless of your personal preferences – phasers or lasers, photon torpedoes or proton torpedoes, warp speed or hyperspace – perhaps Industrial Light & Magic’s ability to infuse excitement into both franchises demonstrates that Star Trek and Star Wars encompass themes that are not competitive, but compatible. After all, what goes together better than exploration and hope?

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

ILM teams with Ben Stiller and Apple TV+ to bring thousands of seamless visual effects shots to the hit drama’s second season.

By Clayton Sandell

There are mysterious and important secrets to be uncovered in the second season of the wildly popular Apple TV+ series Severance (2022-present).

About 3,500 of them are hiding in plain sight.

That’s roughly the number of visual effects shots helping tell the Severance story over 10 gripping episodes in the latest season, a collaborative effort led by Industrial Light & Magic.

ILM’s Eric Leven served as the Severance season two production visual effects supervisor. We asked him to help pull back the curtain on some of the show’s impressive digital artistry that most viewers will probably never notice.

“This is the first show I’ve ever done where it’s nothing but invisible effects,” Leven tells ILM.com. “It’s a really different calculus because nobody talks about them. And if you’ve done them well, they are invisible to the naked eye.”

With so many season two shots to choose from, Leven helped us narrow down a list of his favorite visual effects sequences to five. (As a bonus, we’ll also dive into an iconic season finale shot featuring the Mr. Milchick-led marching band.)

Before we dig in, a word of caution. This article contains plot spoilers for Severance. (And in case you’re already wondering: No, the goats are not computer graphics.)

Severance tells the story of Mark Scout (Adam Scott), department chief of the secretive Severed Floor located in the basement level of Lumon Industries, a multinational biotech corporation. Mark S., as he’s known to his co-workers, heads up Macrodata Refinement (MDR), a department where employees help categorize numbers without knowing the true purpose of their work. 

Mark and his team – Helly R. (Britt Lower), Dylan G. (Zach Cherry), and Irving B. (John Turturro) – have all undergone a surgical procedure to “sever” their personal lives from their work lives. The chip embedded in their brains effectively creates two personalities that are sometimes at odds: an “Innie” during Lumon office hours and an “Outie” at home.


1. The Running Man (Episode 201: “Hello, Ms. Cobel”)

The season one finale ends on a major cliffhanger. Mark S. learns that his Outie’s wife, Gemma – believed killed in a car crash years ago – is actually alive somewhere inside the Lumon complex. Season two opens with Mark S. arriving at the Severed Floor in a desperate search for Gemma, who he only knows as her Innie persona, Ms. Casey.

The fast-paced sequence is designed to look like a single, two-minute shot. It begins with the camera making a series of rapid and elaborate moves around a frantic Mark S. as he steps out of the elevator, into the Severed Floor lobby, and begins running through the hallways.

“The nice thing about that sequence was that everyone knew it was going to be difficult and challenging,” Leven says, adding that executive producer and Episode 201 director, Ben Stiller, began by mapping out the hallway run with his team. Leven recommended a previsualization sequence – provided by The Third Floor – to help the filmmakers refine their plan before cameras rolled.

“While prevising it, we didn’t worry about how we would actually photograph anything. It was just, ‘These are the visuals we want to capture,’” Leven says. “‘What does it look like for this guy to run down this hallway for two minutes? We’ll figure out how to shoot it later.’”

The previs process helped determine how best to shoot the sequence, and also informed which parts of the soundstage set would have to be digitally replaced. The first shot was captured by a camera mounted on a Bolt X Cinebot motion-control arm provided by The Garage production company. The size of the motion-control setup, however, meant it could not fit in the confined space of an elevator or the existing hallways.

“We couldn’t actually shoot in the elevator,” Leven says. “The whole elevator section of the set was removed and was replaced with computer graphics [CG].” In addition to the elevator, ILM artists replaced portions of the floor, furniture, and an entire lobby wall, even adding a reflection of Adam Scott into the elevator doors.

As Scott begins running, he’s picked up by a second camera mounted on a more compact, stabilized gimbal that allows the operator to quickly run behind and sometimes in front of the actor as he darts down different hallways. ILM seamlessly combined the first two Mark S. plates in a 2D composite.

“Part of that is the magic of the artists at ILM who are doing that blend. But I have to give credit to Adam Scott because he ran the same way in both cameras without really being instructed,” says Leven. “Lucky for us, he led with the same foot. He used the same arm. I remember seeing it on the set, and I did a quick-and-dirty blend right there and thought, ‘Oh my gosh, this is going to work.’ So it was really nice.”

The action continues at a frenetic pace, ultimately combining ten different shots to complete the sequence.

“We didn’t want the very standard sleight of hand that you’ve seen a lot where you do a wipe across the white hallway,” Leven explains. “We tried to vary that as much as possible because we didn’t want to give away the gag. So, there are times when the camera will wipe across a hallway, and it’s not a computer graphics wipe. We’d hide the wipe somewhere else.”

A slightly more complicated illusion comes as the camera sweeps around Mark S. from back to front as he barrels down another long hallway. “There was no way to get the camera to spin around Mark while he is running because there’s physically not enough room for the camera there,” says Leven.

To capture the shot, Adam Scott ran on a treadmill placed on a green screen stage as the camera maneuvered around him. At that point, the entire hallway environment is made with computer graphics. Artists even added a few extra frames of the actor to help connect one shot to the next, selling the illusion of a single continuous take. “We painted in a bit of Adam Scott running around the corner. So if you freeze and look through it, you’ll see a bit of his heel. He never completely clears the frame,” Leven points out.

Leven says ILM also provided Ben Stiller with options when it came to digitally changing up the look of Lumon’s sterile hallways: sometimes adding extra doors, vents, or even switching door handles. “I think Ben was very excited about having this opportunity,” says Leven. “He had never had a complete, fully computer graphics version of these hallways before. And now he was able to do things that he was never able to do in season one.”

(Credit: Apple TV+).

2. Let it Snow (Episode 204: “Woe’s Hollow”)

The MDR team – Mark, Helly, Dylan, and Irving – unexpectedly find themselves in the snowy wilderness as part of a two-day Lumon Outdoor Retreat and Team-Building Occurrence, or ORTBO. 

Exterior scenes were shot on location at Minnewaska State Park Preserve in New York. Throughout the ORTBO sequence, ILM performed substantial environment enhancements, making trees and landscapes appear far snowier than they were during the shoot. “It’s really nice to get the actors out there in the cold and see their breath,” Leven says. “It just wasn’t snowy during the shoot. Nearly every exterior shot was either replaced or enhanced with snow.”

For a shot of Irving standing on a vast frozen lake, for example, virtually every element in the location plate – including an unfrozen lake, mountains, and trees behind actor John Turturro – was swapped out for a CG environment. Wide shots of a steep, rocky wall Irving must scale to reach his co-workers were also completely digital.

Eventually, the MDR team discovers a waterfall that marks their arrival at a place called Woe’s Hollow. The location – the state park’s real-life Awosting Falls – also got extensive winter upgrades from ILM, including much more snow covering the ground and trees, an ice-covered pond, and hundreds of icicles clinging to the rocky walls. “To make it fit in the world of Severance, there’s a ton of work that has to happen,” Leven tells ILM.com.

(Credit: Apple TV+).

3. Welcome to Lumon (Episode 202: “Goodbye, Mrs. Selvig” & Episode 203: “Who is Alive?”)

The historic Bell Labs office complex, now known as Bell Works in Holmdel Township, New Jersey, stands in as the fictional Lumon Industries headquarters building.

Exterior shots often underwent a significant digital metamorphosis, with artists transforming areas of green grass into snow-covered terrain, inserting a CG water tower, and rendering hundreds of 1980s-era cars to fill the parking lot.

“We’re always adding cars, we’re always adding snow. We’re changing, subtly, the shape and the layout of the design,” says Leven. “We’re seeing new angles that we’ve never seen before. On the roof of Lumon, for example, the air conditioning units are specifically designed and created with computer graphics.”

In real life, the complex is surrounded by dozens of houses, requiring the digital erasure of entire neighborhoods. “All of that is taken out,” Leven explains. “CG trees are put in, and new mountains are put in the background.”

Episodes 202 and 203 feature several night scenes shot from outside the building looking in. In one sequence, a camera drone flying outside captured a long tracking shot of Helena Eagan (Helly R.’s Outie) making her way down a glass-enclosed walkway. The building’s atrium can be seen behind her, complete with a massive wall sculpture depicting company founder Kier Eagan.

“We had to put the Kier sculpture in with the special lighting,” Leven reveals. “The entire atrium was computer graphics.” Artists completed the shot by adding CG reflections of the snowy parking lot to the side of the highly reflective building.

“We have to replace what’s in the reflections because the real reflection is a parking lot with no snow or a parking lot with no cars,” explains Leven. “We’re often replacing all kinds of stuff that you wouldn’t think would need to be replaced.”

Another nighttime scene shot from outside the building features Helena in a conference room overlooking the Lumon parking lot, which sits empty except for Mr. Milchick (Tramell Tillman) riding in on his motorcycle.

“The top story, where she is standing, was practical,” says Leven, noting the shot was also captured using a drone hovering outside the window. “The second story below her was all computer graphics. Everything other than the building is computer graphics. They did shoot a motorcycle on location, getting as much practical reference as possible, but then it had to be digitally replaced after the fact to make it work with the rest of the shot.”

(Credit: Apple TV+).

4. Time in Motion (Episode 207: “Chikhai Bardo”)

Episode seven reveals that MDR’s progress is being monitored by four doppelgänger-like observers in a control room one floor below, shown via a complex camera move that travels downward through a mass of data cables.

“They built an oversize cable run, and they shot with small probe lenses. Visual effects helped by blending several plates together,” explains Leven. “It was a collaboration between many different departments, which was really nice. Visual effects helped with stuff that just couldn’t be shot for real. For example, when the camera exits the thin holes of the metal grate at the bottom of the floor, that grate is computer graphics.”

The sequence continues with a sweeping motion-control time-lapse shot that travels around the control-room observers in a spiral pattern, a feat pulled off with an ingenious mix of technical innovation and old-school sleight of hand.

A previs sequence from The Third Floor laid out the camera move, but because the Bolt arm motion-control rig could only travel on a straight track and cover roughly one-quarter of the required distance, The Garage came up with a way to break the shot into multiple passes. The passes would later be stitched together into one seemingly uninterrupted movement.

The symmetrical set design – including the four identical workstations – helped complete the illusion, along with a clever solution that kept the four actors in the correct position relative to the camera.

“The camera would basically get to the end of the track,” Leven explains. “Then everybody would switch positions 90 degrees. Everyone would get out of their chairs and move. The camera would go back to one, and it would look like one continuous move around in a circle because the room is perfectly symmetrical, and everything in it is perfectly symmetrical. We were able to move the actors, and it looks like the camera was going all the way around the room.”

The final motion-control move switches from time-lapse back to real time as the camera passes by a workstation and reveals Mr. Drummond (Ólafur Darri Ólafsson) and Dr. Mauer (Robby Benson) standing behind it. Leven notes that each pass was completed with just one take.

5. Mark vs. Mark (Episode 210: “Cold Harbor”)

The Severance season two finale begins with an increasingly tense conversation between Innie Mark and Outie Mark, as the two personas use a handheld video camera to send recorded messages back and forth. Their encounter takes place at night in a Lumon birthing cabin equipped with a severance threshold that allows Mark S. to become Mark Scout each time he steps outside and onto the balcony.

The cabin set was built on a soundstage at York Studios in the Bronx, New York. The balcony section consisted of the snowy floor, two chairs, and a railing, all surrounded by a blue screen background. Everything else was up to ILM to create.

“It was nice to have Ben’s trust that we could just do it,” Leven remembers. “He said, ‘Hey, you’re just going to make this look great, right?’ We said, ‘Yeah, no problem.’”

Artists filled in the scene with CG water, mountains, and moonlight to match the on-set lighting – and, of course, more snow. As Mark Scout steps onto the balcony, the camera pulls back to a wide shot, revealing the cabin’s full exterior. “They built a part of the exterior of the set. But everything other than the windows, even the railing, was digitally replaced,” Leven says.


Bonus: Marching Band Magic (Episode 210: “Cold Harbor”)

Finally, our bonus visual effects shot appears roughly halfway through the season finale. To celebrate Mark S. completing the Cold Harbor file, Mr. Milchick orders up a marching band from Lumon’s Choreography and Merriment department. Band members pour into MDR, but Leven says roughly 15 to 20 shots required adding a few more digital duplicates. “They wanted it to look like MDR was filled with band members. And for several of the shots there were holes in there. It just didn’t feel full enough,” he says.

In a shot featuring a God’s-eye view of MDR, band members hold dozens of white cards above their heads, forming a giant illustration of a smiling Mark S. with text that reads “100%.”

“For the top shot, we had to find a different stage because the MDR ceiling is only about eight feet tall,” recalls Leven. “And Ben really pushed to have it done practically, which I think was the right call because you’ve already got the band members, you’ve made the costumes, you’ve got the instruments. Let’s find a place to shoot it.”

To get the high shot, the production team set up on an empty soundstage, placing signature MDR-green carpet on the floor. A simple foam core mock-up of the team’s desks occupied the center of the frame, with the finished CG versions added later.

Even without the constraints of the practical MDR walls and ceiling, the camera could only get high enough to capture about 30 band members in the shot. So the scene was digitally expanded, with artists adding more green carpet, CG walls, and about 50 more band members.

“We painted in new band members, extracting what we could from the practical plate,” Leven says. “We moved them around; we added more, just to make it look as full as Ben wanted.” Every single white card in the shot, Leven points out, is completely digital.

(Credit: Apple TV+).

A Mysterious and Important Collaboration

With fans now fiercely debating the many twists and turns of Severance season two, Leven is quick to credit ILM’s two main visual effects collaborators: east side effects and Mango FX INC, as well as ILM studios and artists around the globe, including San Francisco, Vancouver, Singapore, Sydney, and Mumbai.

Leven also believes Severance ultimately benefited from a successful creative partnership between ILM and Ben Stiller.

“This one clicked so well, and it really made a difference on the show,” Leven says. “I think we both had the same sort of visual shorthand in terms of what we wanted things to look like. One of the things I love about working with Ben is that he’s obviously grounded in reality. He wants to shoot as much stuff real as possible, but then sometimes there’s a shot that will either come to him late or he just knows is impractical to shoot. And he knows that ILM can deliver it.”

Hear more about Severance from Eric Leven and production designer Jeremy Hindle on Lighter Darker: The ILM Podcast.

Clayton Sandell is a Star Wars author and enthusiast, TV storyteller, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell), Bluesky (@claytonsandell.com), or X (@Clayton_Sandell).

On this day in 1975, Industrial Light & Magic was officially signed into existence by George Lucas.

By Lucas O. Seastrom

ILM’s original crew for Star Wars: A New Hope (1977) poses in the front lot of their original studio (Credit: ILM & Lucasfilm).

50 years ago today on May 28, 1975, George Lucas signed a legal certificate issuing his formal shares of stock ownership in a new company: Industrial Light & Magic. It’s likely the founder affixed his signature without pomp or ceremony. There was too much to do. ILM, as it would come to be known for short, had less than two years to build a visual effects studio from scratch and create nearly 400 shots in a new space fantasy film called Star Wars.

By that time in late May, Lucas had hired John Dykstra to supervise the film’s visual effects. The director had an audacious vision for creating dynamic images of spaceships dogfighting with one another. Lucas wanted the camera to move with the ships, as if the camera operators were up there to capture the action by hand. The idea broke many of the traditional rules of visual effects, which typically required locked-off cameras so that separate elements could be carefully blended together.

Visual effects supervisor John Dykstra poses on the stage next to a TIE fighter miniature (Credit: ILM & Lucasfilm).

John Dykstra was practically the only effects artist in Hollywood willing to buy into Lucas’s plans on the existing terms. He’d gained experience with the type of equipment that would be needed to realize the elaborate shots of custom-built miniatures. Dykstra was also a free thinker with a sense of adventure. There were only a handful of effects companies still operating, and none at a major studio. Most balked at the proposal, decrying its limited budget, tight schedule, and seemingly unattainable goals. So Dykstra was tasked with establishing a new operation.

Lucas was a Northern Californian and planned to base the editorial side of post-production near his San Francisco Bay Area home. He wanted to do the same for visual effects. Dykstra argued otherwise, deciding to keep the new facility in Southern California, where he had access to a network of talent and close proximity to third-party film processing labs. So it was at some point in late May that Dykstra located and then leased a warehouse in Van Nuys, one of a number of towns sprawled across the San Fernando Valley, a ways north of Hollywood proper and conveniently removed from the overbearing presence of the established studios.

Located in an industrial park on Valjean Avenue, just a block from the south end of the Van Nuys Airport, ILM rented a building for $2,300 a month from owner Bill Hanna. It was two stories, made largely of stacked cinder blocks, with a large asphalt lot in front. Inside were a handful of unfurnished offices and open warehouse space with high ceilings ideal for hanging lights. Early on, Dykstra would drive his motorcycle through the building, leaving skid marks on the floor. It was often oppressively hot, even more so once the tungsten film lights were switched on. Dykstra initially planned to construct a pool onsite, but later compromised with a cold tub that could hold multiple people.

The exterior of ILM’s original studio in Van Nuys, CA. An explosion on the surface of the Death Star is photographed in the foreground (Credit: ILM & Lucasfilm).

“It just popped into my head,” Lucas would recall about the name “Industrial Light & Magic.” “We were sitting in an industrial park and using light to create magic. That’s what they were going to do.”

Initially, Dykstra worked out of Lucasfilm’s offices in a bungalow on the Universal Studios lot, a few minutes’ drive from Van Nuys. Soon he’d moved to Valjean, working off the floor before furniture was acquired. He was busy recruiting. By early June, modelmakers Grant McCune and Bill and Jamie Shourt were hired, as were production manager Bob Shepherd, technician Jerry Greenwood, first cameraman Richard Edlund, electronics designer Al Miller, and machinists Richard Alexander and Don Trumbull.

As former Lucasfilm executive editor J.W. Rinzler would note in The Making of Star Wars, “They all knew one another and had worked together before.” They’d worked on feature films with Douglas Trumbull (son of Don), or on commercials and other projects with Robert Abel and Associates. A later group would come from another commercial house, Cascade Pictures. Others came straight from universities where they’d studied everything from animation to industrial design. They brought with them aspects of the culture and methodology from these other places, together making something new and unique.


Before anything else could happen, the Valjean warehouse needed to be converted into production space and workshops. Over six weeks into the summer, they first taped out sections and then constructed the designated areas themselves. On the first floor would be the optical and rotoscope departments, a model shop, machine shop, wood shop, two shooting stages in the rear, and production offices in the front. Upstairs would be home to the animation department, editorial, a screening room, and the art department.

By July, optical composite photography supervisor Robert Blalack and animation and rotoscope supervisor Adam Beckett had been hired, as had a sound recordist and designer who would use ILM’s space as a sometime home base, Ben Burtt. By early August, artist Joe Johnston was setting up the art department (concept artists Colin Cantwell and Ralph McQuarrie had started much earlier, but each worked from home). Within a few months, a dozen people were on board, many of them attracted to join the project out of admiration for George Lucas, whose American Graffiti (1973) had made waves upon its release two years before.


The spaces were ready by mid-summer, but ILM’s work had only just begun. It would take them nearly a year to successfully design and construct an entire visual effects facility and workflow, including miniatures, motion-control camera systems, optical printers, animation cameras, and blue screens. “There’s a significant difference between coming up with a good idea and executing it,” Dykstra would say. ILM’s initial budget was roughly $1.2 million. Although time was of the essence to build the various equipment, distributor 20th Century Fox was slow to provide any initial funds ahead of the main shoot, which would commence in the spring of 1976. So for much of its first year, ILM operated on George Lucas’s personal finances, thanks to the momentous commercial success of American Graffiti.

Former ILM general manager Thomas G. Smith would explain in his 1986 book, Industrial Light & Magic: The Art of Special Effects, how “Outside, it looked like all the other industrial-style buildings in the valley. Inside, it was staffed with very young technicians, some barely out of college, few over 30, some even under 20 years old…. The doors at ILM were open 24 hours a day; technicians and artists worked without regard to time clocks or job classifications. They were children of the ’60s, and many rebelled against authority figures and traditional work rules. There were no dress codes and no specified work hours; designers built models, and modelmakers ran cameras. But there was a strong esprit de corps and feeling of purpose in the building…. The involvement was with the cause rather than with the money; somehow the group felt they were a part of something really important.”


What this group was about to accomplish in less than two years was anything but certain that late spring of 1975. If anything, it was “a long shot,” as Dykstra himself would admit. “It was very, very hard to say specifically what was and what wasn’t going to work before we built it,” he told Cinefantastique in 1977. “So we just had to take a shot at it and all I could do was bluff it and say, ‘Oh yeah, everything’s gonna be fine!’”

As would become the defining element of ILM’s success and endurance, it was the people who made all the difference. “It would be very hard to do Star Wars just by setting up an independent facility unless you had the personnel to do it,” Dykstra said. “The people who designed the equipment and constructed it made it all happen. Not only was it independent of studios but the people who were doing it are the best people in the industry right now.”

What began quietly with a handful of people in a hot, mostly empty warehouse would ultimately do the impossible, not just in the sense of its accomplishments on screen or the resulting accolades, but in its ability to grow, adapt, and continue innovating time and again. That story continues today at the company’s studios around the world. Though ILM has long since outgrown its original warehouse, it still attracts the same intrepid, curious people who bring their passion for image-making and problem-solving to multiple art forms.

Watch ILM’s new celebratory reel in honor of the company’s 50th anniversary:

Lucas O. Seastrom is the editor of ILM.com and a contributing writer and historian for Lucasfilm.

Read more on the ILM.com Newsroom.

Watch Light & Magic on Disney+.

The second, and final, part of an extensive look behind the scenes of the visual effects production for Lucasfilm’s pirate-themed Star Wars adventure series.

By Clayton Sandell

If you missed part one of our deep dive into Star Wars: Skeleton Crew, read it here on ILM.com.

(Credit: ILM & Lucasfilm)

The Observatory Moon

Still searching for At Attin’s coordinates, Jod (Jude Law) and the kids land the Onyx Cinder on the Observatory Moon, seeking help from an alien, owl-like astronomer named Kh’ymm (voiced by Alia Shawkat). The group treks from the ship to the observatory, a striking sequence that includes visuals of the characters silhouetted against a night sky dominated by a nearby planet.

The scenes were all captured in camera on the StageCraft volume, with the actors walking across a practically built dirt mound and the background displayed on the LED screens. “That was another one of our more successful volume shoots,” ILM visual effects supervisor Eddie Pasquarello says. “Perfect use of that, in my opinion.”

The volume also helped create the illusion of the observatory center rotating within the outer walls.

“That one was the most technically challenging,” says ILM virtual production supervisor Chris Balog. “We had to figure out multiple ways of tracking the camera to make sure that the wall was moving in conjunction with it. For some shots they had a circular dolly moving around the set. So we had to make sure that the wall was moving correctly too.”

The volume was used in 1,565 shots in all, Balog says, and 900 of those shots were in-camera finals.

Like Neel (Robert Timothy Smith), Kh’ymm was also realized using a combination of digital and practical techniques, depending on the scene. In some shots, a practical puppet created by Legacy Effects captured her performance entirely in camera. In other scenes, ILM collaborator Important Looking Pirates created a full computer graphics head composited on top of the puppeteered body or utilized a fully digital replica carefully animated to closely match the movements of the puppet.

The episode concludes with the arrival of a pair of familiar New Republic ships summoned by Kh’ymm. “Of course, we see our first X-wings,” Pasquarello smiles. “That was right in our wheelhouse and fun for everyone to do.”

(Credit: ILM & Lucasfilm)

Can’t Say I Remember No At Attin

The Onyx Cinder arrives at a world that initially looks a lot like At Attin but is actually the conflict-battered sister planet At Achrann, a place where children are trained as soldiers in a war between the Troik and Hattan clans. The kids hike through the decayed remains of a neighborhood and city that once looked like their own. Live-action scenes were shot with minimal sets against blue screen backgrounds and completed with extensive environments created by ILM partner DNEG, including dilapidated buildings and streets, a fully digital armored assault tank, and a small herd of horned eopie creatures.

The heroes are challenged by a Troik warlord named General Strix (Mathieu Kassovitz) to prove their strength in battle. In exchange, Strix’s daughter Hayna (Hala Finley) takes them to an abandoned tower that may have the coordinates they need to get home. Inside the tower – another set that utilized the StageCraft volume – SM-33 (voiced by Nick Frost) reveals that his previous captain ordered him to destroy the coordinates to At Attin. Fern attempts to override his memory, triggering a hostile response and transformation sequence that required significant digital work by ILM’s Sydney studio.

“Whenever SM-33 goes into attack mode, he’s more CG versus the puppeteered, less-docile version of him,” Pasquarello explains. “When he has those armored plates on, or whenever he grows, that’s all CG.”

The abandoned tower set utilized a mix of 3D elements and backgrounds in the volume along with practical columns, floor, and set dressing.

“I thought it had a really amazing photographic feel to it,” says Balog. “Some of the biggest challenges are blending the volume with the real set. And that’s why the virtual art department is such a key factor, because they have to work hand-in-hand with the set department and the 3D content to make sure the textures on everything look the same.”

“ILM had a really great content team led by [visual effects associate supervisor] Dan Lobl, creating content that is believable and looks real,” Balog says. “We’re not successful unless they’re successful.”

The setting also provides subtle foreshadowing of events that unfold inside the At Attin Supervisor’s Tower in episode eight. “The environment was unique and custom,” Pasquarello explains. “There’s a deliberate tilt up to the ceiling, and you can see some cables hanging. Those are the remnants of their Supervisor, who’s been totally gutted and ripped out. I think it’ll be fun for people to watch the series again, and they’ll understand.”

(Credit: ILM & Lucasfilm)

Lanupa’s Luxury and Peril

Next stop is a mountain on the planet Lanupa, the site of an old pirate lair that SM-33 believes contains At Attin’s coordinates. It’s also the site of a lavish hotel and spa occupied by high-end patrons, including a Hutt who swallows a Troglof mud bath attendant, and a massive, tentacled creature called Cthallops – both achieved digitally with the help of Important Looking Pirates.

Jod is captured by the pirate horde and sentenced to death. He’s allowed a few remaining minutes for a final appeal, measured by an hourglass filled with churning blue plasma. “It wasn’t a fully fleshed-out idea on set. We knew we needed an hourglass, and we would be doing it, but it was just kind of a fun adventure to figure out,” Pasquarello says. “We were trying to do some fun ideas with how the plasma would show the passage of time.”

Successfully navigating a series of booby traps, the children, Jod, and SM-33 enter the subterranean treasure lair of pirate captain Tak Rennod, another set that relied heavily on the StageCraft volume.

“They built the big skull throne that the pirate king sat on,” says Balog. “They had all the treasure in the room, four big columns, and the stairs and the rock when they walked in. Everything else in the cave was created with the volume.”

After finally discovering At Attin’s secret, as well as its location, Jod betrays the children, who escape the lair by sliding down a series of tunnels to the base of the mountain. As Wim (Ravi Cabot-Conyers), Neel, Fern (Ryan Kiera Armstrong), and KB (Kyriana Kratter) figure out how to get back to the Onyx Cinder, they encounter a cast of curious trash crabs.

“They’re not droids,” explains Pasquarello. “They’re literally crabs with garbage on their backs. And that was a lot of work to make that understandable. They’re not synthetic. It’s one of those sequences that is very rich in detail, and there’s a lot going on.”

While the baby crabs are digital, a massive mama crab was created as a detailed stop-motion puppet by Tippett Studio, the production company founded by original Star Wars animator and creature designer Phil Tippett. The beast’s jagged, rusty, junk-laden look prompted the Tippett crew to nickname it “Tet’niss.”

(Credit: ILM & Lucasfilm)

“We generally did the rough blocking of the shots at ILM first,” production visual effects supervisor John Knoll explains. “We figured out what the shots wanted to be, the pace, and how big the creature was going to be. Once we got all those layouts approved, it went to Tippett’s, including all the camera info so they could figure out where the camera was positioned relative to the set and the puppet.”

A low-resolution, untextured 3D model of the mama crab also helped animators work out the creature’s speed and movement in advance of shooting on the stop-motion stage.

“Since stop-motion is very labor intensive, you don’t want to have to go back and reshoot things,” Knoll says. “So we got approval on their preliminary animation, and then they would go in and do the detailed stop-motion. And that was a particularly complicated character because there are so many moving parts on it. Obviously, there are the eight legs, but then there are all kinds of little pieces on it that bounce and move when it starts to walk. I’m impressed that they were able to keep that all straight in their heads.”

The mama crab puppet weighed in at about 15 pounds, requiring support from a mechanical harness that was digitally erased in post-production.

(Credit: ILM & Lucasfilm)

Onyx Cinder Metamorphosis

The kids reach the Onyx Cinder just as an enormous scrapper barge closes in, threatening to pulverize the ship and ingest the remains into its fiery maw. “There’s sort of a tug-of-war between the ship and this garbage muncher,” Knoll explains. When the ship is snagged by one of the muncher’s claw-like arms, Fern decides their only hope for an escape is by triggering the emergency hull demolition sequencer.

A series of rapid explosions ripple down the hull, causing the Onyx Cinder to shed its worn outer shell. A smaller, silver-colored version of the ship is freed and rises out of the debris. “Our code was the ‘ironclad’ and the ‘sleek ship,’” Pasquarello says of the two Onyx Cinder variants. “We went around a lot with the shedding of the hull. We didn’t want it to all blow off and just be conveniently revealed. It had to come off like a snake’s skin.

“And the effects are just dialed up to 11,” continues Pasquarello, who hopes that fans notice a key storytelling detail of the ship’s metamorphosis. “One cool thing that I don’t think everybody knows is that when you transition between the ships, we don’t share all the same engines, but the engines that we do share between the ships change from a warm color to blue.

“One of our challenges was that the sleek Onyx Cinder is a cleaner-burning ship,” Pasquarello says. “The whole conceit was that the engines were that orange color because they were dirty and running bad oil. We kept debating: ‘When would it turn blue?’ The sequence is a very elegant transition shot where you see it sputtering away all of that oil and dirt to the cleaner burning blue that we got.”

Knoll says the transformation was one of the more “complicated” scenes to pull off. “There are a lot of simulation layers that are in there, and the sleek ship doesn’t actually fit inside the armored hull, so there was some sleight of hand that had to happen to make that appear to work,” he explains.

The end result is one of Pasquarello’s favorite sequences. “Every time I watch it, I still get chills,” he says. “It just speaks to the detail that the creators had about this show. They thought of everything. [Jon] Watts was very clear with us that this is why this is happening. And we just had to figure out how to execute that.”

(Credit: ILM & Lucasfilm)

The Return to At Attin

At Attin’s coordinates in hand, Wim, Fern, KB, and Neel arrive at their home planet aboard the transformed Onyx Cinder. A horde of pirates led by Captain Brutus (portrayed by Fred Tatasciore and performance artist Stephen Oyoung) is not far behind. But the pirates are stopped by the planet’s protective, nebula-like barrier. “Going through the barrier for us was a really big endeavor,” says Pasquarello. “It’s something that started early because it’s so effects-driven and heavy and large scale, and there’s a lot of story to be told in there.”

An array of satellites protects At Attin, blasting deadly arcs of lightning toward unauthorized ships. SM-33 reveals the Onyx Cinder is an At Attin vessel, which allows it to pass safely. The design and function of the satellites – crafted by ILM’s digital modeling department – evolved over time, says Pasquarello. “At one point, the satellites were actually emitting atmosphere. There were versions where you could literally see atmosphere coming out of them to create that cloudy environment,” he explains.

Pirate ships pursue the Onyx Cinder through a toxic swirl of greenish-blue gasses but are destroyed by the satellites. “There’s a lot of heavy, heavy sims [simulations] and work that went into that sequence, and then the landing on At Attin,” Pasquarello says, giving credit to ILM’s compositing and effects teams.

One element featured in the return to At Attin came along late in the production process. With shot delivery deadlines approaching faster than a ship in hyperspace, John Knoll got an email from Jon Watts. “He said, ‘We’ve done animatronic creatures, we’ve done rubber monsters, we’ve done stop-motion creatures. We did miniature and motion control. The only thing we haven’t really done from the old days is a traditionally-painted matte painting. Is it too late to do one?’” Knoll recalls.

With only two months to make it happen, Knoll reached out to former ILM artist Jett Green at her home in Hawaii and asked if she’d like to put her brushes to work creating a traditional oil matte painting of At Attin. Using paint instead of pixels to compose a matte image is something ILM hadn’t done in about 30 years, according to Knoll.

Green – with a long list of credits as a traditional matte painter on films including Indiana Jones and the Temple of Doom (1984), Labyrinth (1986), and Willow (1988) – says she was honored to be asked.

“I love being part of this history,” Green tells ILM.com. “John and I had this conversation about it being a planet. He had the references already, and he told me what he needed. I even built the Masonite panel for it, and it just felt really good.” Knoll now has the roughly six-by-two-foot painting displayed in his ILM office.

At Attin matte painting created by Jett Green (Credit: ILM & Lucasfilm)

Another ILM veteran, modelmaker Bill George, is also credited on Skeleton Crew. George first worked for ILM building models for Star Trek II: The Wrath of Khan (1982). For fun, he once built a mashup of two similar ship designs: the concept for Han Solo’s original “pirate ship” from Star Wars: A New Hope (1977), and the Eagle from the sci-fi series Space: 1999 (1975-1977). He called it the Millennium Eagle.

“Somebody at Lucasfilm saw it,” George says. “I got a request saying, ‘Hey, can you bring that model in? We want to scan it.’”

The computer graphics version of George’s Millennium Eagle model now appears among the ships docked at Port Borgo.

It’s not the first time one of George’s homemade models ended up in a galaxy far, far away. During production of Star Wars: Return of the Jedi (1983), ILM was in desperate need of a new Y-wing model. George offered to bring in the one he’d built years earlier. It was so good, it ended up being used in the film.

Posing as an emissary from the New Republic, Jod gains access to At Attin’s bountiful treasure: 1,139 subterranean, credit-filled vaults. The vault is an entirely digital environment that DNEG populated with security droids, industrial robotic arms, and a seemingly endless supply of golden computer graphics credits that line the walls and spill into Jod’s rapacious hands.

Jod, Fern, and her mother, Fara (Kerry Condon), ascend the Supervisor’s Tower. The Supervisor is revealed to be a large, domed droid with a red eye. Only a small part of the Supervisor droid was constructed physically, with the StageCraft volume completing the illusion.

“Virtual production is the future of visual effects,” says Chris Balog, a 20-year ILM veteran with a background as a digital compositing artist. “It’s where the next evolution is going. And if you can do it successfully, it’s an amazing tool.”

Jod destroys the Supervisor with a lightsaber, triggering a citywide power outage and disabling the barrier satellites, clearing the way for the massive pirate frigate to reach the planet’s surface. 

The enormous frigate survives the barrier and floats ominously over the city. “The great effects work done with the frigate coming through the clouds was Travis Harkleroad, our effects supervisor,” Pasquarello says. “Those explosions all come from him and his people.”

The all-computer graphics frigate’s arrival is meant to evoke the alien-arrival feeling of films like Close Encounters of the Third Kind (1977) and Independence Day (1996). “There was no practical frigate,” Pasquarello says. “It’s a gorgeous ship. It’s a very complex-looking ship, and there’s a lower and upper deck that was built inside, and ships and skiffs that come out of that.”

Wim, Neel, and KB devise a two-part plan to rescue Fern and call Kh’ymm for help. Jumping on speeder bikes and pursued by skiffs loaded with angry pirates, the kids – along with Wim’s dad, Wendel (Tunde Adebimpe) – make their way across the city.

For close-up shots, the actors were shot on a blue screen stage, with the more dangerous action – like a perilous jump across a canyon – requiring the use of digital doubles. “The speeder bikes on this show were a real challenge in the sense that we can’t put kids into a lot of heavy stunt work,” says Pasquarello. “So there was a lot of work done to help the dynamics and the physics of that chase.”

The action continues through an all-computer graphics forest, through the city, to the school. Pasquarello praises ILM’s animation, layout, simulation, and environments teams for the extensive 3D build. “They’re going through an entirely CG environment, created by the environments team that you just don’t question,” he says. “Not one thing that they fly over or go through is real.”

Summoned by Kh’ymm, New Republic forces arrive at At Attin, attacking the pirate frigate and saving the day. The squadron of X-wings is backed up by B-wings, another fan-favorite fighter that first appeared in Star Wars: Return of the Jedi (1983) and later in Star Wars: Rebels (2014-2018).

“The B-wings were a favorite of mine as a kid, so I did my best to try to get them featured in some big, heroic moments,” says ILM animation supervisor Shawn Kelly. “Initially, we had them dropping bombs on the pirate ship, but [Lucasfilm chief creative officer] Dave Filoni had the great suggestion to try the ‘composite laser’ weapon. Honestly, I had no idea what that was at first. As soon as the meeting was over, I looked it up and realized it’s the ridiculously cool quadruple-beam attack seen in Rebels. I got so excited by the idea that I stayed up late and designed a new shot that could really show off that attack. I felt like I was 10 years old again, playing with my B-wing toy in the backyard!”

Balog would composite the B-wing shot himself, working in collaboration with the FX team to evoke the feeling of the laser as seen in Rebels, but with a more realistic style appropriate for live-action.

The battle-wounded pirate frigate makes a spectacular crash landing, a completely computer graphics sequence that Pasquarello says was carefully designed to depict minimal casualties. “The conceit is that everyone’s been rounded up to a specific space, so we know that everybody evacuated,” he explains. “You notice it doesn’t really tear into buildings as the frigate crashes; it’s just pulling up the street and abandoned cars. It crashes gently into the waterway.”

(Credit: ILM & Lucasfilm)

Galactic Global Effort

Bringing Skeleton Crew to life with its creative mix of old and new took a tremendous amount of effort from artists around the globe. “I worked with a team of 50 animators that were in San Francisco, Vancouver, Singapore, Sydney, and Mumbai,” says Pasquarello. “A big team. We’re one big happy family; we’re all working together to bring these characters to life.”

Knoll and veteran ILM modelmaker John Goodson say they feel lucky to still be bringing old-school ILM effects expertise to new productions. “You know, there’s only a few of us that still know how to do this stuff,” says Knoll. “And part of this for me was, I want to bring some younger people who are exposed to what we’re doing, who are trained up to use the gear so that when I’m not available to do this stuff there are people who know how to do it.”

“We both came here because we wanted to shoot spaceship models,” Goodson adds. “And we’re still getting this opportunity. It’s a phenomenal experience to be able to do this, to take advantage of some of the newer technologies, and revisit this stuff from our past, which is the reason we got in the business to begin with.”

For Shawn Kelly, a 28-year ILM veteran, working on Skeleton Crew was a career highlight. “Our review sessions on this project were by far the most enjoyable, fun, and collaborative,” he says. “Watts and Ford are awesome. They have tons of great ideas. They’re really collaborative and open to ideas. It felt like a family just trying to make the best thing we can make all together.”

(Credit: ILM & Lucasfilm)

Clayton Sandell is a Star Wars author and enthusiast, TV storyteller, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell) Bluesky (@claytonsandell.com) or X (@Clayton_Sandell).

Muren celebrates ILM’s 50th anniversary at the place where it all began.

By Clayton Sandell

Dennis Muren was there at the beginning.

As one of the first employees of George Lucas’s fledgling visual effects company, Industrial Light & Magic, Muren spent many long and intense hours working inside a nondescript industrial building in Van Nuys, California, helping bring the director’s Star Wars vision to life.

Through their groundbreaking effects work on Star Wars: A New Hope (1977), Muren and his colleagues pioneered modern filmmaking from this former warehouse on Valjean Avenue, not far from a noisy airport.

“It was a long time ago, but I remember everybody. All the people and making the film and the excitement of it not being a Hollywood movie – not a home movie – but it was a big movie,” Muren tells ILM.com. “And we were all on the same team working to get it done.”

Dennis Muren at work on Star Wars: A New Hope at ILM’s original studio in Van Nuys, California (Credit: ILM & Lucasfilm).

For the first time in about 50 years, the nine-time Academy Award winner recently came back for a tour of ILM’s original home to help celebrate the company’s golden anniversary. A lot has changed.

“There’s a wall here I don’t even remember being there, dividing the two parts,” Muren says, pointing as he looks around the warehouse floor, a few steps from where the ILM model shop was set up back in the day.

The floor plan of the building today – now home to a commercial sign company – is roughly the same as it was in 1975. As Muren walks the halls with his wife Zara, one second-floor room in particular brings back a galaxy of emotions.

“That’s very memorable. Going back to the screening room,” Muren says. “It just brought back a flood of memories of the dailies. George would often come to the dailies, and he’d be looking at the shots over and over and deciding what’s going to work and what needs to be redone.”

Muren is also reminded of the stress that faced the ILM crew as they rushed to finish the visual effects shots on time.

“‘Are we going to get the show done on time?’” he remembers being a frequent worry. “We’d go over the storyboards there too, and the schedule was on the wall of the dailies room. We would say, ‘Look, we’ve got to get these shots this week or else we’re in trouble.’”

The interior of ILM’s original facility, now a commercial sign company (Credit: Clayton Sandell).

After his tour, Muren signs autographs and poses for pictures with fans who gather to sing “Happy Birthday” to ILM. He blows out candles on a special Darth Vader cake before introducing a screening of A New Hope for an audience seated in the same parking lot where some of the film’s most iconic shots were filmed.

“Right here, [ILM modelmaker] Steve Gawley’s pickup truck would race by as fast as it could go, with [miniature and optical effects cameraman] Richard Edlund on the back of it with a VistaVision camera shooting the [surface of the] Death Star as pyro was blowing up,” Muren tells the crowd.

“That was a typical day,” he smiles.

Muren attended the celebration over Star Wars Day weekend as a guest of the event organizers, My Valley Pass and On Location with Jared Cowan, a podcast hosted by movie location expert Jared Cowan.

Dennis Muren greets fans in the original ILM facility’s parking lot, where some effects shots were created (Credit: Clayton Sandell).

For more on ILM’s early history and the creative geniuses who changed moviemaking forever, check out Light & Magic, a two-season, nine-part documentary series now streaming on Disney+.

Listen to Dennis Muren’s interview from On Location with Jared Cowan.

Clayton Sandell is a Star Wars author and enthusiast, TV storyteller, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell) Bluesky (@claytonsandell.com) or X (@Clayton_Sandell).

By Patrick Doyle

Fans from all over the world gathered in Japan from April 18-20 for Star Wars Celebration 2025. In honor of this monumental event, ILM and Meta shared a special first look at their upcoming virtual and mixed reality experience Star Wars: Beyond Victory – A Mixed Reality Playset, currently in development for Meta Quest headsets.

Additionally, demos for award-winning titles Vader Immortal: A Star Wars VR Series and Star Wars: Tales from the Galaxy’s Edge were available at the booth.

Here’s a recap of everything that went down at the ILM & Meta booth.

The Booth: Step Inside the Star Wars Galaxy

The ILM & Meta booth was a sensory playground inside the Makuhari Messe convention center. Massive screens played cinematic trailers, customized demo pods housed players during their sessions, and photo walls for each title were available for fans to snap a pic after going through the experience of their choice. With wait times of up to three hours, the booth also offered unique fixtures, like an interactive button wall, while a crew of knowledgeable staff was on hand to answer questions and help fans prepare for the adventures ahead.

Creative director Jose Perez III and executive producer Alyssa Finley.

To help set the tone, key representatives from the ILM team behind Star Wars: Beyond Victory – creative director Jose Perez III and executive producer Alyssa Finley – were onsite to guide fans through the experiences and talk about the creative inspirations behind each title.

The Experiences: Three Unique Star Wars Adventures

The ILM & Meta booth brought three unique experiences to Celebration, each telling its own story within the Star Wars galaxy:

Connect with new and beloved Star Wars characters in a thrilling and creative experience. Through virtual and mixed reality, fans will get to adventure, race, and play in three modes. In development now.

A three-part series that combines immersive cinematic storytelling with dramatic interactive play. Explore the world of Darth Vader and complete your journey to determine Mustafar’s fate. Available now.

Experience action in the Batuu wilds with Star Wars: Tales from the Galaxy’s Edge and the Last Call add-on. Fight alongside classic characters like R2-D2 and C-3PO and take on unexpected alliances and deadly enemies. Available now.

The Comic: An Original Story Written by Ethan Sacks

As a special bonus to the fans at Celebration, an exclusive Marvel comic was available at the booth. Written by Ethan Sacks, it tells an original story about Volo Bolus before the events of Star Wars: Beyond Victory.

Sacks and interior illustrators Steven Cummings & Shogo Aoki also stopped by the booth to meet fans and sign copies of the comic during the show.

Beyond the Booth

During Celebration, Alyssa and Jose were honored to go up on the Star Wars Celebration LIVE! Stage to talk about Star Wars: Beyond Victory, the ILM and Meta booth and, of course, to throw some t-shirts out to the audience. 

The crew was also able to attend several incredible panels during the show including Light & Magic Season 2, Fifty Years of Magic: Celebrating the Legacy of Industrial Light & Magic, and Lucasfilm Publishing: Stories from a Galaxy Far, Far Away…

The Force was Strong with this Booth

With long lines, enthusiastic crowds and countless fans coming out thrilled/terrified to have seen Darth Vader up close or Sebulba atop a podracer after all these years, it’s clear that the virtual and mixed reality mediums offer completely new ways for fans to experience Star Wars storytelling.

Whether you were honing your lightsaber skills in the dojo, tossing some repulsor darts at Seezelslak’s Cantina or testing your wits as a podracer, this booth offered exactly what we came to Celebration to do – showcase a different way to experience the galaxy we all love.

We’ll have more information to share on Star Wars: Beyond Victory – A Mixed Reality Playset at a later date and, in honor of May the 4th, Vader Immortal: A Star Wars VR Series and Star Wars: Tales from the Galaxy’s Edge are on sale for 66% off from now until 11:59PM PT on Monday, May 5 at the Meta Store and from Friday, May 2 until 11:59PM PT on Monday, May 5 on the PlayStation Store.

We can’t thank all the fans enough for making this a Celebration to remember and we hope to see you all in Los Angeles in 2027!

Wishlist Star Wars: Beyond Victory – A Mixed Reality Playset now, and watch the ILM.com Newsroom for the latest updates. Visit ILM.com/Immersive to learn more.

Light & Magic Season 2 is streaming now on Disney+.

New apparel and a tumbler celebrating the 50th anniversary of Industrial Light & Magic are now available on Amazon.com.

Patrick Doyle is a senior publicity manager at Industrial Light & Magic.

The Lord of the Rings: The Rings of Power wins for Special, Visual & Graphic Effects in Season 2 of the Amazon MGM Studios series.

This past weekend, the British Academy of Film and Television Arts hosted the 2025 BAFTA Television Craft Awards, where The Lord of the Rings: The Rings of Power won for Special, Visual & Graphic Effects. ILM’s Jason Smith, who served as production visual effects supervisor, received the award alongside his collaborators Richard Bain, Ryan Conder, and Chris Rodgers. Watch their acceptance speech below:

ILM teams in London, Sydney, and the former studio in Singapore delivered over 500 visual effects shots to Rings of Power Season 2. Hubbed in London, the effort was led by ILM visual effects supervisor Daniele Bigi, visual effects producer Christine Lemon, and visual effects executive producer Lee Briggs. 

Congratulations to Jason and our ILM crew! 

Read more about ILM’s work on Rings of Power Season 2 right here on ILM.com.

Clothing and accessories featuring a new commemorative logo designed by Hoodzpah are available for purchase on Amazon.com.

By Mark Newbold

Actor Sam Witwer sports a new ILM 50th t-shirt at Star Wars Celebration Japan (Credit: Wes Ellis).

In a world of innovation, skill, and ingenuity, no company has shone as brightly or lasted as long in its field as Industrial Light & Magic. First incorporated in May 1975, ILM has led the way in the realm of visual effects for half a century. This iconic brand is as much a marque of quality as “Music by John Williams,” “Conceptual Design by Ralph McQuarrie,” or “Directed by George Lucas.”

To celebrate the 50th anniversary (a first for any visual effects company), Hoodzpah — the team behind the ILM logo redesigns in 2023 — were asked to adapt their work for a fresh new ILM 50th logo, which is featured in a line of new merchandise recently unveiled at Star Wars Celebration Japan and now available on Amazon.com.

ILM.com had the opportunity to chat with the team about this exciting new project and how they decided on the tone for the 50th anniversary logo.

“When you work with a storied company like ILM, there is a wealth of visual inspiration and history to reference,” explains the Hoodzpah team, “so the hard part is narrowing in on one vision when there are so many ways you could take it. We cast a wide net and tried many different directions before landing on this retro-modern celebration that feels quintessentially ILM.”

ILM has an incredible history, not only with its groundbreaking work on-screen but also its branding, going back to the classic Michael Pangrazio-designed magician logo illustrated by Drew Struzan and through a variety of changes to today. Given that lineage, Hoodzpah decided on a mix of the 2023 redesign and ’70s-style piping for the new logo, a blend that marries ILM’s ’70s vintage with the modernity of the current branding.

“With an anniversary logo, you’re trying to balance two things: celebrating the history and accomplishments and legacy, but also reminding folks that there’s always more horizon to conquer,” Hoodzpah says. “This is just the first 50 years, and there’s so much more to come. Since the execution of the primary logo icon feels modern and intrepid, we wanted to embrace a ’70s vibe from the early ILM days. It felt so right as a nod to where it all started.”

In the world of marketing, there are numerous rules and tricks to designing a great logo that catches the eye and sits in the memory. With ILM and all the history that goes with it, there remains a need to find the right focus for such an emblem.

“When a logo really resonates, it’s because it feels true to the brand,” notes the Hoodzpah team. “There are so many styles and means of execution, but the question should always be, ‘What feels right for this brand?’ People love to look at trademark logo books with hundreds of logo icons shown on a white page. It’s inspiring to see all the styles of execution. But we’re always left wondering, ‘What’s the context?’ It’s not about a logo looking good in isolation because one is rarely used that way. It’s about a logo feeling perfect in context. It was the same for this project. We tested the 50th anniversary logo in key applications and then used it in a suite of anniversary merch designs as well.”

Collaboration is key in everything ILM does, from the core team pulling together on new projects to working hand-in-glove with the vendors and creatives behind the films, TV shows, and immersive entertainment projects ILM works on. Given that, it was important for Hoodzpah to spitball ideas with the ILM team itself, since the designers revel in the same spirit of collaboration that defines ILM.

“There’s a reason we didn’t end up choosing fine arts as a career path, even though we really loved it,” Hoodzpah explains. “We like working within the limitations of a prompt and pushing and flexing boundaries to see how far we can take it. Design is a team sport. We all get together and try to push this idea up the field. When we work with ILM, we are keenly aware that everyone we work with is a creative powerhouse in their own right. We’d be fools not to tap into that ‘creative brain trust,’ as [director of PR and communications] Greg Grusby calls it, and gather as many ideas and as much feedback as we can to make this logo as true to the ILM legacy as possible. After all, the people of ILM make ILM what it is. It’s like, why would we want just one violin when we could work with a full symphony orchestra?”

The work on the logo continued with the creation of distinct products now available in the new merchandise line.

“Taking the logo and spinning it off into 50th anniversary merch was so much fun,” Hoodzpah says. “The ILM crew were so game to dream big and really have fun with it. Each piece leans into a different vein of the ILM personality. We have a retro ’70s poster of a magician conjuring new worlds, which is what ILMers do every day. We celebrated all the innovation and milestones ILM has accomplished over 50 years in an infographic T-shirt. We even made custom-scented candles to celebrate the different departments and locations over the years. Our favorite is the Model Shop candle which has notes of sawdust and cedar. There’s truly something for everyone.”

With their work on the ILM logo, Hoodzpah has become a key part of ILM’s identity and history, which makes the team proud.

“Getting to work with a cultural icon like ILM once was incredible,” they conclude. “Being trusted by such talented creatives to work with them again was even better. It’s great to be able to pick up where we left off, having already become embedded with the team and learning so much about the brand in our last project. Working on the rebrand was one of those bucket list jobs you continually remind yourself, ‘Wow, we really got to be a part of that.’ It felt like getting the band back together for the sequel.”

New apparel and a tumbler celebrating the 50th anniversary of Industrial Light & Magic are now available on Amazon.com.

Read more about ILM’s 50th anniversary, including a newly announced book, on ILM.com.

Light & Magic Season 2 is streaming now on Disney+.

Mark Newbold has contributed to Star Wars Insider magazine since 2006, is a 4-time Star Wars Celebration stage host, avid podcaster, and the Editor-in-Chief of FanthaTracks.com. Online since 1996. You can find this Hoopy frood online @Prefect_Timing.

Industrial Light & Magic: 50 Years of Innovation by Ian Failes will be released in January 2026 by Lucasfilm Publishing and Abrams.

By Lucas O. Seastrom

It all began in May of 1975 with a handshake between director George Lucas and visual effects supervisor John Dykstra. Industrial Light & Magic formed as Lucasfilm’s visual effects division to work specifically on one project: Star Wars: A New Hope (1977). Fifty years later, ILM spans the globe with studios in five countries and hundreds of productions to its credit.

Now in 2025, the 50th anniversary festivities have kicked off at an appropriate venue: Star Wars Celebration. ILM leadership and artists gathered at the beloved fan event near Tokyo, Japan, to reflect on the milestone occasion, as well as to announce a new book: Industrial Light & Magic: 50 Years of Innovation, written by Ian Failes and coming January 2026 from Lucasfilm Publishing and Abrams.

A New Book Charting ILM’s Continuing Legacy

Industrial Light & Magic: 50 Years of Innovation takes readers from day one at ILM in 1975 up to some of the latest projects and innovations at the company today. Packed with hundreds of rare archival images, author Ian Failes – the noted visual effects journalist at befores & afters – weaves insightful technical history with the beloved stories of ILM’s people. 

“ILM has been part of my visual effects life for a long time,” Failes tells ILM.com. “I first ‘discovered’ so much about visual effects just as I left high school when I happened upon two things…. One was the industry magazine Cinefex, and the other was the incredible book, Industrial Light & Magic: Into the Digital Realm, by Mark Cotta Vaz and Patricia Rose Duignan. I read that ILM book from cover to cover multiple times. It really was one of the things that inspired me to become a visual effects journalist.

“So, getting the opportunity to go deeper into ILM’s history with this new book, but now with all the knowledge I’ve gained from time spent covering the industry, is just so rewarding—and fun,” Failes adds.

Readers can look forward to many untold stories in 50 Years of Innovation. Failes identifies the transition from photochemical optical compositing to digital methods as a particularly fascinating era in the company’s history. “In the book there are some great details shared by key ILMers who were there at the time about many different aspects of the move to digital in terms of other areas like film scanning and digital compositing,” the author says.

“Also, readers have never been able to explore so many exclusive behind-the-scenes photos from ILM’s history before,” Failes continues. “Having images from all different fields that highlight what is essentially the history of visual effects like modelmaking, optical effects, puppets, stop-motion, matte paintings, hand-animation, CG animation, virtual production, etc., all in one place, is something very special. I especially love some of the photographs that showcase the various VistaVision and motion control camera systems that ILM developed.”

At the heart of ILM’s story is the spirit of creativity and innovation which has been defined by the company’s people over the decades. “Even back to its beginnings, George Lucas started ILM after identifying that no existing facility could deliver what he imagined for Star Wars,” Failes concludes. “It feels to me that a unique innovative spirit was born during the making of that first film, and never left the company. I think that goes both for technological developments and also cultural ones. ILM helped establish modern workflows inside a visual effects facility, and I think, really importantly, further set the standard for how to collaborate with filmmakers and other creatives.”

Industrial Light & Magic: 50 Years of Innovation by Ian Failes is coming January 2026 from Lucasfilm Publishing and Abrams.

On the Stage at Star Wars Celebration

As a special live recording of Lighter Darker: The ILM Podcast, the Star Wars Celebration panel included president and general manager of Lucasfilm business, Lynwen Brennan; head of ILM and general manager Janet Lewin; ILM executive creative director and senior visual effects supervisor John Knoll; ILM Sydney creative director and senior animation supervisor Rob Coleman; ILM lead CG modeler Masa Narita; and former ILM modelmaker Fon Davis. Lucasfilm’s senior vice president of creative innovation, digital production & technology, Rob Bredow, moderated.

Lynwen Brennan came to ILM 27 years ago as the company ramped up for production of Star Wars: The Phantom Menace (1999). Like many, she’d been inspired to join ILM after seeing Jurassic Park (1993) a few years earlier. “The minute I walked through the door, I just fell in love,” Brennan told the audience. “I knew I’d found my people…. It’s an incredibly spirited place. We have a lot of fun. There’s something so special about a place that attracts these mavericks who are not scared of doing anything new…. Sometimes when you find people who are real risk-takers, they’re not necessarily great team players, right? But this…is a place where you’ve got people who love taking those risks but do it in such a collaborative way. That’s a thing that really got me.”

Janet Lewin started her ILM career some 30 years ago and has had a front row seat to the continuing changes and evolutions in the visual effects industry, much of it driven by ILM. “Back then, we were one studio in San Rafael, just a couple of hundred people, mostly working in the Model Shop and on the stage,” Lewin explained. “It was an exciting time right at that digital revolution. It was a big deal for us to juggle four shows at one time, and a big show was a couple hundred shots. And over my 30-year trajectory, the company has massively grown. We now have 3,500 employees, five global studios, and…we do visual effects work across every possible medium.”

For Masa Narita, appearing onstage at Celebration in his native Japan was a full circle moment. A lifelong visual effects fan, he’d watched Star Wars as a teenager during its original Japanese release in 1978. But as he reached adulthood, Narita first chose a career in finance. 

“I used to be a businessman, worked for a Japanese brokerage firm for over 20 years,” Narita said. “But I always loved movies and visual effects because I grew up with special effects pioneers like Ultraman and Godzilla. So my first childhood dream was to wear a kaiju suit and to smash miniature towns. Actually, I still want to do that. [laughs] As I got older, I realized that I wanted to create something special like spaceships and characters [that] I saw in the movies. So at the age of 45, I decided to follow my passion. I quit my financial job and moved to Hollywood and started at a CG school. So that was my biggest gamble in my life, taken with my loving wife and two children. Fortunately, one year later…I got [my] very first CG job, and eventually I came to my dream company, ILM.”

Narita has since worked at the company for over a decade, contributing to productions like Solo: A Star Wars Story (2018), The Mandalorian (2019-23), Indiana Jones and the Dial of Destiny (2023), and Deadpool & Wolverine (2024).

“ILM puts a lot of focus on innovation that makes the impossible possible,” Narita added. “So I feel inspired every day walking in a place with so much creativity and skill. I love what I’m doing and I feel I really achieved my dream. People say life is short, but I don’t think so. We have plenty of time to start over. It’s never too late to chase something new.”

Onstage at Star Wars Celebration Japan, L to R: Fon Davis, Masa Narita, Rob Coleman, John Knoll, Janet Lewin, Lynwen Brennan, and Rob Bredow (Credit: ILM).

You can hear these stories and many more on Episode 17 of Lighter Darker: The ILM Podcast.

Watch the ILM.com Newsroom for the latest information about how you can purchase a copy of Industrial Light & Magic: 50 Years of Innovation, coming to bookstores everywhere January 2026.

Light & Magic Season 2 is streaming now on Disney+.

New apparel and a tumbler celebrating the 50th anniversary of Industrial Light & Magic are now available on Amazon.com.

Lucas O. Seastrom is the editor of ILM.com and a contributing writer and historian for Lucasfilm.

The ILM veteran and accomplished feature filmmaker enters the documentary space to tell the story of ILM and Lucasfilm’s digital filmmaking odyssey.

By Lucas O. Seastrom

Warning: This article contains spoilers from Light & Magic Season 2

Among the first group hired at Industrial Light & Magic in 1975, Joe Johnston began his career as a storyboard artist and concept designer. After 10 years with ILM on three Star Wars and two Indiana Jones films, among others, he went to the University of Southern California film school under George Lucas’ sponsorship. He’d go on to direct classics as varied as Honey, I Shrunk the Kids (1989), October Sky (1999), and Captain America: The First Avenger (2011). 

Johnston’s directorial debut in the documentary medium, however, comes today, with the Season 2 premiere of Light & Magic on Disney+. The non-fiction series charts the storied legacy of Industrial Light & Magic, now celebrating its 50th anniversary, an unprecedented achievement in the history of visual effects.

“I don’t have any experience in documentary or non-fiction filmmaking,” Johnston tells ILM.com. “When I was at Cal State Long Beach, I worked on a documentary that was directed by Tony Brennan called Hitler’s Secret Weapon. It was about the V2 rocket. Basically, my job was to do illustrations that explained some of the ideas he was trying to get across. That was my entire experience with documentary filmmaking, almost nil.”

But Johnston does have experience as a storyteller. “While I had never worked on a documentary, I had a pretty good idea of how to tell a story, whether it’s real or fictional,” he says. “And you have to remember, especially with a project like this, though it’s true of all filmmaking, I had so much help. I had a supervising producer [Nicole Pusateri], story producer [Carly Baggett], a line producer [Andrew Hafnor], three great editors [Mike Long, Jennifer McGarrity, and Robinson Eng], and an archivist [Eugen Bräunig] whose job it was to go through thousands of hours of footage from ILM. It was more like a steering process, and I steered that process toward an ultimate goal. It was a real team effort all the way through.”

Finding the Story

After a successful first season directed by veteran screenwriter Lawrence Kasdan, Lucasfilm and Imagine Entertainment agreed to produce a second. It was then that Imagine producer Christopher St. John gave Johnston a call. The latter was surprised by the inquiry, thinking they wanted him to appear in Season 2 as an interview subject. “I said, ‘Guys, I’ve said everything I have to say about it in Season 1.’ And Chris said, ‘No, no, we want you to direct it.’ Well, okay. I had to think about that for a while. It sort of came out of nowhere. I wasn’t expecting it.”

Johnston’s relatively distinct point-of-view helped motivate him to accept the offer. “Having been an insider for the first 10 years during the original Star Wars trilogy, maybe I could have a unique perspective on what Season 2 might look like, having not been around for any of that. I left in 1985, came back for a couple of projects afterwards, but the whole shift toward digital was all new to me. Once I was onboard, it was a matter of guiding it in the direction I thought it should, one goal of which was to tell George Lucas’ story as much as possible.”

That story emerged as Johnston and team reviewed thousands of hours of archival footage preserved in ILM’s collection. “I recognized that one of the stories that needed to be told was how George Lucas had basically steered the entire motion picture industry – in a way he sort of dragged it kicking and screaming – into the digital age,” the director explains. “That was a story that I didn’t think had really been told before. Here was a chance to feature that aspect of ILM and Lucasfilm.” 

This would chiefly center around the production of the Star Wars prequel trilogy, released between 1999 and 2005. The first entry, Star Wars: The Phantom Menace (1999), was the most ambitious visual effects project ever undertaken up to that time, counting more than 2,000 shots produced entirely within ILM. The middle entry, Star Wars: Attack of the Clones (2002), was the first blockbuster feature film made in a completely digital format and workflow. Surrounding these Lucasfilm productions was a bevy of groundbreaking achievements for client productions, ranging from the environmental effects of Twister (1996) and The Perfect Storm (2000) to performance capture in the Pirates of the Caribbean trilogy (2003-07) and a fully-animated feature with Rango (2011).

Master Yoda first appeared as an all-digital character in Star Wars: Attack of the Clones (Credit: Lucasfilm & ILM).

Always a Student

“What also appealed to me was the chance to interview these people, a lot of whom I’d known over the years, but hadn’t worked with,” Johnston adds. “Hearing their personal stories…. It was an education for me. I don’t know that much about visual effects, so it was interesting to learn how effects had evolved since my involvement in the 1980s.”

Indeed, Johnston is keen to note that, although he’s had a reputation “as a visual effects person, I have to always remind people that I’m not at all. I was a designer, storyboard artist, sequence director, and stuff like that,” as he explains, “but I never really got involved in the visual effects. I was surrounded by people who could do that. My designs were used in those sequences, but once I was happy with the design, I’d hand it off to people like Richard Edlund and Dennis Muren to make it work.”

As a feature film director, Johnston collaborated with ILM on The Rocketeer (1991), Jumanji (1995), and Jurassic Park III (2001), providing him with first-person, client-side experience during the era covered in Light & Magic Season 2. He describes how Jumanji, for example, took place during a transitional moment “where it wasn’t always cheaper to do it digitally, or it wasn’t necessarily cheaper to do something with an analog solution. We had to figure out which method was the best to achieve a certain effect.” Johnston worked alongside visual effects supervisor Ken Ralston on that film, a former colleague from the original trilogy. 

“I am a proponent of the idea that any film should not have one more visual effect than it needs,” Johnston comments. “You have the minimal number to help you tell the story and move on. I don’t like films that are all about the visual effects; spectacle for the sake of spectacle. It’s such a waste. You’re not telling the story; you’re just trying to impress people.”

The People Come First

Working across three one-hour episodes, each with its own editor, Johnston followed a number of the precedents established by Kasdan in Season 1, not least of which was the emphasis on individual stories of the artists, filmmakers, and other talent involved in ILM’s work. 

“I hope the audience will recognize that these people at ILM who are revered by visual effects fans are basically just like anybody else,” Johnston says. “They grew up making models or loving technology or whatever it was, and they found a way to make their dreams come true by coming to ILM. It’s interesting because that’s not the way it was on the original trilogy. Nobody knew what they were doing. They didn’t know what they would do when they got hired. That in itself was a voyage of discovery for people. ‘Why am I here, what am I doing? Oh you want me to do that – I guess I better figure it out and learn how.’”

But despite the generational distinction, Johnston does identify the central constant in ILM’s story. “There is an attitude of ‘I know you can do it because it’s impossible.’ That was the spirit in the original trilogy, analog days, and it was during the start of the digital era as well. ‘How are we going to do this? Let’s jump in and figure it out.’ I find that story appealing and interesting. Several of the interviewees talk about it. ‘We didn’t know how we were going to do it. We were running out of time. We’ve got this deadline, we’re working seven days a week, but somehow, we figured it out.’ I think that’s a great story to tell. It’s fun. It’s scary. Scary is good.”

Visual effects supervisor John Knoll with high definition monitors on the set of Star Wars: Attack of the Clones (Credit: Lucasfilm & ILM).

Piecing the Story Together

“Like a lot of feature films, this project was definitely made in the cutting room. You’re assembling so much footage from the last 20 years and beyond,” says Johnston. Documentary filmmakers often have very distinct processes in terms of assembling their narrative elements. For Johnston, this meant close collaboration with the editors to help realize the story he wanted to tell. “I can’t give the editors enough credit. A lot of the ideas came up in the cutting room. They did a fantastic job. They’re semi-sung heroes.”

Johnston also found ways to collaborate more directly with his interview subjects. “At one point, we decided that we needed someone to help tie all of these loose ends together. So we did a second interview with [former ILM general manager] Jim Morris and explained the story we were trying to tell. He got it, of course, being who he is, and he really helped us to cement some of these ideas into a story. It’s always like that in my limited experience. You don’t write a script beforehand like a feature; you write a script in the making of the film itself.”

Johnston was adamant about leaning into the drama of the story, including the challenges that ILM, Lucasfilm, and Jar Jar Binks actor Ahmed Best faced during the release of The Phantom Menace. In Jar Jar, the creative team had pioneered what was the first all-digital main character in a feature film using performance capture technology, which later became industry standard. But some in the press and the audience struggled to accept Jar Jar’s role in the film’s story.

“The whole Jar Jar Binks thing was probably the most controversial feature of the prequels,” Johnston says. “As with any filmmaking project, without conflict there is no drama. I wanted to highlight that.” It was important to be honest about the creative process, which is full of discussion and compromise. 

“Interviewing [Star Wars producer] Rick McCallum was a similar choice,” Johnston adds. “Rick played a huge role in getting the prequels produced. Most people had a problem with Rick McCallum at some point because he was trying to get everything done as cheaply as he possibly could. He’s an interesting character. I wanted to hear his story.”

Animation supervisor Rob Coleman (second from left) and actor Ahmed Best (third from right) with the ILM crew while shooting performance capture for Jar Jar Binks in Star Wars: The Phantom Menace (Credit: Lucasfilm & ILM).

In addition to interviewing George Lucas, Johnston chose director Gore Verbinski as one of Light & Magic Season 2’s other filmmaker interview subjects. Verbinski collaborated with ILM on a watershed string of features, including three Pirates films and Rango. “The Pirates films that he directed were interesting because ILM had to keep besting themselves, and Gore tells that story quite well.

“I wanted to feature Rango for the very reasons that Gore says in the interview, which is that ILM always had the ability but never the opportunity to be part of a project where they’re actually telling the whole story,” Johnston continues. “That was unique to ILM, and unique to that project. I came away, personally, hoping that ILM gets more opportunities to do things like that. Having experienced the situation that Gore explains where ILM does a shot, and they don’t know exactly where it’s going to cut in, they’re basically working on something in isolation. For them to be able to not think that way and tell the whole story was groundbreaking for ILM. That’s another story that was important to tell.”

Finding Inspiration

With the open mind of an artist, Johnston reiterates that he “never walked into an interview or the cutting room knowing exactly what something was going to be. It was a process. There were tons of surprises, things I didn’t know. It was refreshing, in a way. It made me have a newfound love of documentary filmmaking.”

As Johnston looks ahead to future non-fiction stories of his own, he shares his hopes that Light & Magic Season 2 will help to inspire the coming generation of storytellers.

“I would hope that a lot of young, potential filmmakers or visual effects artists would watch this series and say, ‘That person who I really admire had no idea how they were going to get to ILM. They did this thing that they were good at, it was recognized, and they got a call.’ If this is something that people want to pursue, they should recognize that it’s possible. There’s a route to success. There might not necessarily be a formula for success, but there’s a way to find your path if that’s your dream.”

Light & Magic Season 2 is streaming now on Disney+.

Visit Lucasfilm.com to learn more about the stories told in the series’ latest installment.

New apparel and a tumbler celebrating the 50th anniversary of Industrial Light & Magic are now available on Amazon.com.

Lucas O. Seastrom is the editor of ILM.com and a contributing writer and historian for Lucasfilm.

ILM and Lucasfilm reveal new information about the virtual and mixed reality Meta Quest experience while Celebration Japan attendees get an exclusive first look.

There’s a new kind of adventure in development for the galaxy far, far away. Industrial Light & Magic and Lucasfilm have revealed more details about Star Wars: Beyond Victory – A Mixed Reality Playset for Meta Quest headsets. Fans attending Star Wars Celebration in Japan this weekend will be able to experience a special hands-on first look at the title.

“Star Wars Celebration has always been a place where fandom meets passion, and we wanted to bring something to the show this year that our team is over the moon about,” said Alyssa Finley, executive producer of Beyond Victory. “We’re calling this a Playset because it isn’t just a game; it’s an entirely new way to experience the Star Wars galaxy and the worlds we create at ILM. This new mixed reality [MR] experience blends the physical and digital worlds in a way that’s unlike anything we’ve done before, and we’re so excited to share a special first look with our incredible Star Wars community.”

Beyond Victory will take fans into a story rooted in the fastest sport in the galaxy: podracing. “We started by asking ourselves some questions,” explains director Jose Perez III. “What kind of toys would be amazing in mixed reality? What toys don’t or can’t exist in real life? Podracing zipped straight to the top.”

“One word: Sebulba,” Finley adds. “Sometimes you just wanna go fast and win races. Sometimes you want to learn from (or about) the greatest racer ever to throw a wrench into an engine. In Beyond Victory, we see firsthand the gritty underbelly of the podracing world. We dig into what happens around the racing circuit, and we get to try our hand at MR podracing along the way. If that’s not ideal, what is?”

Set around the timeline of Solo: A Star Wars Story (2018), Beyond Victory centers on up-and-coming racer Volo Bolus who joins forces with the unforgettable Sebulba, originally seen in Star Wars: The Phantom Menace (1999). Fans will be able to experience this new story in three distinct modes:

  • Adventure: Using a combination of virtual and mixed reality, the story follows Volo – an aspiring podracer whose life gets flipped upside down under the mentorship of the infamous Sebulba.
  • Arcade: Experience the thrill of podracing in mixed reality on a virtual holotable that brings players right into the heart of the action like never before.
  • Playset: Transform the physical world around you and create your very own incredible Star Wars moments in mixed reality with a collection of unlockable virtual action figures and vehicles. 

“I genuinely love our story in Adventure mode,” explains Perez III. “It feels like a proper little MR journey. That’s a big one for me, personally. But honestly, the coolest thing for me is that all these different modes will co-exist. We’re working on this super cool playset with all these different kinds of experimental MR features. It’s interesting to see how we can push these kinds of technologies and stories in MR. We are taking some big swings on Beyond Victory. It’s really different from what we have made before.”

The ILM and Meta Quest experience at Star Wars Celebration features over a dozen playable Quest stations equipped with the first-ever hands-on look at Beyond Victory. This first-look experience offers a glimpse into the tale of Volo and leads into a thrilling top-down Arcade podrace that plays out on a virtual holotable. Additionally, demos for the award-winning titles Vader Immortal: A Star Wars VR Series and Star Wars: Tales from the Galaxy’s Edge are available for those in attendance at Celebration.

“Being in VR can feel like being in a whole different world,” says Finley. “What we’ve added for Beyond Victory is the ability to bring the story worlds and the real world together a little more in MR – so you can be playing, you can be interacting with the story world or playing a race – and you can also see what’s happening around you, integrated together with your play.”

“We’ve definitely leveled up as a team since Vader Immortal and Galaxy’s Edge,” Perez III adds. “We’re bringing forward a lot of the tech we cooked up back then into Beyond Victory, so the nuts and bolts and how you move around in VR will feel pretty familiar if you’ve played our other titles. But the thing about ILM is we’re always innovating. So, even with that foundation, we’re building a ton of new systems and approaching the design in some pretty different ways for this one.”

Star Wars Celebration 2025 is taking place April 18-20 at the Makuhari Messe convention center in Japan. The ILM and Meta Quest activation is located at Hall 4, Booth #20-5. 

Wishlist Star Wars: Beyond Victory – A Mixed Reality Playset now, and watch the ILM.com Newsroom for the latest updates. Visit ILM.com/Immersive to learn more.

New apparel and a tumbler celebrating the 50th anniversary of Industrial Light & Magic are now available on Amazon.com.

The first part of an extensive look behind-the-scenes of the visual effects process for Lucasfilm’s pirate-themed Star Wars adventure series.

By Clayton Sandell

(Credit: ILM & Lucasfilm)

The sprawling, live-action series Star Wars: Skeleton Crew (2024-25) is like a map leading to a visual effects treasure chest. Open it, and you’ll find a trove of 3,200 visual effects shots that seamlessly blend the latest digital artistry with traditional techniques, both innovating upon and honoring the unique legacy of Industrial Light & Magic.

In creating a new adventure story set in our favorite galaxy far, far away, Skeleton Crew creators and executive producers Jon Watts and Christopher Ford set a delightfully retro tone for the series, which directly informed ILM’s approach to the visual effects.

“Very early on, it was apparent that a big part of the intended charm of the show was that it was going to have this sort of Amblin, ’80s movie sort of vibe to it,” Skeleton Crew production visual effects supervisor John Knoll tells ILM.com. “That extends to more than just how you tell the stories. It also extends to choices like embracing animatronics, monsters, and building miniatures and stop-motion creatures.”

Pulling it off would involve hundreds of talented artists at ILM studios around the globe, including San Francisco, Sydney, Mumbai, and Vancouver, along with a few outside visual effects partners. 

Over eight episodes, Skeleton Crew follows the adventures of Wim (Ravi Cabot-Conyers), Neel (Robert Timothy Smith), Fern (Ryan Kiera Armstrong), and KB (Kyriana Kratter)—four kids living a peaceful, if mundane, life on their home world of At Attin. After discovering a mysterious buried space cruiser, the four friends unintentionally launch themselves into hyperspace and must find their way home by navigating a dangerous galaxy of allies, enemies, pirates, and monsters.

Early in preproduction on Skeleton Crew, Knoll says the ILM team had to determine the best way to approach the show’s varied visual effects needs. “It just read like an expensive show because of all of the different planets we go to, all the different types of creatures, and the different environments,” explains Knoll, who also serves as ILM’s executive creative director and senior visual effects supervisor. “Trying to figure out how to make that affordable was one of the first things that faced the visual effects team.”

Following a methodology first established during The Mandalorian (2019-2023), Knoll says Skeleton Crew production was divided roughly into thirds. “About one-third of it was shot in our StageCraft LED volume, one-third was shot on soundstages with conventional sets, and then one-third was shot on a backlot,” Knoll reports.

Galactic Planet-hopping

Skeleton Crew unfolds across multiple worlds that are brand new to Star Wars, beginning with At Attin. The planet’s suburban-like residential neighborhoods utilized a minimal exterior set located near the California State University Dominguez Hills campus in Carson, California. “There was an undeveloped lot that was just adjacent to the campus that was available. So we shot on that,” Knoll says. The practical parts of the set consisted of only the street, a sidewalk, parts of a few houses, and a small patch of grass surrounded by a large blue screen background, says ILM visual effects supervisor Eddie Pasquarello.

“We added all the trees, houses, skies, and trams,” Pasquarello reveals. Even the street was narrowed. “Some things are not seen, and those are the ones that are the most impressive in my opinion, because you’re not saying, ‘Oh, that’s visual effects.’ We’re hoping people watch the actors and enjoy the story versus worrying about the environment.” 

Wim and Neel board a tram for the ride to school, a sequence that introduces the more urban areas of At Attin. Artists digitally extended the school’s exterior—shot on another minimal set—and helped create an expansive cityscape designed to suggest At Attin’s backstory.

“[Jon] Watts wanted it to feel like a place that was built some time ago, but it’s been mostly kept up pretty well. And it’s a place where everyone more or less follows the rules,” says ILM animation supervisor Shawn Kelly.

On the ride to school, Wim stares out the tram’s back window as the vehicle drops into an underground tunnel. After the scene was shot, artists were asked to enhance the movement of both the tram and the camera, requiring complex digital layering work to achieve the right perspective. “We had to split apart all the kids inside the bus to get the proper parallax,” Pasquarello explains. “There’s a ton of artists that helped in layout, and comp and environment—all across the board—that made the shot work.”

Pasquarello says a number of ILM teams also worked throughout the production to develop the right look for At Attin’s city architecture. “This was a really Herculean effort,” he notes. “This is a huge environment build from the team. But it also takes the disciplines of animation and lighting.”

In one shot where a malfunctioning hoverbike leaves Fern and KB stranded on the side of the road, Jon Watts asked ILM to enhance the background with a custom building. “He sent us a photo of a mall,” Pasquarello says. “He said, ‘I kind of want it to look like the mall that I remember as a kid.’ And that’s what that is inspired by. We basically took that photo and ‘Star Wars-ified’ it.”

Neel Nation

One of the earliest discussions among the Skeleton Crew creative team was how to bring Wim’s best friend Neel to the screen. “Neel was a fun and interesting challenge,” Kelly tells ILM.com, noting that the blue elephant-like character is a three-way creative partnership combining Smith’s voice and performance, the work of performance artist Kacie Borrowman, and extensive digital creativity.

“The production was feeling like Neel probably needed to be computer graphics throughout,” Knoll says, explaining that the hours spent applying makeup or prosthetics to Smith would have cut into the child performer’s limited shooting window. “Just seeing how often Neel was going to be on screen—he’s on every other page of every script—he was potentially going to be the most expensive part of the entire show,” recounts Knoll, who set a goal of reducing the all-digital Neel shots by half. “I thought, ‘there’s got to be some practical version of Neel that we can do, at least for over-the-shoulder and wide shots.’”

For that mission, ILM turned to Legacy Effects, a frequent collaborator on Star Wars projects including Ahsoka (2023 – present) and Obi-Wan Kenobi (2022). “Neel’s head was built by Legacy as a fully animatronic puppet and was meant to do a lot of the heavy lifting of the performance,” says Pasquarello.

Credit: (ILM & Lucasfilm)

Neel’s many facial expressions developed from an innovative fusion between the Legacy puppet and considerable digital augmentation. “As they started filming the show, everyone fell in love with how the practical puppet face works,” Kelly recalls. “It’s very charming.”

Digital animation took over in scenes where the story called for subtle emotional expressions that were beyond the capability of the puppet, Kelly says, noting that roughly half of Neel’s shots are either digitally augmented or completely digital. “We came up with a bunch of facial expressions,” he explains. “There’s ‘worried.’ We’ve got ‘scared.’ We’ve got ‘sad’ Neel and ‘happy’ Neel, the Neel that we love. Sometimes we just really need to scrunch up his face and we could scrunch it up with or without his ears, things like that.”

Even in shots where the practical puppet head was used on-set, artists digitally erased a small mesh screen on Neel’s trunk that had allowed the performers inside to see and breathe more easily. ILM lead creature modeler Jonathan Sabella also helped digitally sculpt the computer graphics version of Neel to make sure it was identical to the puppet. “That might just be adjusting neck wrinkles or the trunk, and he can shape it back and make it just right or push the emotion even a little further than our out-of-the-box controls could do. Jonathan was a really key part of bringing this together,” says Kelly.

During shooting, facial capture technology created by ILM Technoprops was used to record Smith’s performance. “In the end, we didn’t use the facial capture directly,” Kelly says, explaining that Neel’s expressions were instead crafted by animators in order to more closely match the style of the puppet.

“We could have gone with a bigger performance,” Pasquarello adds, “but a lot of it was really leaning in and matching the aesthetic that was established. If we were to do something beyond that, it felt wrong because we were losing that kind of simple on-set practical aesthetic, which is a very Star Wars aesthetic. It’s always best to have this mix of different techniques happening at once. It creates the best illusion for the audience. It’s hard to pin down what’s going on if some of it’s real and some of it’s not.”

Rise of the Onyx Cinder

At the bottom of a forest ravine, the kids discover the entrance to a long-buried, hidden starship called the Onyx Cinder. Wim unwittingly activates the dormant vessel, causing it to lurch skyward with the four kids still on board. As massive layers of soil, rocks and trees cascade off the rising ship, the kids try unsuccessfully to escape. “This was a sequence that went on for a while for us,” says Pasquarello. “Just moving all that earth and lifting that ship and having it turn over was a big challenge.”

Live action plates of the four young actors standing on a small set were completed with an entirely digital environment. “The hatch and the four kids. That’s all we had to work with,” Pasquarello remembers. “They were just standing on a small practical piece of the ship, and then everything else was added around them.”

Digital doubles were also created for all the characters and used throughout the sequence, especially useful for shots that might have been perilous for the young actors. “Sometimes when they’re hanging out of the open porthole, they’re animated,” Kelly says. “The animated Wim is waving to his dad.”

Various simulations—from tree leaves, to swirling embers, dust, and engine vortices kicking up dirt—help complete the sequence. “I think this really shows off the world-class effects team and environment team. I’m just always blown away by this sequence,” says Kelly, noting that many of the forest scenes were created with the help of artists in ILM’s Vancouver studio.

Once in space, the kids discover the ship’s first mate, a droid named SM-33 (voiced by Nick Frost). The character was realized using a Bunraku-style puppet (operated by performance artist Rob Ramsdell) and fully-digital versions, depending on the scene.

The Onyx Cinder first came to life as a 3D computer model built by Rene Garcia and Jay Machado and textured by Kim Vongbunyong. Veteran ILM modelmaker John Goodson then crafted a practical version that included rotating sections and flickering LED lights in the engines. “It’s very old school. It’s all handmade. There are a handful of model kit parts on it for detail. But even a lot of those are handmade,” Goodson says. “It’s styrene and acrylic with an aluminum armature inside of it.” Modelmaker Dan Patrascu also helped build the Onyx Cinder chassis and mounted motors inside the model.

“It gets designed in the art department,” Knoll says. “Then you validate the design, so everybody’s happy with it. John builds his version of it. And then we true up our computer graphics model to match what John did. Something I really liked about the model John built was that the paint finish was beautiful on it. And so that was very extensively photographed and then we re-textured the CG model, based on what John had done.”

The practical model was then mounted on a motion-control rig at ILM’s San Francisco studio, reminiscent of the original Dykstraflex system pioneered during production of Star Wars: A New Hope (1977).

“[Executive producer] Jon Favreau was pretty enthusiastic about wanting to do this back for season one of The Mandalorian, and I was one of the few people still left at the company that used to do motion-control,” says Knoll. “And we figured, ‘We can make this work.’ Probably the biggest obstacle was budget. The reason that we don’t do this as often as we used to is that it’s more expensive than computer graphics. And the best way that I could figure out how to make this affordable for the show was if this was being done as a garage operation.”

(Credit: ILM & Lucasfilm)

Knoll repurposed the motion-control rig he built in his garage for The Mandalorian, adding the capability to drive more motors on the Onyx Cinder. “The system that I built for season one and two of Mando could drive eight motors,” Knoll recalls. “That gave me track, pan, tilt, and focus, and yaw-pitch-roll on the model. That was sufficient for everything we needed to do with the Razor Crest. But all the engines pivot on the Onyx Cinder, so there are four motorized axes built into the ship. Eight axes isn’t enough to drive all of that. So I expanded the electronics to drive 16 channels.”

Camera moves were first plotted out in Autodesk Maya, approved by the filmmakers, then translated to the motion-control system with a goal of matching a long-established Star Wars aesthetic. “Our approach for the shots that were going to be a miniature is, first—we animate it in the computer, and we figure out, ‘what’s the best way to tell this story?’” says Shawn Kelly. “What’s the coolest camera move that still feels like an original trilogy camera move that tells the story and has the mood that we want, and the ship has the motion that we need, in the path that we need?”

The motion-control system was operated by Lindsay Oikawa Pflum and utilized Canon DSLR camera technology. Each shot required a dozen or so passes to capture variations in exposure and lighting, giving the compositing teams more options when layering the final image. And in another throwback to ILM’s early days, converters allowed the use of older Nikon lenses that were used to film models for Star Wars: Return of the Jedi (1983). The final result is a seamless collaboration between the real-world and digital models, all paying homage to ILM’s legacy.

First Stop: Starport Borgo

The Onyx Cinder docks at a nefarious pirate hideaway, a wretched hive called Starport Borgo where the kids hope to find directions back to At Attin. Built into an Outer Rim asteroid overlooking a sea-blue nebula, Borgo is filled with a host of untrustworthy pirates, creatures, and scoundrels. “It’s just a really beautiful, new place for Star Wars,” says Pasquarello. “Everything outside is computer graphics. When we’re inside in Port Borgo, it’s practical. There’s a lot of storytelling in a very small amount of space.”

Port Borgo scenes relied heavily on ILM’s StageCraft LED volume – located at the MBS Media Campus in Southern California. The environment came to life using a combination of practical sets and virtual backgrounds displayed on the volume screens. The virtual production team relied on two real-time rendering engines, depending on the scene: Unreal Engine from Epic Games and ILM’s proprietary software, Helios.

Skeleton Crew also took advantage of two powerful new StageCraft volume advancements: virtual depth of field and real-time virtual lighting. “Previously, when you used depth of field, the camera didn’t actually make the content go out of focus correctly when the depth of field changed; it just defocused the wall globally,” says ILM virtual production supervisor Chris Balog. “Now we’ve added virtual depth of field. So when you change focus, the content defocuses in depth. So now, if the camera is focusing on something close to the wall, the 3D content in virtual space close to the stage will be sharper, like the set in front of it, and everything in 3D space past that will defocus correctly in depth based on the lens’s focal length.”

The new depth of field capability came with the challenge of how to accurately represent the “bokeh” effect – the quality and appearance of blurred light sources in out-of-focus areas of a shot.

“It gives it more realism because it actually defocuses the way it should. Before, it would just get really soft,” Balog explains. “And now, we are able to do this in a way where it would photographically bokeh like real light sources.”
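The per-depth blur Balog describes follows from the classic thin-lens circle-of-confusion model: blur grows with a subject’s distance from the focal plane rather than applying uniformly to the whole wall. The sketch below is a generic illustration of that optics formula, not ILM’s Helios or StageCraft code; the function name and sample values are purely illustrative.

```python
def coc_diameter_mm(focal_mm, f_number, focus_m, subject_m):
    """Thin-lens circle-of-confusion diameter (mm) on the sensor for a
    subject at subject_m when the lens is focused at focus_m.
    Subjects on the focal plane render sharp; blur grows with their
    distance from it -- the depth-dependent behavior that virtual
    depth of field reproduces for 3D content on the LED wall."""
    f = focal_mm
    s1 = focus_m * 1000.0      # focus distance, mm
    s2 = subject_m * 1000.0    # subject distance, mm
    aperture = f / f_number    # entrance-pupil diameter, mm
    return aperture * (f / (s1 - f)) * abs(s2 - s1) / s2

# Content at the focal plane stays sharp; content deeper in the
# virtual scene blurs progressively instead of the wall defocusing
# all at once.
for depth in (3.0, 4.5, 6.0):
    print(round(coc_diameter_mm(50, 2.0, 3.0, depth), 3))
```

A renderer using a model like this can then shape each defocused highlight with the lens’s aperture profile, which is what produces the photographic “bokeh” look rather than a uniform softening.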

Real-time virtual lighting gave the Skeleton Crew cinematographers greater flexibility and speed when adjusting practical lights on set, making it easier to match their digital counterparts. “It used to be a much more labor-intensive process, because originally we were baking all the lighting into the original content,” Balog says. “Now, the DPs can get on set that day and say, ‘You know what? I just want to move that light a little bit.’ So we just move the virtual light to work in conjunction with it.”

Creating the content for the volume walls happened near the beginning of a production.

“There’s a team of generalists, or gen artists, who are talented in a lot of different aspects of computer graphics,” Shawn Kelly says. “And while they are working on the environments, me and a few other people are working on populating those environments.” 

Wim, Fern, KB, and Neel disembark the Onyx Cinder and hitch a ride on a bubble-like dinghy piloted by a furry Teek ferryman. Dockside, the Teek jumps on Fern’s shoulder to demand payment—a sequence that demonstrates an invisible combination of digital and practical methods.

“He’s mostly a practical puppet up on her shoulder, but his arm is animated. His arm is computer graphics so we can do more delicate kinds of gestures with his fingers and hands,” explains Kelly, “but we still try to animate it in a way that feels like a puppet.”

“We have a great paint team here,” adds Pasquarello. “It was not a big deal to remove that arm and replace it.”

Once the Teek gets his money, he jumps down to leave—a shot that features a flawless “Texas Switch” between the practical and fully digital version of the character. “At the beginning, he’s a puppet. And once he goes behind Fern’s back, he’s animated,” Kelly reveals. The shot concludes with the ferryman scurrying away, mimicking the speedy movements of the original Teek that first appeared in Lucasfilm’s TV movie, Ewoks: The Battle for Endor (1985).

“He’s this little, very fast-moving kind of funny guy,” says Kelly. “It was really endearing and fun, especially when I was a kid. So we wanted to put a little bit of that fast movement into him. And this is a little example of how we kept that flavor.”

(Credit: ILM & Lucasfilm)

Motion Capture Cameos

Motion capture performers help populate the expansive setting with hundreds of pirates. “A place like Port Borgo needs to be a bustling port of pirates doing stuff,” says Kelly. “So we spent months at the beginning getting mocap performances and animating on top of those, and also key-framing guys selling stuff at stalls, or shopping at stalls. You’ll see guys in the background unloading a ship, and there’s a chain of guys throwing boxes to each other, stuff like that.”

The children pass by a seedy nightclub where four-armed aliens are dancing in reddish silhouette through frosted windows. It was Kelly’s job to direct the scene’s motion capture performers, including two unexpected names: Daniel Kwan and Daniel Scheinert, collectively known as The Daniels.

At the time, the directors were helming the fourth episode of Skeleton Crew and would soon win an Academy Award for Best Picture for their film Everything Everywhere All at Once (2022). “The Daniels wanted to perform the dance,” Kelly laughs, recalling how it became his job to direct two of his cinematic heroes on how to be better exotic dancers. “I’d say, ‘I think they want it to be sexier.’ They’d just burst out laughing, and do it again,” Kelly says. “They were really fun and funny.”

(Credit: ILM & Lucasfilm)

Escape from Port Borgo

Reluctantly teaming up with the mysterious Jod Na Nawood (Jude Law), the children escape from the pirate brig and navigate their way back to the Onyx Cinder. As the ship pulls away, it’s snagged by a refueling line connecting it to a floating buoy, snapping it back like a balloon on a string. Jod tries desperately to maneuver away, the ship dragging several pirate vessels with it.

“They’re creating havoc,” Pasquarello says. “The whole idea of the pile up and pulling those ships together was a really fun sequence, because even Jon Favreau chimed in. Everyone had some ideas about how to make that really successful.”

The colliding ships are all-digital creations, with the action handled by a team of artists who are now part of ILM’s Sydney studio. “All of these ships are computer graphics, and the environment itself as well,” Pasquarello says. “These didn’t exist as models from a practical standpoint.”

As the pirates take aim at the Onyx Cinder with a tower cannon, Jod sends the ship into hyperspace. The fuel line snaps violently, whipping back and crashing into the crowded port. “You can see our animated pirates getting knocked down and running away,” Kelly says. Effects passes helped complete the shot with a variety of explosions, fire, and sparks.

The pileup sequence also gives eagle-eyed viewers a chance to catch a special Easter egg—a Starspeeder 1000 transport, well known to fans of the Star Tours attraction at the Disney Parks.

ILM.com’s behind-the-scenes journey through the creation of Star Wars: Skeleton Crew continues in part two….

This story was updated with additional information on May 2, 2025.

Clayton Sandell is a Star Wars author and enthusiast, TV storyteller, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell), Bluesky (@claytonsandell.com), or X (@Clayton_Sandell).

The new experience for the Meta Quest headset will be introduced to fans at Star Wars Celebration Japan.

Industrial Light & Magic and StarWars.com have revealed the newest immersive experience coming to the galaxy far, far away…

Star Wars: Beyond Victory – A Mixed Reality Playset is currently in development for Meta Quest headsets and takes players into the fast-paced, high-stakes life of a podracer. Sporting various modes of play, the experience will be introduced to fans at Star Wars Celebration Japan from April 18-20 at the Makuhari Messe Convention Center near Tokyo.

“We’re beyond excited to share an early look at this new experience with the incredible Star Wars community at Star Wars Celebration this year,” says director Jose Perez III. “Our goal at ILM has always been to find new and exciting ways for players to experience Star Wars stories. Focusing on mixed reality has opened several fascinating doors from an immersion standpoint and allows us to literally bring a galaxy far, far away right into the comfort of players’ homes in a way that’s unlike anything we’ve done before.”

Star Wars: Beyond Victory is the latest initiative in ILM’s continued efforts to fully integrate immersive storytelling and interactive experiences across the entire company.

Fans attending Star Wars Celebration will find the ILM/Meta activation at Hall 4, Booth #20-5. Along with an introduction to Beyond Victory, they can pick up an exclusive giveaway Marvel comic of the same name. The prequel story to the mixed reality playset is written by Ethan Sacks with cover art (pictured below) by Phil Noto and interior illustrations by Will Sliney, Steven Cummings, and Shogo Aoki.

The Marvel comic book cover for Star Wars: Beyond Victory, featuring new podracer characters.

To learn more about Star Wars: Beyond Victory – A Mixed Reality Playset, visit StarWars.com, and for the latest about ILM’s work in immersive entertainment, visit ILM.com/Immersive.

With ILM as a lead contributor, the Disney+ series took home the award for Outstanding Visual Effects for a Live Action Program at the 3rd Annual Children’s & Family Emmy Awards.

The logo of the 3rd Annual Children's and Family Emmy Awards.

Based on the popular books by Rick Riordan, the Disney+ series Percy Jackson and the Olympians earned eight wins on 16 nominations at the 3rd Annual Children’s & Family Emmy Awards in Los Angeles on March 15. Among them was “Outstanding Visual Effects for a Live Action Program,” for which Industrial Light & Magic was a lead contributor.

ILM’s Emmy winners include visual effects supervisors Jose Burgos and Jeff White, visual effects producer Katherine Chambers, executive visual effects producer Adele Jones-Venables, virtual production supervisor Sonia Contreras, associate visual effects supervisors Donny Rausch and Daniel Schmid, and associate visual effects producer Shawn Smolensky.

Percy Jackson’s senior visual effects supervisor Erik Henry accepted the award on behalf of the visual effects team, thanking ILM and other contributing effects houses MPC, Hybride, and Raynault. The 3rd Annual Children’s & Family Emmy Awards ceremony is available to stream from the National Academy of Television Arts & Sciences.

Congratulations to our ILM Emmy winners! Watch the trailer for Percy Jackson and the Olympians:

Industrial Light & Magic’s immersive entertainment team will be fully integrated with the rest of the company to inspire new innovations in cross-platform storytelling.

50 years since its founding, Industrial Light & Magic has never rested on its laurels. A hallmark of ILM’s endurance over half a century and counting has been its knack for adapting to change and embracing new creative opportunities. “ILM was created by George Lucas because there was no other way for him to realize his vision for Star Wars,” senior vice president and general manager Janet Lewin tells ILM.com. “From the beginning, our mission has been to make the impossible a reality.”

The ILM spirit that’s evolved over five decades and at studios in as many different countries is “motivated by the opportunity for reinvention, evolution, innovation, and resilience,” as Lewin puts it. ILM’s ability to “react and adapt to the changing dynamics” of the industry has been key. Time and again, ILM has broadened its creative output. “We’re known for our work-for-hire visual effects in feature films,” says Lewin, “but we’ve also branched out into streaming series, feature animation, and of course, the incredible work that Vicki Dobbs Beck has championed with immersive storytelling.”

“ILM was created by George Lucas because there was no other way for him to realize his vision for Star Wars. From the beginning, our mission has been to make the impossible a reality.” -Janet Lewin, Senior Vice President & General Manager, ILM

As vice president of immersive content for ILM and Lucasfilm, Beck co-founded ILMxLAB (later ILM Immersive) some 10 years ago. What was initially a move to experiment with storytelling in the emerging fields of virtual reality, augmented reality, and mixed reality has since yielded broader implications for the way ILM will do business. “This opportunity allowed us to participate directly in the success of a project and drive these experiences from concept to launch, delivery, and support,” notes Beck.

Building on work first pioneered by Lucasfilm’s Advanced Development Group, the immersive team leveraged the highest quality visuals and sounds combined with meaningful interactive principles to create stories with groundbreaking potential. These have included productions like PGA Innovation Award winner Vader Immortal: A Star Wars VR Series and Emmy-winning “What If…? – An Immersive Story,” as well as projects made with creative partners like Alejandro González Iñárritu’s Academy Award®-winning CARNE y ARENA. 

“We see opportunities for social experiences that are associated with our stories,” explains Beck. “We started by inviting audiences to ‘Step Inside Our Stories’ in ways they’d thought were impossible. We’re now transitioning from storytelling to storyliving, which is a much bigger idea. You’re in a world and you’re making meaningful choices that drive the narrative forward. That’s the gateway to take advantage of new technologies that are coming on the scene.”

ILM is now “seizing the moment,” as Lewin puts it, to include all of ILM in this undertaking. “Immersive storytelling is becoming more and more relevant to our audiences and our partners. On the visual effects side, ILM has been involved in projects like ABBA Voyage – a first-of-its-kind – and the content we made for Sphere Las Vegas. We started to see that the projects coming out of our immersive line of business had a natural convergence of techniques, talent, and opportunity with those of our visual effects business. 

“We’re now transitioning from storytelling to storyliving, which is a much bigger idea. You’re in a world and you’re making meaningful choices that drive the narrative forward. That’s the gateway to take advantage of new technologies that are coming on the scene.” -Vicki Dobbs Beck, Vice President, Immersive Content, ILM & Lucasfilm

“We can proactively leverage the strengths of our visual effects artists, pipeline, and storytelling passion with those of our immersive artists who are true experts in interactivity,” Lewin continues. “We see this ‘storyliving’ concept as the key growth opportunity. Not only do we want to market ourselves as one brand for audiences and clients, but we also want to empower our artists. This will allow for more cross-pollination of techniques, more opportunities for artists to move between types of projects, whether it’s an animated feature or our collaboration, “What If…? – An Immersive Story,” with Marvel Studios. If we can provide those opportunities, it allows us to keep attracting the very best talent in the industry.”

With every group now aligned under the ILM brand, the company will pursue an integrated portfolio that includes film, television, attractions, interactive products, and live events. For the immersive team, it’s a milestone following a decade of concerted effort, and for ILM as a whole, it’s the newest chapter in the company’s pioneering story. “We are poised for the next 50 years of ILM’s existence,” says Beck. 

ILM’s position as a storied entity with a globe-spanning team of artists, designers, and engineers opens up limitless possibilities. “The world is our oyster,” as Lewin puts it. “This is a time for growth and expansion. I’m really excited about the ideas that we’re exploring. This is a moment when we can redefine who ILM is in the market, be more consumer-facing, and continue to be the industry leader. I also love the idea of having more efficiency and refining a better process. These moments allow us to examine the way we work and improve it. We can bring fresh, new ideas to the table.”

And as Beck describes, this is not only an opportunity to position ILM as the best creative partners and visual storytellers, but also as “aspirational leaders” who will inspire the next generation of storytellers. “This is a way for ILM to drive its destiny in a way that has not been possible in the past. To embrace cross-platform storytelling is what we are uniquely positioned to do. If we can take advantage of that capability and build ecosystems of experiences that cross different types of media, it gives us an incredible canvas to paint on.”

“This is a time for growth and expansion…. This is a moment when we can redefine who ILM is in the market, be more consumer-facing, and continue to be the industry leader.” -Janet Lewin, Senior Vice President & General Manager, ILM



To discover more about ILM’s work in immersive storytelling, visit ILM.com/Immersive. And for all the latest news and stories from the company, visit the ILM.com Newsroom.

For over two decades, Wicked has transported theatre audiences into the untold story behind The Wizard of Oz (1939). Bringing its magic to the screen required a spectacle that not only honoured its Broadway origins but expanded beyond them. Among the many key collaborators in this transformation was Industrial Light & Magic’s Pablo Helman, production visual effects supervisor. In a recent conversation, Helman shares with ILM.com the challenges and triumphs of adapting this theatrical phenomenon for film, seamlessly blending practical techniques with cutting-edge visual effects to create an enchanting cinematic experience.

By Jamie Benning

(Credit: Universal)

The Musical Challenge

Helman admits that working on Wicked (2024) was an entirely new experience for him compared to his experiences on films like the Star Wars prequels (1999-2005), The Irishman (2019) and The Fabelmans (2022). “I think I was ignorant, in that I thought for the last 30 years that my job started with the images and ended with the images,” he tells ILM.com. “Normally music is something that happens later on. But with this movie, not only is there pre-recorded material but there is live singing. And there’s connections that are being made between the actors while they sing. Things that change between them that makes them elicit other reactions. And you’re there three feet from the action with the music happening. That translated into how we approached the visual effects. It makes all the difference.”

But understanding how to integrate those effects meant first understanding director Jon M. Chu’s vision.

Getting into Jon M. Chu’s Head

Helman is known for his thorough preparation when working with directors, and his collaboration with Jon M. Chu on Wicked was no exception. “It’s my job to get into the director’s head,” he says. “When I first interviewed with him just to see how we would click, I did a lot of research on him as a filmmaker. How does he use the cameras, camera movement, lighting, sequencing, editing—all of it. So we basically had the same language because I kind of cheated a little bit. I made it my business to understand how he goes about his process.”

He explains that Chu’s methodology is unique. “He has an incredible vision for the movie, and then he’s open enough to let the movie happen to him. Sometimes the movie develops in a way that is unexpected, and it grows in a specific way or something happens, and then he says, ‘I never thought of it this way. Look, that’s great. We have options.’

“The funny thing about it is that once the project concludes,” Helman continues, “a little bit of their filmmaking stays with me as a kind of a tool set of things, that once in a while I pull out, and that helps me with something else. Every director is different. Everybody has a different process of understanding the storytelling.”

Unlike directors who rely heavily on previsualization, Chu prefers a more fluid, organic approach, embracing spontaneity throughout the filmmaking process. “He doesn’t use pre-vis the way some directors do,” Helman says. “We’d sit at a table with the heads of department, with Alice Brooks (Director of Photography), Myron Kerstein (Editor), with Nathan Crowley (Production Designer), Paul Corbould (Special Effects Supervisor), Jo McLaren (Stunt Coordinator) with a model of the set for that scene. And then we give him a little stick with a little Elphaba and other characters and he goes in and shows us what the scene is about. And when there is music, he plays the music from his phone. Then he does the movement of the actors with Chris Scott, the choreographer. And so we video those sessions and then we all go away and try to figure out how we’re going to achieve Jon’s vision—between all of us.

“I did a lot of listening, because I hadn’t worked with Jon before,” Helman continues. “And Jon and Alice and Myron kind of grew up together from school, and they have been working together. And so for me to come in, it’s like, are they going to let me in? And they opened their arms and let me in. And it was a wonderful experience.”

Building Oz: The Role of Practical Sets and the Practicalities of Shooting a Musical

Nathan Crowley’s elaborate practical sets played a crucial role in grounding Wicked’s fantastical world. Helman reflects on how these sets benefited both the cast and the visual effects team. “You want to chase the truth as much as possible,” he says. “Yes, there were nine million tulips planted, but they were planted a hundred miles from the set. But there are benefits.”

He elaborates on the process of blending the practical and digital elements. “There’s a shot in the beginning, of kids running through the tulips towards Munchkinland, and the matte line is around the kids. You know, after that, it’s all visual effects. Is it useful? Yes, for the actors, they have something there. But we changed the lighting, added the sun, and completed the tulips in post. The final look is a collaboration.”

The barley fields posed another challenge. “We planted real barley, but during the first take, you couldn’t run through it—it was too dense. We had to shave it down and then digitally replace everything to maintain the illusion.”

Helman explains that the scale of the production meant nearly every frame of the film required some level of visual effects intervention. “There are 2,200 visual effects shots in the movie. So every shot is a visual effects shot. Because this is a musical, all the actors are wearing really big earpieces that had to be replaced in 3D. There were also mics on their chests.”

The scale of the sets also dictated when practical elements could give way to digital enhancements. “The interior sets go up to 25 feet and the exterior sets go up to 55 feet and then after that we take over as visual effects,” says Helman. “Special effects were really big too. Paul Corbould and his team built a huge train. But the gears were not moving. So that’s where visual effects lends a hand. And the gears under the train are visual effects. And the inside of the train is visual effects because there was a small section built, but not the whole thing.

“And then the train was very reflective,” Helman continues. “So if the camera follows it, then you have the reflection of all the lights and everything else that had to be recreated and painted out. So yes, there is a combination of reality and not reality. It’s a realization that we are creating an illusion, all of us. And we all contribute little by little to that illusion. And then in post, we put it all together and complete it.”

Striving for Authenticity: Cynthia Erivo’s Green Transformation

Helman and his team explored multiple approaches to achieving Elphaba’s distinctive green skin, testing a range of methods to determine the best solution. “Yes, we did a lot of testing,” he recalls. “We did different tests of what would happen if we used green makeup, what would happen if she didn’t have makeup, but we were there to fix everything that couldn’t be done.”

Ultimately, it was actor Cynthia Erivo herself who made the final decision. “Cynthia said, ‘I need to be green. I think I need to be that person,’” Helman explains. “And I know it’s three hours in the chair, but I need to put in that time to become that character. And it made a difference, I think.”

Even with practical makeup, the visual effects team played a crucial role in refining the look throughout the film. “We still have visual effects in every shot,” Helman says, citing the long shooting days, the strain of makeup on Cynthia’s skin, and even the challenges of contact lenses. “She had contacts. And I knew from other experiences like The Irishman that after a while the contacts start moving and the actor starts looking cross-eyed. So we had to fix all kinds of things.”

Additionally, subtle digital enhancements were required for continuity. “The makeup went to the middle of the lip, but not into her mouth, for obvious reasons. So, as I said, we had to adjust every shot,” he adds.

Flying Monkeys and Magical Transformations

One of the many visually striking sequences in Wicked is the transformation of the flying monkeys. Helman describes the scene as both challenging and rewarding. “It’s almost a horror scene. The monkeys are in pain, their wings breaking through their backs. It’s unnatural, which adds to the horror,” he explains. “We used feathers flying around to give a sense of atmosphere and depth. The horror of it had to be mitigated somehow. So there were times when we went too far. There were times when we didn’t go far enough. And then we all kind of adjusted.”

Helman emphasizes the importance of storytelling in visual effects. “There’s the fact that these monkeys need to fly away in like four shots. So how do you tell that story in its specifics in four shots? They need to get the wings, try them, and then be either successful or not. And so all that stuff is storytelling. It’s part of what we do in visual effects. The animation team led by David Shirk did a great job.”

(Credit: Universal)

Grounded Magic

Magic is, of course, central to Wicked, and Helman’s team took a deliberately subtle approach to its visualization. The Grimmerie posed unique challenges. “It wasn’t really thought out when we were shooting,” he admits. “Most of the time, Cynthia was in front of a blue square, gesturing as directed. But we ultimately made it so that the words became golden, with pages moving. It feels tactile and grounded, not over-the-top. We weren’t going for that kind of fantastical thing, because it’s been done before. Even if we were doing a visual effect, it had to look practical.”

Elphaba’s imperfect spellcasting in this scene also adds another layer to her character. “Due to her inexperience, she’s not very good at casting spells. Every time she does, something bad happens,” Helman says. “It’s relatable—magic grounded in imperfection, just like life.”

Defying Gravity: A Pivotal Sequence

The “Defying Gravity” sequence is a pivotal scene bridging the two films, requiring a seamless blend of practical stunts and digital effects. “Cynthia performed many of her own stunts, including being flown on wires and complex rigs,” says Helman. “We’ve seen people flying before. You could just have somebody being wired in and you can say to that person, now you’re moving right, now you’re moving left, left to right and right to left. But those kinds of things don’t work. Cynthia was being flown 200 feet around the blue-screen set, singing! She is really trying to keep herself from the forces that are trying to throw her in different directions. And you can really see that she’s doing it. That contributes to the reality of the visual effects work we do.”

Helman also highlights Elphaba’s emotional arc in her final scene through the use of light and symbolism. “You start from the bottom in the darkness towards the light and you go out on the balcony towards the sun, and then the sun starts coming down throughout that sequence towards the end of the movie. If you have seen the play, you know that at the end of the first act, the cape gets bigger and bigger. So the question is, how do we translate that? Do we do it? Is it going to be laughable? The cape is a visual effect, because we couldn’t use a real one due to the wires. Throughout that sequence, everything becomes pictorial. And by the time we get to that shot, basically, it’s a spiritual, religious picture. The clouds are very Renaissance Italian, with the sun behind them and there’s all kinds of volume shadows and volume light coming through. And then all of a sudden you realize, oh my goodness, the cape is huge. What happened? Are you inside her mind, or is that a literal thing? Probably not. And then she does the war cry and the camera pulls back out and you think the movie ends, but she turns around and goes away flying. And then the audience is thinking, ‘wait, wait, where are you going?’ Then the movie ends to get them ready for part two.

“But all those kinds of things are not by coincidence,” Helman adds. “They’re each thought out in terms of structural storytelling, building expectations.”

The Collaborative Spirit

The scale of Wicked was immense, involving contributions from more than 1,000 visual effects artists across five countries: ILM in San Francisco and Sydney, as well as teams at Framestore, OPSIS, Lola, Outpost, and TPO. Helman is quick to credit the teamwork behind the film’s ambitious visual effects. “We’re working together for three years to make these movies. And so I’m really grateful to all of them. Robert Weaver and Anthony Smith were the ILM visual effects supervisors, and David Shirk was the animation supervisor. Great collaboration and lots of fun.”

Helman is philosophical about the creative challenge. “On set sometimes you get into some arguments or differences. Or as Jon calls them, ‘offerings.’ Sometimes you say, ‘I’m offering you this solution, or you can go this way or we can go another way.’”

This cooperative effort was essential on a production as challenging as Wicked. “It’s 2,200 visual effects shots, but every department played a role in making the world of Wicked believable,” Helman explains. He highlights the importance of working closely with Nathan Crowley, Alice Brooks, Paul Corbould, and the rest of the team.

“Alice, Jon, and I talked a lot about it,” Helman says. He describes how lighting played a crucial role in integrating visual effects with the cinematography. “The lights were on the set, but we removed them. If you look at a movie that was shot in the ‘50s, there’s a certain look to it, but you have to achieve a certain look from behind the camera. But that’s not so anymore. You can put light sources wherever you want. And if you’re careful with them, when you remove them, there is no such thing as unjustified lighting.”

By ensuring that visual effects supported rather than dictated the cinematography, the team was able to create a seamless blend of practical and digital elements.

A Lesson in Artistry

For Helman, Wicked reinforced his philosophy that “visual effects shouldn’t be impeding anything. Whatever the director wants to do, wherever they want to put the camera—that’s what we’re there for, to encourage that kind of storytelling.”

The grueling 155-day shoot, filmed in continuity across both parts, pushed the cast and crew to their limits. Helman acknowledges the toll such a long production can take: “After day 70, it’s like everybody’s done. It’s like, elbows are out—‘Get out of my way, why are you looking at me like that?’ Those kinds of things happen.” But despite the fatigue, the shared vision kept the team pushing forward. “It is a long project, but it’s a good thing because it gives you kind of a sense of not worrying about anything else, but what you have in front of you.”

The audience’s response helped reaffirm the purpose behind the work. “It’s one of those pictures that I had to go to the theater to hear the people’s reactions. I usually don’t do that. But this one I did, and it reminded me of why we do what we do, which is to make art that is being shared.”

Reflecting on the experience, Helman expresses gratitude for the people who made it possible. “You can have a great project, great people, or great financial satisfaction—if you’re lucky, you get two out of three. But the most rewarding part is the collaboration. At the end of the day, it’s about the people you work with.”

As Wicked continues to enchant audiences worldwide, Industrial Light & Magic’s artistry stands as a testament to the power of collaboration and innovation in storytelling.

Learn more about ILM and Wicked on Lighter Darker: The ILM Podcast.

Jamie Benning is a filmmaker, author, podcaster and lifelong fan of sci-fi and fantasy movies. Visit Filmumentaries.com and listen to The Filmumentaries Podcast for twice-monthly interviews with behind-the-scenes artists. Find Jamie on X @jamieswb and as @filmumentaries on Threads, Instagram, Bluesky and Facebook.


ILM visual effects supervisors are honored in the Special Visual Effects Category.

Today, BAFTA announced their 2025 nominees, with two Industrial Light & Magic productions receiving nominations in the Special Visual Effects category. Gladiator II and Wicked were each nominated alongside Better Man, Dune: Part Two, and Kingdom of the Planet of the Apes.

Congratulations to production visual effects supervisor Mark Bakowski and ILM visual effects supervisor Pietro Ponti on their nomination for Gladiator II, and to production visual effects supervisor Pablo Helman and ILM visual effects supervisor Anthony Smith for Wicked. And congratulations to everyone at ILM who contributed to these incredible films.

The EE BAFTA Film Awards ceremony will be held at the Southbank Centre’s Royal Festival Hall in central London, and broadcast on Sunday 16 February 2025. Click here for a complete list of the BAFTA nominations.

The ILM Vancouver artist details her globe-trotting career path from special make-up effects to art direction to effects supervision.

By Lucas O. Seastrom


For decades, a significant aspect of Industrial Light & Magic’s company culture has been defined by the atmosphere in dailies. These routine sessions where the effects team reviews work-in-progress and provides feedback are common across the industry, but ILM has always prided itself on its distinct style that encourages open and equal communication. Tania Richard had spent some 15 years working in visual effects before she joined ILM in 2018 as an art director at the Vancouver studio. And as she puts it, “ILM’s collaborative dynamic really shines in dailies.”

While working on Space Jam: A New Legacy (2021), Richard was at first surprised when visual effects supervisor Grady Cofer would call on her in dailies, seemingly at random. “Grady wouldn’t hesitate to call my name out and ask me what I thought about something, even if it wasn’t something I was working directly on,” Richard explains. “He valued everyone’s opinion, and made you feel part of the overall process. Earlier in my career at other studios, dailies was pretty quiet and you didn’t speak up very often. Everyone has their own way of approaching things in dailies, but at ILM it’s always with the intent of creating a collaborative experience.”

As ILM has continued its global expansion – which now includes studios in Vancouver, London, Sydney, and Mumbai, in addition to its San Francisco headquarters – seasoned professionals from across the effects industry have joined the ranks. Each brings their unique experience working on diverse projects and often in many different types of roles. Richard is no different. 

Growing up in Sarnia on the southern border of Ontario, Canada, Richard had what she describes as a creative upbringing. Both of her parents had their own artistic pursuits, and her mother in particular encouraged Richard and her brother (now a storyboard artist) to make careers out of their passions. Though she had aspired to work in filmmaking since high school, Richard chose to study traditional fine art at McMaster University southwest of Toronto. “But I was lucky in that the university also had film theory courses,” she notes, “so I studied film theory as well as fine art.”

With this unusual blend of disciplines, Richard was able to both learn academic theories and create artworks that attempted to realize them in aesthetic form. She studied sculpture, drawing, print-making, art history, and painting, as well as film theory. Her fascination with the concept of film spectatorship inspired her to focus on painting. “There was a film theorist, Laura Mulvey, who talked a lot about the male gaze in spectatorship,” Richard explains. “I studied her a lot, as well as Cindy Sherman, who would often photograph herself in these film-looking environments and settings. I ended up doing something similar where I’d start by creating these film stills, photographing myself dressed up in various situations, and using that as reference for my paintings.”

To this day, Richard is fascinated by the intersections of artistic craft and theory, in particular the way that filmmakers code their works. “It can almost be a language, a communication between the filmmaker and the audience,” she says. “Somebody like [Andrei] Tarkovsky puts these little codes throughout his filmmaking, whether it’s sound like dripping water or a cuckoo, or a visual like apples. They were all meaningful to him on a personal level. You see and hear these codes throughout all of his films, and if you were familiar enough with them, it was almost as if he was talking to you in a way, on another level.” 

At ILM, Richard has worked with director Shannon Tindle on both Lost Ollie (2022) and Ultraman: Rising (2024), and she describes the filmmaker along similar lines. “He’ll reference the same films in his creative process, like Kramer vs. Kramer [1979], for example. He loves that film, and I’m aware of that because I’ve worked with him long enough and had enough discussions with him to know that when I see something in the way a frame is composed or an animation performance in one of his films, I can understand where his influence is coming from. It’s special. It makes you feel like you’re connecting with the filmmaker on another level.”

As she finished her undergraduate studies, Richard jumped into work at Toronto-based FXSMITH, a special effects company founded by innovative makeup designer Gordon Smith. Initially thinking she’d be working on a local television show, Richard soon discovered their team’s assignment was the feature film X-Men (2000). Initially, Smith had his new hire drawing concepts for characters requiring prosthetics, and as production commenced, Richard was part of the on-set team creating the extensive make-up for Rebecca Romijn as Mystique. 

“It was a great experience and I had my foot in the door,” says Richard. “But this was back around 1999, and the transition from practical effects to computer effects was happening. For X-Men, we worked closely with the visual effects team on set because they had to pick up a lot of our work in post-production and refine it. In talking to some of the crew there, they encouraged me to move into visual effects.”

Concept art by Richard for Mystique (Rebecca Romijn) in X-Men (2000) from 20th Century Studios (Credit: Tania Richard).

Richard’s brother was then studying classical animation at Toronto’s Sheridan College, a school that had graduated a number of artists later hired by ILM. “If the Sheridan opportunity hadn’t worked out, I might’ve gone for a PhD in film theory,” Richard notes. Joining the school’s postgraduate visual effects program, her main professor was Richard Cohen, recently returned from a stint at ILM as a CG artist on Terminator 2: Judgment Day (1991) and Death Becomes Her (1992).

“There were about 12 of us in the class, and Richard [Cohen] felt that rather than having us all isolated and doing our own thing, we should make a short film together,” says Richard. “If I had not done that, I might’ve focused more on the animation side. But on the group project, we leaned into each other’s strengths, and because I had a painting background, it was clear that I was the concept artist, matte painter, and designer on the team. I did do some animation, but I learned that it wasn’t my strength.” She adds that although she intended to create traditional matte paintings for their film (ultimately titled The Artist of the Beautiful), Cohen urged her to learn Photoshop and embrace the emerging computer-based tools.

As she finished her studies at Sheridan, Richard had already begun professional work, initially as a concept designer for 2003’s Blizzard under production designer Tamara Deverell. She then became a digital matte painter at Toybox, a local effects house that was soon acquired by Technicolor. Eventually, a former colleague invited her to come to Sydney, Australia where Animal Logic was developing the animated feature Happy Feet (2006). “I was young and up for the big move, so I said yes,” Richard comments. “That was back when ‘2 ½ D’ projections were the thing, so I did a number of those mattes on that feature.”

During this period, Richard encountered a number of important mentors, among whom was the late visual effects producer Diana Giorgiutti, with whom Richard served as a concept artist on Baz Luhrmann’s Australia (2008). “We were on location in Darwin and Bowen for something like seven to nine weeks,” Richard explains. “Di had me working directly with [production designer] Catherine Martin. She had me sitting with editor Dody Dorn for a week. Dody had cut Memento [2000]. We were together early on when she had voice recordings of the actors reading the script and she wanted some images to cut in with them. I’d be mocking up frames for her and she explained to me the compositions they needed. She was really generous with her time.”

Soon, London-based Double Negative came calling, and Richard spent nearly a decade in the United Kingdom working on everything from Harry Potter and the Deathly Hallows: Part 2 (2011) to Interstellar (2014). As visual effects art director on Fantastic Beasts and Where to Find Them (2016), she again found an important mentor in production designer Stuart Craig, who’d overseen the visual development of the entire Potter franchise. After creating elevation and sectional drawings for sets, Craig tasked Richard with building digital mock-ups, and together they’d determine the preferred camera angles for which Richard then created detailed concepts.

“Stuart had worked with set designer Stephanie McMillan for many years,” says Richard, “and they would often go onto set together and shoot the space in black and white. That helped them analyze the composition before they started adding color and texture, which only came after they were happy with the black and white composition. When I built my models, I rendered them in black and white as well, so I was approaching it instinctively in a similar way. Stuart loved it and helped me understand why it was a good approach. Rather than going full-tilt and adding lots of texture and detail right from the beginning, you start to learn that actually you might never see a particular area because of the way it’s being lit, or something like that. You learn to focus in an efficient way on where to add that structural detail, where to hit the image with color to have the most impact. It was a brilliant lesson from Stuart.”


A return to Animal Logic for 2018’s Peter Rabbit was Richard’s ultimate springboard to ILM. With the opportunity to work closely with director Will Gluck, visual effects supervisor Will Reichelt, and associate visual effects supervisor Matt Middleton (both of whom are now with ILM), she came to realize that effects supervision was her chosen path. “Will [Reichelt] had me run lighting dailies and look after the assets while he was busy on set,” Richard explains. “I was also really involved in the DI process and had a team of artists who I delegated a lot of design work to, so in many ways, it felt like a natural transition.”

In early 2018, the ILM Art Department’s creative director David Nakabayashi and senior producer Jennifer Coronado convinced Richard to make another move, this time back to her native Canada to work at ILM’s Vancouver studio. It was a significant decision, as Richard was then considering a move to New Zealand for a brief respite from active work. But the opportunity to join ILM was too important to pass up. 

“ILM was the pinnacle,” Richard says frankly. “For anybody who is around my age and grew up with Star Wars, you see ILM as the height of where you want to be in the industry. But I wasn’t sure I had what it took to be a part of the company, so it was a surprise when they reached out. I barely took any time off between working on Peter Rabbit and coming to ILM.”

Initially working as an art director, Richard describes her first impressions of ILM as “overwhelming, exciting, and different.” After assisting Vancouver’s creative director Jeff White on some initial project bids, she was soon working on Disney’s Aladdin (2019). “The ILM Art Department is incredibly talented and is really the best of the best,” Richard notes. “There’s so much you can learn from them.” She continued as an art director on Space Jam: A New Legacy, for which ILM was responsible for integrating the classic Warner Brothers animated characters with live action footage. 

“There was a lot of artwork created at the beginning of Space Jam,” Richard explains. “The spirit of it evolved quite a lot over the course of the show. I had a wonderful team, and I really loved working on Bugs Bunny! [laughs] Grady Cofer had me doing paint-overs on some of the characters, which I really enjoyed. The whole team was involved in refining the final looks of each character, including the textures crew, the groom artists, the modeling team, and the animators. I’m always blown away when I see animation come through.”

It was after Space Jam that Richard made the transition to associate visual effects supervisor on Lost Ollie. “I’m a bit like the righthand person or wingman for the visual effects supervisor,” she elaborates. “We work very closely with production and our department leads and supes to help establish looks, refine shots, and execute what needs to be done in post to maintain a certain level of quality and consistency. I had been slowly navigating into an effects supervisor-type role for a while, but I wasn’t sure if I had all the skillsets to be able to do it. I talked to Jenn and Nak about it, and they were very supportive and helped to guide me into this position along with Jeff White and [executive in charge] Spencer Kent.

Lost Ollie (Credit: Netflix).

“I think I just got really lucky,” Richard continues. “I believe that Jeff had Ultraman in mind for me, but it wasn’t quite ready yet. [Visual effects supervisor] Hayden [Jones] and [visual effects executive producer] Stefan [Drury] were working with Shannon Tindle on Lost Ollie, so I had a chance to establish a relationship with the same client. I think that’s why they thought it might be a good starting point for me. It was a smaller project, and I love the hybrid between live action and CG characters. It’s probably what I’m best at and what I love to do the most. I ended up diving in heavily on two episodes, and then I stayed in the background on the final two because that was when I started transitioning to Ultraman: Rising.”

The move into supervision has allowed Richard to focus more on refining her approach to communication and collaboration between the artists and the clients. “On Ultraman, Hayden was great at encouraging the team to ask questions and offer up suggestions with Shannon,” she notes. “What’s great about Shannon is that he creates an environment where it’s okay to suggest something that might not ultimately be the right idea, but it’s great to put it out there and see if it works. [ILM executive creative director] John Knoll is very similar. He embraces that exploration and isn’t afraid to try something.”

Richard emphasizes that “part of being a supervisor is having an ability to read the room and understand the personalities of the artists and how they like to communicate.” And as an artist herself, Richard brings her own unique blend of experiences. “I’ve been lucky to have had a toe in the practical side of things very early on. I’ve also worked with some really talented people who come from an earlier generation of filmmakers. I hope that some of that knowledge translates in my communication with the artists. Both Grady and Hayden like to do quick paint-overs on things in dailies, and that’s something I like to do as well. If words don’t quite explain something, sometimes a quick drawing or paint-over can act as a visual reference. Many supes like to do that.”

As so many have attested, it’s the people that have truly made the difference at ILM in its 50 years of storytelling. “Have curiosity about the people you’re working with,” Richard says, “and have empathy for them. Try to understand where your colleagues may be at a certain point in time. You can use that to develop relationships throughout your career, which is so important.”

Ultraman: Rising (Credit: Netflix).

Read more about Richard’s work on Ultraman: Rising here on ILM.com.

Lucas O. Seastrom is the editor of ILM.com and a contributing writer & historian for Lucasfilm.



ILM teams from around the world earn recognition for projects as diverse as Wicked, Gladiator II, Ultraman: Rising, Deadpool & Wolverine, and What If…? – An Immersive Story.

Today, the Visual Effects Society announced their nominations for the 23rd Annual VES Awards, recognizing visual effects artistry and innovation in features, animation, television, commercials, games, and new media. Together, ILM and ILM Immersive received 20 nominations in total.

Nominations in the overall film and television categories include Outstanding Visual Effects In A Photoreal Feature for Twisters, Outstanding Visual Effects In an Animated Feature for Transformers One and Ultraman: Rising, and Outstanding Visual Effects In A Photoreal Episode for Star Wars: Skeleton Crew and The Lord of the Rings: The Rings of Power (Season 2). Additionally, Blitz was nominated for Outstanding Supporting Visual Effects in a Photoreal Feature.

In the Outstanding Visual Effects in a Real-Time Project category, ILM Immersive received a nomination for What If…? – An Immersive Story and the D23 Real-Time Rocket.

ILM has received nominations in many other categories including Outstanding Environment in a Photoreal Feature for Rome in Gladiator II and the Emerald City in Wicked, as well as Outstanding Environment in an Animated Feature for Transformers One’s Iacon City. Alien: Romulus, Deadpool & Wolverine, and Gladiator II have each picked up nominations for Outstanding Model in a Photoreal or Animated Project, while Venom: The Last Dance joins Twisters with nominations for Outstanding Effects Simulations in a Photoreal Feature.

A complete list of all of the VES nominations may be viewed at this link. The VES Awards will be held on February 11, 2025, at The Beverly Hilton Hotel in Los Angeles. Congratulations to our ILM and ILM Immersive teams!

The visual effects supervisor talks Cassandra Nova, Gambit, and more.

(Credit: Disney)

In a surprise twist midway through Deadpool & Wolverine, our titular protagonists are marooned in the Void: a Mad Max-like wasteland of desert and forgotten heroes. Their time in this multiverse purgatory takes up a significant chunk of the movie and features many of its greatest moments, from surprise character appearances to action set pieces, and Industrial Light & Magic was charged with bringing it all to the screen. Just from reading the script, visual effects supervisor Vincent Papaix knew that this section of Deadpool & Wolverine would be key to the movie’s overall success.

“Everybody was super motivated,” he tells ILM.com, “and we all knew that this movie was going to be special.”

Deadpool & Wolverine went on to become the biggest R-rated movie of all time following its theatrical release this summer, a true cinematic event during a challenging time for the film industry. To mark its arrival on Disney+, ILM.com caught up with Papaix to discuss how ILM realized some of the blockbuster’s impressive visual effects in the Void sequences. Grab a chimichanga and join us.

(Credit: Disney)

A sunny day in the Void

As fantastical as the Void may sound, director Shawn Levy and star/producer Ryan Reynolds (Deadpool) aimed to make it a believable setting and something that fans could relate to. “They really wanted it to feel as grounded and real as possible,” Papaix says. To achieve this, the filmmakers started the old-fashioned way, more or less.

“One thing for them was to shoot in natural conditions. That’s why most of the shoot was outdoors,” Papaix says. “So they shot in a landfill in the UK and in various locations in the UK to get that kind of natural light feel.”

This did create certain challenges, however. “You don’t control the elements,” Papaix says simply.

When Deadpool and Wolverine (Hugh Jackman) first arrive in the Void, they quickly proceed to beat the tar out of each other. This fight was shot outdoors in summer 2023 and then, due to the writers’ and actors’ strikes, finished in winter 2024. As a result of the pause and change in seasons, the color of the sky appeared slightly different. Though Levy and Reynolds initially hoped to digitally correct any inconsistencies in the look of the sky, ILM encouraged the filmmakers to keep this to a minimum. “One thing that was great about working with Shawn and Ryan and [Marvel visual effects supervisor] Swen Gillberg was that they are very collaborative,” Papaix says. “We did a few shots and a few tests and we realized the best outcome was to embrace the plate. So, based on the plate, if it’s sunny, let’s try to augment that. If it’s stormy, let’s try to be more stormy and then we’ll look at how it plays in the cut. And every shot was kind of hard to direct in that way. It’s making sure that it plays nicely, but if you look at the sequence, there’s some variation as you would have in a natural daylight. You can be in an area and within 10 minutes it can be from sunny to cloudy to stormy, depending on what is happening. So we focused on making it look as real as you can.”

Finally, to increase the grandeur of the Void, from scale to background elements, ILM came in to digitally augment what was captured in-camera.

“Our work was focusing on creating a seamless transition from the foreground set to a CG extension of the Void,” Papaix says. “Overseeing adding everything that was needed to the Void, including the detritus. There are all those different objects scattered throughout the Void. So obviously [we were] making sure they integrated, but the Void needed to feel real, and not feel like the foreground was on a stage in bluescreen extended into a CG world.”


A new villain emerges

The evil twin of Charles Xavier, Cassandra Nova (Emma Corrin) debuts in Deadpool & Wolverine as the ultimate authority in the Void. Unpredictable and hugely powerful, she’s a frightening villain that Wolverine and Deadpool must overcome. With guidance from Levy and Reynolds, ILM set out to illustrate her abilities in a subdued but unnerving way.

“She can control a lot of things with her mind,” Papaix says. “They wanted something fairly subtle to not overpower what was the power. It was important to show what it was doing to the people and not too much to [show] the power itself, not too much magic or anything. So it was more a subtle distortion to explain that there’s something happening.”

And what Cassandra does is indeed creepy: She seems to have a predilection for passing her fingers through the skulls of her enemies, including the Merc with a Mouth.

“We went through different aspects, from being creepy and caressing his face with almost spider-like fingers. All that was digital and a very complex simulation to kind of deform the masks in CG. What gets tricky is that it’s easy to do a collision, but we had to do a half collision and half penetration going through. So that’s actually a very complex simulation to control. And it was fully art-directed, meaning we had to control every aspect of the effect. We started with the performance of the fingers, and once we had the right emotion, then we worked on the simulation of how the mask should deform and, at the same time, kind of breaks open to let the finger go through.” In the end, Papaix was more than happy with the result.

“I read a lot of great reactions. People felt an itch, a little bit. It feels creepy but in a good way, because that’s definitely what the filmmakers were after.”

(Credit: Disney)

Johnny Storm’s quick exit

Cassandra sends Johnny Storm (Chris Evans reprising his original Marvel role from the Fantastic Four films) to a truly unfortunate demise, ripping off his skin and driving him into the ground. It’s a shocking moment—gruesome with a dark sense of humor—in a movie full of them.

“This was part of the script from day one,” Papaix says. “That was a moment that was very important for the filmmakers.” But where to begin for an effect so unlike anything previously seen in a Marvel movie? “Ripping out the skin was very graphic, so we had to study images.”

ILM turned to Real Bodies: The Exhibition, a long-running museum showcase that features actual human specimens, for reference. It made for a decidedly unique creative process. “The real [body in the exhibit] is very dry and has been preserved. We wanted to make it look fresh, so we had to add a lot of blood and liquids to make sure we felt that this just happened. So we are dripping blood, dripping fat. That was very gross. The daily session with the artists was always interesting.”

Once ILM knew how the effect should look, they began building a digital Johnny.

“The way we proceeded with this was creating an asset,” Papaix explains. “So a skeleton asset, we called it, with all the flesh and all the organs in there. We based everything, all the proportions, on Chris Evans. We have his scan. We created a digital version of Chris for Johnny Storm, even for the Human Torch version when he was on fire.”

Then it was time to get down to the de-skinning business.

“So we started from that and then we ripped off his skin. It’s pretty much what you can imagine, but in CG,” Papaix says with a laugh. “The shell of the clothes and skin were removed, revealing the skeleton with all the flesh. We tried to create some strings of blood coming out of him.” In an effort to maintain the series’ comedic tone, ILM added some elements to hopefully make this scene a little more Looney Tunes and less Hellraiser.

“It was kind of a cartoony moment, but in a good way — he has that moment blinking his eyes, and it’s like, ‘What just happened to me?’ And then he drops.”


Gambit gets his day

The Void segment culminates with Deadpool, Wolvie, and a band of fan-favorite heroes launching a siege against Cassandra and her forces. While fans delighted in seeing each hero back in action, one required visual effects that are essential to the character.

“A lot of attention was put to Gambit [Channing Tatum],” Papaix says. “We studied a lot of the comic books to see what was happening with his cards and [mutant power].” In the comics and iconic X-Men cartoon, Gambit charges playing cards, resulting in a purple glow; when he tosses them, they leave a trail and explode on impact. “We went for a various range of showcasing the power to the point that I remember a version where we probably went too far — too glowy and too flamey-looking. And that’s something that was not pleasing to Shawn, for good reason. He wanted to be grounded, again, to reality. So the cards — it’s the X-Men and all, but it’s important to have the cards telling the story.” 

As a solve, ILM illustrated a slower buildup of Gambit’s mutant power. “We were focusing mostly on the card and the energy within the card. There was a closeup in the cavern, when you see the card activating, and it’s within the pattern of the card. For the battle, we made some trails to be able to see it, because a card is very small. True to the comic.”


A lasting collaboration

Deadpool & Wolverine is a success for Papaix on several levels, from the commercial and critical reception to more personal reasons.

“I had the chance to work on the first Deadpool in 2016. Time flies. So this one already was quite special in my career, and having the opportunity to supervise the third one was also quite special. Knowing that Hugh Jackman was attached as Wolverine, there were so many good things.”

But looking back at the film, he seems to mostly value the time with Levy, Reynolds, and Marvel. “They were great collaborators. Obviously, he’s a director and he makes his call, but he was very keen on hearing people’s suggestions. But the collaboration for me is one of the highlights of the show with Swen and with Marvel, and pitching those ideas to Shawn and Ryan. They also thanked us. We know that’s something that not every filmmaker does, but at the end of the project we got a thank you video from Ryan and Shawn to share with our team at ILM, and it’s always fun to see that they appreciate the work. Obviously, they see the people on set, but when you do post-production, they receive the image. So they don’t really realize that we were 275 people making this happen. We did about 30 minutes of the movie, 614 shots, but it was a global team. It was mostly Vancouver and San Francisco, but also other ILM sites working with us. But it was 275 people. That’s quite a big group of people making it up to show those crazy visual effects on screen.”

(Credit: Disney)

Dan Brooks is a writer who loves movies, comics, video games, and sports. A member of the Lucasfilm Online team for over a decade, Dan served as senior editor of both StarWars.com and Lucasfilm.com, and is a co-author of DK Publishing’s Star Wars Encyclopedia. Follow him on Instagram at @therealdanbrooks and X at @dan_brooks.

The visual effects supervisor sheds light on the process behind developing the series’ fantastical realm.

The Two Trees of Valinor in a shot from Season One (Credit: Amazon).

Based on the work of J.R.R. Tolkien, Prime Video’s The Lord of the Rings: The Rings of Power (2022-Present) stands out as a sweeping epic taking place during Middle-earth’s Second Age, a period which predates the timeframe depicted in The Lord of the Rings (2001-2003) and The Hobbit (2012-2014) film trilogies by thousands of years. In order to bring a cinematic scale to the streaming series, The Rings of Power team enlisted Industrial Light & Magic’s own Jason Smith to be the production visual effects supervisor.

With a formidable résumé that includes work on everything from Star Wars: Revenge of the Sith (2005) and Transformers (2007) to The Revenant (2015) and Bumblebee (2018), Smith brought his accumulated expertise to bear as he guided ILM and the series’ other visual effects studios on their journey through seasons one and two. Smith graciously took some time to speak to ILM.com in order to discuss his work on The Rings of Power and highlight ILM’s extensive contributions to crafting showrunners J.D. Payne and Patrick McKay’s vision for Middle-earth.

An Epic Endeavor   

“As the senior visual effects supervisor on the project, my job is to work with the showrunners to help them put their story visually on the screen,” Smith tells ILM.com. This monumental responsibility encompasses hiring vendor studios and planning out the work, being on set and instructing the shoot during principal photography, and shepherding the shots through post-production. “So it kind of includes everything. It’s a real privilege because, on this project, I was one of the first handful of people who joined. We got to take on some challenges in Middle-earth that hadn’t been done yet, like Númenor and the living Khazad-dûm.”

With a background in blockbuster films, Smith has witnessed how studios’ approaches to episodic and theatrical productions have grown more alike in recent years. “The bar has been raised by so many great shows over the last 10 or 15 years that the expectation is that [series] quality is film quality,” Smith observes. Nevertheless, there are notable distinctions involved in making a series that Smith also treasures, such as the approval process and the collaborative spirit of the showrunners, in this case J.D. Payne and Patrick McKay. “There’s quite a lot of freedom to throw out ideas, to suggest things, and to help to guide the visual storytelling with those guys. Our work with the directors is more about prep and being on set, truly. So that’s a key difference there. It leads to a little bit more autonomy in post, which I really appreciate and love.”

Such a rapport is essential, especially given the volume of visual effects shots associated with The Rings of Power. “Nothing can prepare you for the number of shots on one of these shows,” Smith shares. “Because let’s say you’ve got 6,000 shots, and at the peak, maybe you have half of those in play. If you just imagine what it’s like when 3,000 shots have one-minute reviews crossing your desk, if you do the math, it catches up with you very, very quickly. It turns the whole thing into this ballet, and it gives me a lot of respect for the production teams at the vendors and here [at Amazon]. It’s astoundingly good, they keep us all organized in spite of ourselves.” Smith estimates that, in the first season, ILM worked on around 1,000 effects shots under the supervision of Nigel Sumner. “ILM was our lead house on season one and did a lot of work, along with Wētā [FX], Method [Studios], and others.”

Season Two concept art by Saby Menyhei (Credit: ILM & Amazon).

ILM’s Innovations

A capacity for collaboration proved to be a vital strength demonstrated by Smith and ILM, particularly when it came time for Mount Doom to unleash a volcanic eruption and transform the Southlands into the infamous region of Mordor. “ILM played a huge role in the volcano. We went into that with complete ownership of it,” Smith recalls. The volume of shots that came in for the sequence necessitated that the work involve three vendors. “We had Weta doing some really big pyroclastic cloud shots that cut back-to-back with ILM shots, and ILM and Weta both cutting back-to-back with RSP [Rising Sun Pictures] for some pyroclastic clouds and raining lava bombs.”

The challenge of coordinating with other studios resulted in a clever solution whose ingenuity was defined by its simplicity. “We treated all three companies like they were the same company, no secrets, sharing everything fearlessly,” Smith states. For the vendors, sharing their works-in-progress amongst one another provided a common frame of reference for the elements they were bringing to life, whether that be the darkness of a cloud or the intensity of an ember. “Everybody pulled it off amazingly well,” Smith beams. “Most people who see [those scenes] would never actually detect the different styles in play, as those shots cut back-to-back. It’s a testament to those teams being willing to work together.”

Early in season one, the scenes involving Galadriel (Morfydd Clark) and Halbrand (Charlie Vickers) adrift on the Sundering Seas required yet another imaginative remedy from ILM. “When we read the scripts for season one, an amazing amount of it took place on the ocean. So I immediately wanted to go to ILM with that work,” Smith details. “They created a setup — we called it a ‘machine’ so people outside of visual effects could sink their teeth into it — a very robust procedural ocean that let them tune the [water’s] parameters, like the wind and height of the waves. [ILM visual effects supervisor] Nigel Sumner was leading the charge on that.” Smith was impressed by the “Ocean Machine,” as it allowed the team to turn out photoreal plates once the scene’s parameters were set.

However, Smith adds that “of course, there’s artistic work involved in every single one of those [plates]. In our show, every time the ocean is there, it’s telling a story, so we had adjustments to make across the board. At the extreme end of the spectrum, there was a gigantic storm — one with an epic Middle-earth scale — but all of that stuff came out of that same engine with the same artistry behind it.” Smith teases that ILM’s season one advances also paid dividends for water-related shots in season two. “I will say that, without knowing exactly what’s coming — except for having read Tolkien’s books [laughs] — I would anticipate that we should be getting a lot more use out of the ocean machine.”
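
The “Ocean Machine” itself is proprietary, but the general idea of a tunable procedural ocean can be sketched in miniature as a sum of directional sine waves whose wavelengths and amplitudes are steered by a couple of artist-facing dials. Everything below (the function name, the wind and wave-height scalings) is an illustrative assumption, not ILM’s actual setup:

```python
import numpy as np

def ocean_height(x, y, t, wind_speed=10.0, wave_height=1.0, n_waves=8, seed=0):
    """Toy height field: a sum of directional sine waves.
    wind_speed loosely stretches wavelengths; wave_height caps amplitude."""
    rng = np.random.default_rng(seed)
    g = 9.81  # gravity, for the deep-water dispersion relation
    h = np.zeros_like(x, dtype=float)
    for _ in range(n_waves):
        direction = rng.uniform(0.0, 2.0 * np.pi)
        # longer waves as the wind picks up (a very loose, made-up scaling)
        wavelength = rng.uniform(1.0, 20.0) * (1.0 + wind_speed / 10.0)
        k = 2.0 * np.pi / wavelength           # wavenumber
        omega = np.sqrt(g * k)                 # deep-water dispersion: w^2 = g*k
        phase = k * (x * np.cos(direction) + y * np.sin(direction)) - omega * t
        h += (wave_height / n_waves) * np.sin(phase + rng.uniform(0.0, 2.0 * np.pi))
    return h

# The same "machine", two settings of the dials: a calm sea and a storm.
grid = np.linspace(0.0, 50.0, 64)
X, Y = np.meshgrid(grid, grid)
calm  = ocean_height(X, Y, t=0.0, wind_speed=2.0,  wave_height=0.3)
storm = ocean_height(X, Y, t=0.0, wind_speed=25.0, wave_height=4.0)
```

Stepping t forward re-runs the same dials over time to animate the surface; a production system would also drive displacement, foam, and spray, which this toy version ignores.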

A Season One shot depicts Galadriel and Halbrand aboard wreckage on the Sundering Seas (Credit: Amazon).

Season Two Sensations

After praising ILM for its marvelous work across both seasons, the latter of which was supervised by Daniele Bigi, Smith shifts his focus to his favorite contributions from ILM that have shown up in the early episodes of season two. For instance, the show’s sophomore effort affords us an opportunity to meet the Elven elder, Círdan (Ben Daniels), who resides on an idyllic shoreline. “There’s a matte shot, an environment shot, pulling back from Círdan’s boathouse at the Grey Havens, and the work is just so photographic-looking,” Smith elaborates. In an interesting twist, the tranquil locale was actually built upon a set standing in the parking lot next to an industrial park. “The water is flawless, the fjords are so beautiful, the lighting is wonderful, and the composition is great. That is one [shot], when people see it, we already have people asking, ‘Wait, where did you build that?’ They were wondering where that was [in the real world], so I think that’s a testament to the artistry at ILM.”

Smith’s gratitude for ILM’s craft extends to one of season two’s most complex sequences. The premiere begins with a flashback to Sauron’s apparent death, when his rival Adar (Sam Hazeldine) orchestrates a coup against him. Initially reduced to a viscous goo that oozes into Middle-earth’s cavernous depths, Sauron’s enduring form subsists on passing rodents and insects until it is able to free itself from its tomb. The squirming coalescence gradually makes its way toward civilization, consuming a traveler and allowing Sauron to assume the human form known as Halbrand. “ILM had the guts to take on one of the weirdest things in the whole show, and I mean that with the most excitement that I can communicate,” Smith says in regard to the eerie endeavor. “It is such an incredible challenge of simulation, and creature rigging, and complex animation, and rendering, and basically every single discipline. [ILM] really hit the ball out of the park with that.”

Despite these fantastical qualities, Smith sees the importance of grounding such creatures with the properties of real-world references. “We found a species of worm that formed colonies and that will contract almost like a muscle themselves, and we combined that with reference of horsehair worms, which are really unsettling in their own way. The final result is really beautiful to watch, and gut-wrenching in a way. It looks like animated SpaghettiOs. We’re always looking for references like that, because those are the little handholds into reality that take these creatures to the next level.” Storytelling is another key factor in ensuring that the visual effects showcasing Sauron’s resurrection resonate with audiences. “It’s so intertwined with the story, it’s exactly the story that needs to be told about that character. It interacts with an animal, too, which I think is also beautiful work. That’s one of the things I’m excited for people to see.”


The Wonders of Worldbuilding

Smith’s passion for Middle-earth and its wealth of creatures can be traced back to his childhood. “When I was growing up, the world of Tolkien was a huge deal to me. I loved it,” Smith explains, believing Peter Jackson’s movies to be some of the best ever made. “They really captured my heart. And, being a creature kid at the same time, it’s a match made in heaven.” Citing a popular Tolkien quote — “I wisely started with a map, and made the story fit” — that originated in a letter penned by the writer, Smith emphasizes the need to start with the realm’s foundation, whether that be its geographical details or the intricacies of a fictional language. “When we’re doing fantasy, especially when it’s a little more imaginative, we have to have that anchor.”

From hypothesizing how Ents emote to his own attachment to the hill-troll, Damrod (voiced by Benjamin Walker and Jason Smith), Smith reflects back to the moment he learned he’d get the chance to deal with more creatures in season two. “Patrick McKay said to me that, this season, we’re going to battle a lot more creatures. Things are getting darker, and we have a story-based need to really bring some of these other creatures into play.” A reverence for the natural world was also key for McKay and Payne, as their desire to remain enveloped in the story meant that any magic displayed in the series could not have too much, as Smith puts it, “razzmatazz.” Smith recounts that the magic of the first season involved beings who manipulated flames, kicked up winds, and threw rocks about. “It’s almost like the beings who are working with the magic have access to another chapter of the laws of nature, so what looks like magic is their interaction with more advanced, but still physical and grounded, laws,” Smith postulates.

“It’s all caught up into these elemental ties,” Smith continues. “Even on set, when we’re filming scenes with magic, Daniel Weyman, who plays the Stranger, and I would stand there and pick each other’s minds a little bit. [I’d tell] him, ‘I think that to increase the threat, I’m going to have rocks picked up in the wind, so that when you’re thrown against the wall, some rocks are hitting all around you. It makes it look like it really hurts.’” Weyman reciprocated the brainstorming sessions, and Smith himself remains grateful that he’s able to bring his ILM experience to the table. “I feel really lucky that I got into ILM render support when I did, and that I’ve been able to work with the people that I have at ILM. To be mentored by people like [ILM visual effects supervisor] Scott Farrar, [ILM executive creative director] John Knoll, and [ILM consulting creative director] Dennis Muren is a real privilege, and I take a lot of the wisdom those guys have said over the years into all of these different interactions and challenges.”

The Balrog from Season Two (Credit: Amazon).

Travels and Triumphs

Collaborating with the showrunners and cast members is not the only one of Smith’s duties as the show’s senior visual effects supervisor that audiences may be unaware of. “I shoot all the aerials for [The Rings of Power],” Smith reveals. “We’ve got some drone stuff that happens without me, but so far, I’ll say, I’ve shot all the helicopter work for the show. Which is great, because then when we’re designing shots, I have the shots in my head.” Traveling around New Zealand and the United Kingdom, Smith found real-world sights — frozen rivers, tundras, green valleys, cliffs, waterfalls, and more — that fit with the series’ vision and could be inserted into its story.

“If you look at the big CG [computer-graphics] establishing shots, almost without fail, they are all based on those aerials,” Smith declares. ILM handled such a shot in the season two premiere, “Elven Kings Under the Sky,” in which High King Gil-galad (Benjamin Walker), Galadriel, and Círdan don their rings of power under the cliffside tree. “We see a final wide [shot] of that [scene]. That is just a plate we shot not knowing what exactly would be placed there [laughs], and I think ILM did beautiful work of making it look like it was all on purpose the whole time, placing the court of Lindon there at the top of the cliffs.” Smith is thankful for the professional atmosphere fostered by the showrunners and producers which permitted him to be a part of those creative solutions. “There’s an openness to solutions presented in service of the story. There’s a healthy openness to improvements regardless of the source. I think that creates an environment that’s a lot of fun to work in. It’s very busy, but it’s also incredibly rewarding.”

A wide shot from Season Two (Credit: Amazon).

A Philosophy for Visual Effects

Smith’s overall approach to the visual effects of Middle-earth has been greatly influenced by his work with The Rings of Power season one producer Ron Ames and their colleagues. Smith understands that there are common misconceptions about what visual effects experts bring to the process of filmmaking, alluding to the laughter among crew members that tends to arise when on-set visual effects teams bring out the gray reference balls utilized for computer-graphics shots. “When we’re doing the visual effects, or we’re doing previz [previsualization], we’re making the movie. We’re not delaying things with previz, we’re making the movie,” Smith asserts, describing visual effects as no longer being a process that you stamp on at the end, like a postage stamp being affixed to an envelope. “Visual effects is the pulp that’s tying the paper together. It’s touching everything, it’s influenced by everything, and it really determines so many different ways that you can or can’t tell the story.”

Serving as The Rings of Power’s senior visual effects supervisor has made Smith excited for what is on the horizon for his profession. “This project has been eye-opening to me. It’s incredible the things that we can do and add to make the story more fun. There’s a bright future in front of us where we’re going to see the lines [between the story and visual effects] blur more and more, and that’s actually a good thing. I think people are going to understand that visual effects are there, and that it’s okay [laughs],” Smith assures. “Visual effects do exist, and it’s alright, everyone.”

Centering his thoughts on the step in the visual effects process that he hopes will become more widely embraced, Smith circles back to the writhing creature that emerged from Sauron’s corpse in the season two premiere’s flashback sequence. Stressing that the use of reference by visual effects artists is not just for “help-in-case-you-get-stuck” situations, Smith theorizes that the right reference can elevate the efficacy of a visual effects team’s sheer talent. “And I’ll tell you what, ILM has talent. We have the top talent in the world. But, what I’ve learned is, even with that top talent, if you spend time and effort getting the right reference, then your talent is picking up the baton at maybe 90% and having to carry it to 100%, instead of picking it up at 20% and having to carry it, gasping for air, to 90 or 95%. That’s the truth of it.”

Smith’s outlook on his duty to The Rings of Power and its fans is encapsulated in his final remarks. “These projects [have] finite time schedules, and visual effects is a finite resource. That’s where I see my job. I want to make sure that every visual effects artist that’s on the show is really putting pixels on the screen that matter, that are telling the story,” Smith concludes. “I think that my value is trying to emulate Tolkien in laying the best foundation of reality for our audience that we can. Finding as much in our world to resonate with Middle-earth as we possibly can, just like he did with his writing. When he wrote about his big monsters, sometimes it’s just a spider that’s big. I think we’ve been able to have some fun with that, too. We have one creature in the mud here that was done by ILM in the second season that is along those lines. I think some people will recognize the shared DNA with a small but brutal predator from here on Earth when they see it, and the result is not something I’d want to encounter. We try to learn from the way that [Tolkien] approached things and take that on as a mantra.”

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

Industrial Light & Magic brings creative commotion to the creatures and New York cityscape of A Quiet Place: Day One.

Alex Wolff as “Reuben” in A Quiet Place: Day One (Credit: Paramount Pictures).

Surviving the extra-terrestrial terror of A Quiet Place: Day One (2024) depends on the critical ability to stay absolutely silent.

Setting the third installment of the acclaimed film series in noisy New York City, however, brought an entirely new level of fear to the post-apocalyptic horror world first introduced to audiences in John Krasinski’s A Quiet Place (2018), while simultaneously presenting a welcome challenge for the visual effects team at Industrial Light & Magic.

ILM visual effects supervisor Malcolm Humphreys says early discussions with director Michael Sarnoski focused on how to bring unique and unexpected aspects to the frightening alien invaders that use a preternatural sense of hearing to stalk their human prey. 

Among the thousands of New Yorkers running for their lives is Sam, a terminally ill cancer patient played by Academy Award-winner Lupita Nyong’o, who is determined to get a slice of her favorite pizza before she dies. Trying to escape the city as the monsters close in, Sam and her cat Frodo eventually encounter Eric, an English law student portrayed by Joseph Quinn.

“He wanted to make a narrative about how two different people deal with this situation in a big city,” Humphreys says of Sarnoski. “So this was an interesting take about trying to make something about two strangers that meet while all this chaos is happening.”

Concept art by Szabolcs Menyhei (Credit: ILM & Paramount).

A visual effects veteran of films including Ant-Man and the Wasp: Quantumania (2023), The Batman (2022), and Star Wars: The Rise of Skywalker (2019), Humphreys and his team helped guide Sarnoski and cinematographer Pat Scola through the complex process of making a film that required a large number of visual effects sequences. Sarnoski and Scola previously collaborated on the award-winning film Pig (2021) starring Nicolas Cage. 

“Part of the job at ILM is just understanding the story and where we want to go, and just trying to build a bespoke solution depending on the different types of shots we’re doing,” Humphreys tells ILM.com.

One challenge, Humphreys says, was determining how the creatures with hypersensitive hearing might move and behave in a city environment like New York. 

“In the previous films, they’re either just stealthing on a single character or they’re sort of doing a snatch-and-grab,” explains Humphreys. “So Michael was very keen on expanding that a little bit more. For example, ‘how do they act with each other?’”

During a nighttime sequence set at a construction site, the creatures behave almost like a family gathering for dinner, ripping apart and devouring a fungus-encrusted pod for food. Behind the scenes, the ILM team came to refer to the monsters by the name “Happy.” 

“They’re not very happy creatures, so calling them ‘Happy’ is kind of fun,” Humphreys says. “There’s a really big mom that’s all caked in white, and then you’ve got the little baby happies. The little ones have slightly bigger heads. They’re smoother.”

(Credit: Paramount)

When Eric accidentally makes a noise, a nearby creature is alerted and exposes its slimy, pulsating inner ear to listen more closely. It’s a tense, relatively long shot that Humphreys says is also one of the film’s most complex.

“There’s an immense amount of detail that the modelers, the texture artists, and the effects artists have done,” he says. “There’s the eardrum that’s fluctuating. You’re actually hearing Eric’s heartbeat, and we’re pulsing the eardrum and the heartbeat together.

“You want to get an emotional reaction from the audience, so we want to sit on this shot for quite a while,” Humphreys continues. “I really, really love this shot.”

Humphreys credits animation supervisor Michael Lum with helping develop the right movement for the creatures as they do things audiences have never seen before, like scrambling up and over Manhattan buildings.

“All of the creatures are hand-animated,” Humphreys reveals. “There’s no crowd system or anything like that. They’re all handcrafted, which is amazing.”

Building out New York City was another major aspect of ILM’s work on A Quiet Place: Day One that may not be apparent to many audiences, and that’s exactly the goal.

The areas of New York that appear in the film—including the Lower East Side, Chinatown, Midtown, and Harlem—were realized as a massive partial backlot set built at Warner Bros. Studios Leavesden near London. Production designer Simon Bowles and his team built two intersecting streets that could be modified and dressed into new locations as Sam, Eric, and Frodo make their way through the city.

Most of the backlot structures, however, were only built two stories tall, requiring ILM artists to digitally extend the height of buildings, lengthen streets, and fill in backgrounds. 

Lupita Nyong’o as “Samira” in A Quiet Place: Day One (Credit: Paramount).

“We did an immense amount of data capture,” Humphreys explains, a process that required 14 days in New York so the team could scan and photograph more than a hundred real buildings in high resolution. “We go through a whole process of building out those facades so that they can be used on many, many shots.

“For certain bits, we’ve changed quite significantly what you see in the backlot set,” Humphreys reports. “There’s a huge amount of augmentation and replacement.”

While Frodo the cat is entirely practical (played by two different feline stars, Schnitzel and Nico), a scene requiring the animal to weave through a frantic crowd running from the aliens required extensive digital artistry from ILM.

“Michael was adamant that he wanted to use the real cats,” Humphreys recalls. “There was a little bit of, ‘how are we going to do a shot like this? We can’t have a whole lot of people trampling over a cat.’”

The solution was to photograph just the cat’s performance separately at first, then add people and additional elements later.

“That shot is actually an amalgamation of hundreds of layers of different crowd people, and really timing and trying to build that shot up so that as an audience member, you get the sense of the chaos, but you also see Frodo enough for him to register,” Humphreys adds.

The film’s finale has Sam, Eric, and Frodo desperately trying to reach a boat on the East River filled with survivors making their escape from New York. The sequence is built from several different locations, including part of an airfield dressed as a deserted FDR Drive, a pier along the River Thames, a moored boat, and a water tank at Pinewood Studios.

(Credit: Paramount)

“It was a lot of fun, but a lot of moving pieces,” Humphreys laughs. “We’re sort of shooting component pieces and hoping that they all go together.”

Humphreys says his favorite visual effect is the very last scene in the film. As Sam walks down a Harlem street listening to music, the camera sweeps 360 degrees around her in a single shot lasting nearly 40 seconds. Humphreys notes that the sequence, originally shot on the backlot, required complex rotoscoping and compositing, with artists ultimately replacing as much as 70 percent of the original background with images created using the data ILM gathered in New York.

“We actually captured three or four blocks of Lexington Avenue, so there’s a huge amount of data capture for that one shot,” Humphreys says. “I’m really proud of that one.” 

Humphreys joined ILM in 2016 and is based at the company’s London studio. But he says the work on A Quiet Place: Day One was a truly global effort.

“I got to work with a lovely team in Vancouver, in London, Mumbai, and San Francisco,” he says. “I think we’re just good creative partners.

“The one thing you get out of ILM,” Humphreys believes, “is that it still operates very much like a smaller company in terms of communication and collaboration, which is really refreshing.”

Concept art by Daniel McGarry (Credit: ILM & Paramount).

Clayton Sandell is a television news correspondent, a Star Wars author and longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design.

Heads still roll 25 years later in the Tim Burton classic Sleepy Hollow (1999). Revisit all of the eerie magic behind Industrial Light & Magic’s work that brought Washington Irving’s folktale to life and reintroduced audiences to one of cinema’s greatest on-screen monsters, the headless horseman.

By Adam Berry

The headless horseman pursuing Ichabod Crane in the Western Woods. (Credit: Paramount)

On October 5, 1949, the Walt Disney Studios released a feature film that reimagined two classic pieces of literature in the form of The Adventures of Ichabod and Mr. Toad (1949). While the first half retells the whimsical story of Kenneth Grahame’s The Wind in the Willows (1908), it is the second half that left a long-lasting impression on young audiences as they were introduced to American writer Washington Irving’s eerie folktale, The Legend of Sleepy Hollow (1820).

Released just in time for Halloween that year, this feature would go on to be recognized as one of Disney’s classics due to its memorable songs, beautiful animation, and the unforgettable visualization of Irving’s ghostly antagonist, the headless horseman. The unsettling imagery of a headless man riding horseback with a sword in one hand and a flaming jack-o’-lantern in the other allowed the legend to evolve as film versions were passed down to new generations. Fifty years later, The Legend of Sleepy Hollow would return to the screen, exploring the story in new ways and reintroducing audiences to Irving’s horrific tale of the undead horseman.

To take this classic into a tangible world, a highly imaginative and visual mind was necessary to capture the fantastical elements of this story while rooting it in a sense of reality. Tim Burton, director of such films as Beetlejuice (1988) and Edward Scissorhands (1990), was keen to step in as he was a fan of the Disney film. Burton told American Cinematographer, “I was really familiar with the original story because I’d seen the Disney cartoon…. I actually didn’t read the source novel until after I had read the script.” Burton’s own history with Disney, including attending the California Institute of the Arts on a Disney scholarship and working at Walt Disney Studios as an animator on projects such as The Black Cauldron (1985), made him a natural choice to take on the challenge of creating a fresh retelling of Sleepy Hollow.

Concept art by Scott Leberecht depicts an eerie atmosphere shrouded in fog on the road to Sleepy Hollow. (Credit: Paramount & ILM)

Burton’s vision was to create a fantasy world that felt real, one in which the headless horseman could exist. The aesthetic needed to emulate, but not copy, the atmosphere of classic Universal horror films such as Dracula (1931) and Frankenstein (1931), with their moody, gothic tones that left audiences in a state of unease. That said, Italian director Mario Bava’s Black Sunday (1960) was the core inspiration for the film, giving Sleepy Hollow (1999) a classic movie feel while adding elements that were pictorial and synthetic.

To achieve his vision of what Sleepy Hollow needed to look like, Burton knew there had to be a balance between the use of traditional special effects and digital visual effects. “Digital technology is very interesting and certainly has its place in filmmaking, but when you’re watching a movie like Black Sunday you really feel as if you’re there,” said Burton. While he was resistant to using visual effects at first, he relied on the artists at Industrial Light & Magic (ILM) to help realize the full scope of his vision, particularly when it came to bringing the headless horseman into existence as a living, breathing creature. 

ILM visual effects supervisor Jim Mitchell was tasked with solving how the horseman could exist in the film as a real man without having to rely on older methods. Tricks like having the coat propped up on the actor’s shoulders didn’t work as the proportions were wrong, eliminating the appearance that the horseman was indeed a man. Mitchell said, “Tim and I knew that something just wasn’t going to be right with that approach. We eventually decided that our Headless Horseman would be an actual person riding the horse and flailing his axe around, except that we’d just digitally erase his head.”

The most complex shots for ILM involved removing the horseman’s head. There were about 300 horseman shots altogether, with ILM creating 220 and London’s Computer Film Company contributing the rest. To convincingly convey that the horseman was real, the ILM team devised a special blue hood for stunt actor Ray Park to wear during action sequences, which the artists could later isolate and erase from each shot. Blue was used because it is easily keyed from the plate, allowing the effects team to restore the background in place of the head.
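
A minimal sketch of that key-and-restore step, in NumPy terms: find the strongly blue pixels, then copy in the matching pixels from a clean plate shot without the actor. The hard threshold rule here is a made-up stand-in for production-grade keyers, not ILM’s actual tooling:

```python
import numpy as np

def key_out_blue(frame, clean_plate, blue_thresh=0.3):
    """Replace strongly blue pixels (the hood) with the clean background plate.
    frame, clean_plate: float arrays of shape (H, W, 3), values in [0, 1]."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    # Call a pixel "hood" when blue dominates both red and green.
    matte = (b - np.maximum(r, g)) > blue_thresh   # boolean key
    out = frame.copy()
    out[matte] = clean_plate[matte]                # restore the background
    return out, matte

# Tiny synthetic example: a 4x4 gray frame with a 2x2 blue "hood" patch.
frame = np.full((4, 4, 3), 0.5)
frame[1:3, 1:3] = [0.1, 0.1, 0.9]                 # strongly blue region
clean = np.full((4, 4, 3), 0.2)                   # background plate, no actor
result, matte = key_out_blue(frame, clean)        # hood pixels take clean values
```

Real keyers produce soft, fractional mattes, suppress blue spill, and cope with motion blur; the digital collar described below would then be rebuilt and tracked on top of the restored plate.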

The headless horseman claims his next victim. (Credit: Paramount)

To fill the space where the actor’s head was taken out, a clean background plate of the sequence was shot, but one element was still missing: the horseman has a large cape with a collar around his neck. “It was not only necessary to replace the background where his head would have been, but to also make a digital collar in the computer that was then matched to his movements,” shared computer graphics artist Sean Schur in Paramount Pictures’ Behind the Legend documentary. By using real actors and digitally replacing their heads, the team made the horseman read as a living, breathing creature.

Achieving this effect was particularly challenging during fight sequences with actors Johnny Depp and Casper Van Dien as their faces would be blocked out by the horseman actor’s blue hood. Once ILM erased the horseman’s head, they would also have to go back and eliminate the other actors’ heads as well. Mitchell shared, “I would have Johnny or Casper go through the same actions without the Horseman in there, and we’d just put their head into any frames where the horseman’s head was blocking theirs. It’s a tricky process, but it was actually pretty effective.”

Equally challenging were the beheading scenes throughout the film. Creature effects artist Kevin Yagher created prosthetic heads of the actors for these pivotal moments, while ILM digitally rebuilt each scene from a series of three plates blended into one cohesive shot. Take the scene in which the menacing Lady Van Tassel (Miranda Richardson) decapitates her bewitching sister: Richardson was filmed swinging her axe at dead air, the prosthetic head flying off her sister’s body was filmed separately, and a digital capture of the scene was created. Once all three plates were finished, ILM blended them into a seamless sequence, making each beheading feel all the more real for the audience. This was no small task, as the film contains ten decapitation sequences.
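In digital terms, stacking separately photographed plates like this is repeated alpha compositing. A minimal sketch of the standard Porter-Duff “over” operator — the names and the per-pixel NumPy framing are mine; an optical printer achieved the same result photochemically, pass by pass:

```python
import numpy as np

def over(element, matte, background):
    """Composite one plate over another: the element shows where its
    matte is opaque (1.0), the background where it is transparent (0.0),
    with a linear blend in between."""
    a = matte.astype(np.float32)[..., None]  # per-pixel alpha in [0, 1]
    fg = element.astype(np.float32)
    bg = background.astype(np.float32)
    return (fg * a + bg * (1.0 - a)).astype(np.uint8)

# Three plates become one shot by stacking passes, e.g. (hypothetical names):
# shot = over(head_plate, head_matte, over(actress_plate, actress_matte, bg_plate))
```

Each additional element is simply composited over the result of the previous pass, which is why such shots were built up layer by layer.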

Concept art by Scott Leberecht shows ILM’s approach to depicting the headless horseman’s return to hell. (Credit: Paramount & ILM)

Burton wanted to convey suspense and a sense of impending doom throughout the film, and he tasked ILM with a series of subtle visual effects shots that added to the unsettling feeling whenever the horseman appeared. Most notable is the disturbing scene in which the horseman pursues a family at their home. Killian (Steven Waddington) sits at his table beside a crackling fire, which spontaneously erupts into larger flames seconds before the horseman crashes through the door. Sequence supervisor Joel Aron shared, “I took the skull, which is the headless horseman’s skull, so I pulled up the eyebrows giving it this demonic look with a strong forehead, curling up the corners of the mouth and bringing the jaw around to continue to sculpt what would be the fire so that I knew when the fire would come off it would have an irregular shape.” It’s a blink-and-you’ll-miss-it effect, but look closely and you can see 13 demonic faces emerge within the flames in a quick flash, meant to signal that evil is present. It’s so subtle that audiences were intended to question whether they really saw the faces at all. 

Natural elements were also used to punctuate the horseman’s presence: thick fog and flashes of lightning appear, in pure cinematic fashion, every time the horseman gallops toward his next victim. Sleepy Hollow was shot mostly on location in Marlow, a small town just outside London, whose environment gave Burton an ideal setting for the film’s gloomy atmosphere. Some might dismiss such touches as cheap tricks in a major film, but the heavy smoke used for fog makes the atmosphere all the more haunting and interesting. “In the Western Woods set and at some of the other locations, you can definitely see the smoke – it looks like the fog they used in the old Frankenstein and Mummy movies,” said director of photography Emmanuel Lubezki.

While the smoke gave the filmmakers a consistent look, it presented challenges for the ILM team: once they had finished adding actors’ heads back into the shots, they also had to rebuild the foggy backgrounds and natural elements in each scene. “The big problem for us was [that] every shot involving the horseman also had lightning and fog,” explained Jim Mitchell, “which was constantly moving and always changing, as opposed to trees and buildings, which are rigid. Whenever lightning hit the Horseman, we had to make sure that when we replaced his collar or any other parts of his suit that his head was blocking, we put the same lighting effect on it.”

To aid this work, Mitchell asked Burton and Lubezki to shoot the scenes as though the actors’ heads were already removed, even though it added complexity for the ILM team, who had to ensure the atmospheric elements moved organically with the actors as they rushed through the fog, or as horse hooves galloped through the settled leaves on the ground. “There are all kinds of things we’d prefer to stay away from when we’re doing this type of work, but if you lose those [atmospheric] touches, all of a sudden it’s not the same sort of visual, and it doesn’t have the same power,” concluded Mitchell.

The headless horseman emerges from the tree of the dead. (Credit: Paramount)

This is especially apparent during a highly intense scene mid-film, when the protagonists discover the tree of the dead, the horseman’s resting place and gateway to hell. The combination of natural elements like fog and tree leaves with digital effects cemented the believability of the moment as the horseman emerges from the base of the tree in bloody, terrifying fashion. Multiple plates were used to build the effect: first, a bluescreen plate of the horse and jumping rider; next, a background shot of the forest environment, with the tree of the dead and the actors standing close to where the horse emerges; and finally, a shot of the fog and leaves being disturbed as the horseman bursts from the tree. ILM didn’t have a bluescreen shot of an actual horse, so both the horse and the headless horseman were created in the computer as digital elements. As with the decapitation sequences, artists layered the separate plates on top of one another to form a single shot, making a scene that might have felt unrealistic entirely believable. 

It has been 25 years since its initial theatrical release, and rewatching Sleepy Hollow you can witness firsthand how ILM’s work remains timeless, reaching new generations. The eerie, suspenseful atmosphere of Sleepy Hollow pulls audiences in, and the film stands as a formidable achievement of classic Hollywood filmmaking, adding another iconic cinematic monster in the headless horseman, one as feared as horror icons like the unnerving Count Dracula and the misunderstood Frankenstein monster. Washington Irving’s The Legend of Sleepy Hollow lives on through the visionary work of artists from each generation. For Burton’s retelling, ILM wielded the eerie magic that gave life to the undying legend of the headless horseman.

Adam Berry is the Studio Operations Manager for the ILM Vancouver studio. His passion for film led Adam to ILM in 2022, coming from an extensive career across different sectors of the hospitality industry including cruise ships, luxury hotels and resorts. If he’s not at the movies or traveling to new destinations, you can find Adam staying active and exploring Vancouver.

For Transformers One, the art department aimed to create shapes and silhouettes that appeared clean and simple from a distance yet included intricate, purposeful details up close, such as cut lines in the panels, smaller inset geometry, and layered panel work. Given that iconic Transformer designs often stem from their helmet shapes, the art department worked to seamlessly integrate faces into the helmets, enhancing their expressiveness while maintaining a mechanical, rigid aesthetic.

Check out the full design case study here: https://www.ilm.com/art-department/transformers-one-concept-art/

After 38 years, the veteran effects artist is retiring.

By Lucas O. Seastrom

First opening in 1987, the original Star Tours attraction at Disneyland featured what was then the most complex optical composite ever created at Industrial Light & Magic. The attraction itself was a state-of-the-art flight simulator developed by Walt Disney Imagineering (WDI) and Rediffusion Simulation, while the “view” out the window of the starspeeder was created with miniature effects by ILM. Among the thrilling encounters for passengers onboard was a harrowing trip through a cluster of icy comets, which the crew dubbed “ice-teroids.”

Compositing in this photochemical era involved a piece of equipment known as an optical printer. With iterations dating back to the earliest days of cinema, optical printers combined separately-photographed elements by recapturing them – one frame and one layer at a time – onto a new roll of film negative. Optical printers and the artists who operated them created the final effect one viewed onscreen with everything carefully (and painstakingly) blended together. Going back to Star Wars: A New Hope (1977), ILM had developed the most sophisticated compositing techniques yet seen, allowing for even greater refinement and finesse. 

The ice-teroid shot in Star Tours combined some 60 elements of individual sections of film. By comparison, the most complex shot of a space battle in Star Wars: Return of the Jedi (1983), just a few years earlier, had little more than half that number. One of the two optical printer operators to work on the new shot for Star Tours was Jon Alexander, hired just that year, in 1986. 

“Don Clark and I worked on the shot together on the Anderson Optical Printer,” Alexander tells ILM.com, “and once you started, you couldn’t stop. Once you started a shot, all the motors warmed up and they needed to stay on. If they were turned off, you risked the machine cooling down and settling into a misalignment of the earlier passes. It took 24 hours to make all the elements, so we split 12-hour shifts. 

“The Anderson was an old-style optical printer,” Alexander continues, “where if you wanted to add any movement to the shot you had to crank little knobs by hand with an accuracy at best of a couple hundredths of an inch. Some years later ILM acquired the MC [motion-control] printer which was accurate to within a couple ten-thousandths of an inch, which is crazy. It’s like throwing a baseball from here in San Francisco and hitting the Empire State Building in New York.” 

Thirty-eight years later, Jon Alexander has decided to retire, and ILM is celebrating his storied tenure with the company, which stretches across dozens of films, series, attractions, and special venue projects – not to mention quite a lot of technological change.

Back in the late 1970s, Alexander had what he calls “a wandering college career” while studying at Ohio State University. With a background in both engineering and cinematography, he arrived in Southern California in 1980 to work at Calico Creations, an active commercial house. There, Alexander gained experience with motion-control camera systems, innovative tools that combined the latest computer technology with mechanical engineering. “This was before personal computers were readily available,” he notes. “We were doing programming with machine tools to create motion graphics for around 50 commercials a year. Everyone wanted something like 2001: A Space Odyssey [1968], the slit-scan style. It was a very manual process.”

These tools were used for everything from photographing miniatures to shooting hand-drawn cels on an animation stand. In conjunction with the team at Calico was Bill Tondreau, an accomplished engineer who designed his own motion-control systems, which Alexander learned to use. 

“The system that ILM used on Star Wars was very analog,” Alexander explains. “You could speed up or slow down, but it was very hard to hit specific points. It was an art for those guys to get used to. They were flying in space so it didn’t have to be as precise. They got really good at it, but it wasn’t as adaptable as what Bill Tondreau later developed, which used stepper motors. ILM was switching over to this style, and my colleague Rob Burton at Calico was hired by ILM for Howard the Duck [1986], and they had so much work that they needed more operators. They had to be Tondreau-system operators, so they recommended me. They were looking for someone to do this specific thing, and hired me for three months. I have milked that for 38 years.”

Alexander works at an animation stand on one of his early ILM productions, The Witches of Eastwick (1987).

Initially, Alexander worked in the animation department, photographing cels with a down-shooting system. Among these projects was The Witches of Eastwick (1987), for which Alexander shot a tennis ball as a lone element for a scene involving a doubles match. This was required because, as Alexander recalls it, only the actress Cher knew how to play tennis, so the cast mimed the game without a ball. 

“It was while working on this camera for Eastwick that I met Michael Jackson,” Alexander says. “I was working late and no one else was in the back of D Building [at ILM’s Kerner facility]. I was leaning over and adjusting the tennis ball when I got this feeling someone was right behind me. I turned my head and he was about three feet away with two of the biggest men – security guards – I’d ever seen. [Producer] Patty Blau popped around the guards and said, ‘Hi, this is Michael. He was wondering what you were doing.’ This was around the time ILM was finishing up [Disneyland attraction] Captain EO.”

Alexander’s technical experience once again necessitated a move, this time to the optical department, where a new optical printer was being refined. The aforementioned “MC,” or motion-control printer had been developed by Los Angeles-based Mechanical Concepts as a first of its kind device. 

“It was a motion-control printer, but when it got here, it didn’t work,” Alexander explains. “Everything was project to project in those days, but optical was always going and it looked like they needed more folks. When I heard about this new printer, I went up to Kenneth Smith, who was running optical at the time, and explained that I could put a Tondreau system on it. I had done some optical work in L.A., so it wasn’t entirely foreign, but ILM was off the charts in terms of the people and equipment they had.”

Alexander collaborated with machinist Udo Pampel to reconfigure the MC printer to run on the system. The result was arguably ILM’s most sophisticated optical printer, one that allowed artists not only to create incredibly precise composites but to rework shots entirely by adding movements or zooms. An early assignment for the Academy Award-winning Innerspace (1987) required Alexander to simulate the bouncing undulations of the camera “inside” the body of actor Martin Short.

“They were cutting back and forth between Martin Short running and this smooth motion-control inside the body, and [visual effects supervisor] Dennis [Muren] thought it looked weird,” Alexander says. “But at that point they couldn’t go back out on stage and reshoot everything. Dennis asked me if I could do something that had the same up and down motion of running. It was a tough thing to do on the stage, but it wasn’t particularly tough on the MC Printer because I could project onto the wall, track something specific like a button at the center of his chest, which then provided a curve like someone running along. So when I did the composite, it matched up. It was no problem to do that because of the way the printer was set up. I used to do a lot of that kind of match-moving stuff to project onto the wall and track something in a minute way. That’s entry-level now, but to do that in post at the time was almost impossible because there were so few motion-control printers around. We had one of the first.”

Alexander at work on the motion-control or “MC” optical printer.

As Alexander notes, for a handful of years his position was among the most significant in ILM’s pipeline, considering that nearly everything had to be funneled through the MC printer. “Looking back at these things, it wasn’t a big deal to accomplish,” he admits. “It was just that people hadn’t done it before. Supervisors like Dennis or Ken Ralston could expand what they wanted to do creatively, and people like me were a great set of hands to help them.”

Change was in the air, however, and computer graphics (CG) effects were steadily on the rise. At a time when many traditional artists and technicians were deciding whether to embrace the change, Alexander leapt in headfirst. “At that time, there were no BFAs in computer graphics,” he explains. “You had to come out of an engineering school just to do anything. It fostered this new kind of collaboration. We on the film side knew what the final product had to look like and the programmers knew the math and physics to make it possible.”

Alexander remains very matter-of-fact about the transition. “CG helped eliminate the painful aspects of working on film. You’d work for hours on something, moving and adjusting things. It was so choreographed that you had to put the filters in the exact same order each time to get the same result. Then after you shot it, you’d go to the dark room, turn the lights out, unload the magazine and put the film in a can, and then you’d turn the lights on and realize you’d forgotten to close the can…and what you just shot was gone. In CG, if you make a mistake, you press ‘Undo.’”

Among Alexander’s first CG projects were Fire in the Sky (1993), The Flintstones (1994), and Forrest Gump (1994). A personal standout shot came in 1998’s Meet Joe Black when he had to help create the shocking death of actor Brad Pitt’s character, a young man who is hit by two cars while crossing a street. “They shot the different elements with bluescreen,” Alexander says. “The cars came in slow because it was too dangerous to go fast and I timed everything to match it all together. The director [Martin Brest] asked to make him flip in the air, which I then did.” A compositing supervisor at that stage, he enjoyed the opportunity to “test things and try out ideas,” from large elements to minuscule details. 

Alexander at work on a digital composite for Star Wars: The Phantom Menace (1999).

Alexander’s last major shift came around 2008 when visual effects supervisor Bill George organized a unit to assist WDI with a reimagined Star Tours, ultimately opening in 2011 with 3D digital imagery. Eventually, Alexander stayed with George’s rides unit full-time, contributing to everything from Disney’s Soarin’ Over the World and Star Wars: Rise of the Resistance to Universal’s Race Through New York Starring Jimmy Fallon. In every case, he was able to work in a diverse array of image and presentation formats. More recently, Alexander has contributed to special venue projects for the Sphere in Las Vegas, including Dead & Co.’s Dead Forever concert series (2024) and Darren Aronofsky’s Postcard from Earth (2023).

“The Sphere is like being in a VR headset but massive,” Alexander says. “Something like 17,000 people can interact with the screen at one time. I was talking with Darren Aronofsky about how it opens up the possibilities about how to tell a story. You no longer set the direction for people to look. Something could be going on in one area, and then you put something up in another area. Maybe some people notice and others don’t. It’s a different way of thinking about it, like in a game, where you influence the way the story goes. To me, it’s really cool to move into this new space where you’re not limited by being in a movie theater where you can only look in a certain direction.”

As his ILM journey comes to a close, it’s poignant to consider that Star Tours in particular has formed bookends to the many productions Alexander has been involved with. In fact, he and Imagineer Tom Fitzgerald are the only two people to have worked on every iteration of Star Tours to date. Just recently, Alexander spent six months with WDI to help oversee the installations of the ride’s latest update in Disney Parks in California, Florida, and France. With characteristic humility, he’s keen to point out that he made a small mistake way back on that fabled ice-teroid shot in the original 1987 version. A matte for one of the dozens of ice-teroids was slightly misaligned, a detail too small for most viewers to even notice, but something that Alexander’s children would never fail to mention, much to his own amusement.

Alexander at work in 3D for an update to the Disney Parks attraction, Star Tours.

“I came into this with different expectations, like we all do,” Alexander reflects. “You think they’ll write a book about you one day. No one’s going to write a book about me. Then you think, maybe I’ll get a chapter in the book. But most of us just become footnotes. We’re part of a team. My dad and my uncles were all sergeants in the military. I got an appointment to the Air Force Academy. When I went there for induction a just-graduated 2nd Lieutenant was showing us around, and the Master Sergeant came by, an older guy with the stripes on his arm, and gave a crisp salute to this new 2nd Lieutenant as he walked by.

“The Lieutenant said, ‘There’s a lesson for you,’” Alexander continues. “‘This guy has to salute me because I’m his superior officer, but he’s a sergeant and he does everything. I can’t do anything that he does. He organizes all of the enlisted men to do what we need, so I have to listen to him and trust him to get it done.’ I kind of feel like I’m a Master Sergeant. I’m fortunate enough to have gotten to the point where I’m involved at this level, and I feel like there’s not a shot that I can’t fix. It’s not just me; it’s my position. That’s what a compositing supervisor is supposed to do. If there’s a shot with a problem, and you can’t go back and change anything, yes I can fix it for you. I find that particularly gratifying. I’ve stayed at this level in part because it’s about life-balance. If I were to go higher, I’d be away for four months at a time, and I didn’t want to do that to my family. I’ve got like five Oscars on the family side of stuff.

“George Lucas chose people really well, and those people chose their hires really well,” Alexander concludes. “George trusted people like Dennis Muren to get anything done for him, and Dennis trusted people like me to get him whatever he needed. George and Dennis and those types of people were magnanimous enough to let people like me in the room. Because of that, I’ve tried to share as much as I can when new folks come in so they feel like they’re part of it. To me that’s the most important thing, making people feel like they’re part of a team. The beauty of this place has been how collaborative it is.”

Lucas O. Seastrom is a writer and historian at Lucasfilm.

Exploring the technical innovations and behind-the-scenes stories that brought Slimer to life in Ghostbusters 2, reaching new heights in animatronics and practical effects at Industrial Light & Magic.

By Jamie Benning

The original Slimer head on display at Lucasfilm headquarters in San Francisco.

When Ghostbusters (1984) premiered, it became an instant classic. With a star-studded cast—Bill Murray, Dan Aykroyd, Harold Ramis, and Ernie Hudson—alongside Sigourney Weaver, Rick Moranis, and Annie Potts, the film combined supernatural elements with groundbreaking visual effects and perfect comedic timing, captivating audiences worldwide. The film grossed $282 million in its initial theatrical run, cementing its place in film history.

Beyond the popular human cast, one standout element was the ghost originally named “Onionhead.” Special effects artist Steve Johnson, credited as the sculptor, likely drew inspiration for the name from its vegetable-like appearance. This gluttonous green ghost, classified as a Class 5 Full-Roaming Vapor, made a brief but memorable appearance that delighted audiences. Designed to be grotesque and chaotic, Onionhead unexpectedly became a fan favorite. Bill Murray’s famous line, “He slimed me!” as Peter Venkman, became one of the most quoted phrases of 1984. Onionhead’s popularity only grew with The Real Ghostbusters (1986) animated series, where he was reimagined as a mischievous yet lovable, pet-like character.

In the eleventh episode of The Real Ghostbusters, titled “Citizen Ghost,” which first aired in November 1986, Onionhead finally got his new name. The episode’s flashback shows how the Ghostbusters became friends with the little green ghost, with Ray Stantz giving him the fitting nickname “Slimer.” The name endured, becoming a permanent fixture in all subsequent Ghostbusters projects.

Ghostbusters 2 (1989) sought to bring the evolved version of Slimer to the big screen, balancing the charm of the original character with the expectations of younger fans familiar with the cartoon. With the baton passed from Boss Film Studios to ILM for the sequel, visual effects supervisor Dennis Muren described the task ahead of them to Cinefex. “We had the opportunity to create a whole new array of ghostly images,” he explained, using all the tools at their disposal. An early idea of using a rod puppet was quickly dismissed, with Muren preferring to opt for a fresh take on the “man in a suit” approach.

With the technological advances made in the five years since the original film, the goal was not just to capture the original magic but to push the boundaries of animatronics, puppetry, and practical effects. 

Slimer concept art for Ghostbusters 2 by Henry Mayo. (Credit: Columbia/Sony & ILM)

Reimagining Slimer

For the sequel, Slimer needed to embody the more playful, cartoonish persona. “[Executive Producer] Michael [C. Gross] wanted elements from the cartoon version incorporated as well, and to this end he had Thom [Enriquez] do the new series of drawings – which were fabulous,” creature and makeup designer Tim Lawrence explained to Cinefex.  

Mark Siegel, a key contributor to Slimer’s original creation, was brought to ILM for Ghostbusters 2 to resculpt the character and adapt him for the film’s lighter tone. Siegel had been deeply involved in the creation of the original Onionhead ghost, sculpting his teeth, tongue, and inner mouth, as well as the complete replacement head for a second puppet with a wider, more frightened look to the mouth. He also puppeteered the tongue and eyebrows for the majority of the shots. 

The sculpting process for the design maquette was a collaborative effort, with Siegel primarily handling Slimer’s body, head, and face while fellow crew member and performer Howie Weed focused on the arms and hands. For the full-sized puppet Siegel sculpted the head and the arms.

The character of Onionhead in Ghostbusters wasn’t just an arbitrary creation. His mannerisms and chaotic energy were directly inspired by the late John Belushi, specifically his portrayal of Bluto in Animal House (1978). This connection was not merely symbolic; it was a tangible part of his design and performance. Mark Siegel recalls how Harold Ramis and Dan Aykroyd made it clear to the team that Slimer was a representation of their close friend Belushi’s comedic spirit.

The team didn’t just envision this; they meticulously pored over Belushi’s scenes. “We studied frame by frame old VHS tapes of Belushi’s Animal House scenes, focusing on his expressions,” Siegel elaborates. This analysis allowed them to incorporate Belushi’s signature movements and broad, exaggerated physicality into Slimer’s performance for the first film. According to Siegel, it was Belushi’s expressive style that truly captured the blend of charm and grotesqueness that defined Slimer’s character.

“When I first started sculpting the new Slimer, I thought, ‘Well, that’s cute,'” Siegel admits. But the evolution of the character, from disgusting blob to family-friendly ghost, presented some challenges. “I felt we were losing some of the raw, chaotic energy that made Slimer memorable in the first film.”

Scenes originally envisioned for Slimer in Ghostbusters 2 included him eating various types of food around the station house while Louis (Rick Moranis) tried in vain to catch him. Then later, when Louis straps on a backpack and tries to help the Ghostbusters, he finds Slimer driving a bus. Louis hitches a ride and the two eventually become friends. An early storyboard also shows Slimer flying around the Statue of Liberty for the final shot of the movie, mirroring the first film’s finale. But, as is often the case in artistic pursuits, things were adapted, changed, and even removed along the way, all for a multitude of reasons.

Technical Innovations: Pioneering Animatronics

One of the key advancements in Ghostbusters 2 was the shift from manual cable-controlled puppetry to the use of radio-controlled servos for Slimer’s facial expressions. Al Coulter, an ILM animatronics engineer, led the effort to remotely automate Slimer’s face, allowing for more nuanced performances. “Al wanted to mechanize Slimer’s expressions,” Siegel says. “The SNARK system (reported as both Serial Networked Actuator Relay Kit and Synthetic Neuro-Animation Repeating Kinetics module) allowed us to control multiple servos simultaneously, meaning that expressions could be achieved more easily, with fewer people.” This system was a technological leap forward, offering new possibilities for nuanced expressions, though it brought its own set of challenges.

One of the key motivations for this advancement was to streamline post-production, which had been a challenge on the original film. “In Ghostbusters, we had to deal with puppeteers in the frame, which meant removing them during post-production,” Siegel recalls. That was both time-consuming and costly.

Coulter notes that while the servos were originally designed for consumer RC airplanes, significant customization was required to make them work for Slimer’s facial movements. “The joystick stuff from Hobby World caused a lot of problems when we went onstage, because there was so much interference from all the lights and the wires and the machinery…that we needed to be able to connect our character to something direct, hardwired. So we had this guy build control boards which we bundled together and plugged into a PC. And that PC would then have software on it, custom again, and it would record our performance.” It was a major advancement for the time, in a way following in the footsteps of the leaps ILM had made in motion control in the mid-1970s for the spaceships in George Lucas’ Star Wars: A New Hope (1977). 

Coulter reflects that working with the technology of the time, particularly the slow computing speeds, was a challenge in itself. “We were working with computers that ran at 24 MHz—slow by today’s standards, but cutting edge at the time.” Despite this, the SNARK system was a pioneering achievement in real-time, computer-controlled puppetry, allowing for repeatable and detailed performances. “The facial expression, eyebrows, eyes, I think we had a nose wiggle…being updated to radio control servos, that was a great idea,” Siegel adds.

Behind-the-scenes videos posted by William Forsche (another crew member) show the incredible range of facial contortions that could be achieved with the new Slimer, from sad to happy to curious in a matter of seconds. While motion-control’s precision was essential for the spaceships in Star Wars, it wasn’t yet clear how well the recording and playback of Slimer’s facial movements would work.

Ultimately the servos introduced their own set of challenges. While the system allowed for greater control, it limited some of the more exaggerated movements that defined the original Slimer. As Siegel explains, “In the first film, Slimer’s jaw was controlled manually, allowing for more exaggerated, chaotic movements. The sculpture was extremely soft and flexible. There was no structure in the lower jaw at all. Just a little metal rod in the lower lip and a puppeteer down below could pull it, just stretch that rubber way wide, twist it from side to side and get a whole variety of expressions, make him chew and stuff…. While the servos and pneumatics we used in Ghostbusters 2 gave us more precise control over the facial expressions, they also introduced limitations in terms of flexibility and range.”

The head wasn’t the only challenge. In trying to replicate the exaggerated, cartoon-like appearance and movements of Slimer’s body from the animated series, the crew encountered more hurdles.

The original Slimer head on display at Lucasfilm headquarters in San Francisco.

Innovating Slimer’s Body Design

While the animatronics used for Slimer’s facial expressions were groundbreaking, if increasingly troublesome, the team also had to experiment with new ways to animate Slimer’s body. Tim Lawrence proposed constructing the body out of spandex with bean-bag-like filling, aiming to give Slimer a more fluid, exaggerated range of motion similar to the stretch-and-squash effect seen in his cartoon form.

However, this idea quickly ran into practical issues, as Siegel explains. “It might have been a couple of days before we were shooting and Dennis Muren came in and looked at the whole puppet assembled, and he wisely said well that spandex is going to look entirely different on camera than that rubber head. For some reason that had never occurred to anyone before. So in a mad rush we took that spandex bean bag body into our spray ventilation booth, and I had to mix up big batches of foam latex and we actually spatulated it onto that entire body. And that’s really hard to do because the foam latex has a limited time before it sets. And then it had to be baked in an oven. So it was thrown together at short notice in less than one day. When the rubber was cured over the bean bag it made the body a lot less stretchy and flexible than Tim had intended it to be.” The problems were beginning to mount.

Robin Shelby tests the Slimer body costume. (Credit: Columbia/Sony & ILM)

Robin Shelby: The Heart Inside Slimer

While ILM envisioned the technological advancements to play a key role in bringing the reimagined Slimer to life in Ghostbusters 2, it was Robin Shelby (then Robin Navlyt), the performer inside the suit, who truly embodied the character’s spirit.

Already known to ILM for her role as a troll in Ron Howard’s Willow (1988), she took over the role of Slimer for Ghostbusters 2 after the original actor, Bobby Porter, became unavailable. As Shelby recalls, “They had someone cast, then they wrote Slimer out of the script…and then they wrote him back in, but the original actor had taken on another project.” At just 20 years old, Shelby was tasked with bringing a new version of Slimer to life, despite the suit’s heavy and cumbersome design. But she was up for the challenge!

“I grew up doing musical theater, a lot of dance. So I was very aware of my body…and that helped a lot. I didn’t have any stunt experience at the time, but a lot of movement and dance experience,” remarks Shelby. Reflecting on her first impression of the suit, she adds, “They were still building it when I came in. It wasn’t all painted and set. They had to do a cast of my face and head so they could fit it to me. But when I first saw it with the motors, it was a little scary. The weight was extraordinary. But, the crew was amazing.”

With Shelby performing inside the suit and the expressions operated remotely, production became more efficient. By using a bluescreen and having Shelby wear a black leotard, ILM eliminated the need for puppeteer removal in post-production, just as originally planned.

Shelby and the team had about five to six weeks of rehearsal to help her adjust to the suit and coordinate with the puppeteers operating the animatronic features. The suit itself came in three interlocking segments: the main body, the gloves for the hands and arms, and the head. “I couldn’t see anything really. So what we would do is rehearse, they would shoot it, and then they would have me watch it. So I could see what it was all looking like. So, I knew in my head what we were all doing,” Shelby explains.

The physical demands of the suit were intense. Al Coulter praises her resilience, noting that the weight of the suit left marks on her nose: “As soon as you said action, she was right back there, just banging it out every time. Amazing!”

“The suit was probably over a third of my own body weight,” Shelby recalls. “I probably weighed like 95 pounds when we shot that, and it was probably 35 pounds. People ask, was it hot? It was hot, but probably the worst part of it was the weight.” 

Michael C. Gross, the executive producer, visited the set to see how Shelby’s performance was going. “He said, ‘Don’t be the dancer that I know you are, just get in there and be gritty and be mean. Just go out and have fun.’ So I was just trying to rough it up a lot on the set, make it not so dainty or perfect or dance-like, just to try and make it work for the character. It was so much fun, and they really allowed me to play with it,” Shelby enthuses.

Still, even enthusiasm has its limits. “We’d worked for about an hour, and they’d say, okay, we’re gonna take a break. They’d take the head off. I wouldn’t get out of the costume, but they’d take the head off so I could have water, get some air, and sit down. There was a time that I pushed it because we were in the middle of the scene and I didn’t want to stop. They’re like, ‘Are you okay? You’re alright?’ I’d say, ‘Yeah, yeah, let’s just keep shooting. We’re almost done.’ And then Tim is directing me, ‘Okay, Robin, we need you to turn around and go left. Robin? Robin!!’ And I wasn’t even answering. ‘Get her out,’ they shouted.

“You try to be the trooper…when you’re new and just want to please everybody. But lesson learned, yeah, absolutely,” Shelby admitted. “But I’d do it all again,” she adds.

Despite the technical challenges and physical demands, there were plenty of lighthearted moments on set as well.


Bill Murray’s Antics

Bill Murray was known to be an unpredictable presence on set, providing some much-needed levity during the intense production process. One day, he arrived at the effects shop. “I didn’t realize how tall Bill Murray is (he’s 6’ 2”),” says Siegel. “And he was messing around with Robin, who’s tiny (4′ 11″), and he was picking her up like a child, and dancing around with her. He was hilarious.” For Shelby it was a surreal moment, “I was a big Saturday Night Live fan, I still am. And so he was one of my heroes at the time…so it was pretty amazing…. He asked if he could pick me up. And he picked me up over his head…. He was actually very sweet to me. You just never knew what he was going to do next.”

Effort vs. Outcome

With rehearsals helping the crew find the limitations of the suit and the animatronic head, they began to hone the performance, with some impressive results.

As Tim Lawrence told Cinefex, “Once we saw the subtlety of the expression that was possible, Slimer suddenly had an incredible life to him that I had never seen in such a character before. To see his face light up from very sad to very happy was a wonderful thing. The scene I was most happy with was one that they just threw at us. I wasn’t sure we could even do it because it was a 30-second shot without a cutaway. In it, Louis gets off the bus and heads off down the sidewalk. At this point, Slimer and he are on friendlier terms. Suddenly, Slimer enters frame, rushes intently up to Louis and pats him on the shoulder. From his motions, it is obvious he wants to go with Louis really badly, but Louis tells him he can’t and Slimer gets all sad. Then Louis tells him something that makes him happy, and Slimer gives Louis a big wet kiss with his tongue coming out and licking him. Then he does a spin and flies off. Well, we did that all in one cut and it looked wonderful. I had never seen a rubber character do what Slimer had done.”

“For that scene, they gave me a tape of it because it was shot in New York. And I had to listen to the dialogue so I could know the exact timing. I had probably listened to that hundreds of times just to get Rick Moranis’ dialogue and timing,” explains Shelby.

“Michael just flipped – he thought the performance was excellent. But at the same time, he told us that they might not be able to use the shot – and ultimately it did not make it into the film,” Lawrence had noted.

Despite completing all of the storyboarded shots, Slimer’s role in the final cut of the film was indeed scaled back considerably. Gross again explained in Cinefex: “Whenever he was in there, it seemed like he was really an intrusion. At first we thought the answer was to add more of him, so we had an ongoing confrontation between Louis and Slimer in which Louis was constantly trying to catch him. We thought it would be funny and at screenings we expected the audience to cheer and laugh when they saw him again. But nothing. No reaction. The audience was looking at it as a fresh movie. There were a lot of kids who loved to see him, so we knew we could not abandon him completely, but he never really worked with the audience the way we expected. Ultimately we decided less was better, and in the final film we limited him to two very quick shots.”

Siegel takes a philosophical approach, “From my own experience working in the business as long as I have, I just assume that some of the work’s gonna be cut…. His presence in the movie was questionable from the beginning. So again, I wasn’t surprised if some of his shots were removed.”

The disappointment is palpable for Shelby. “I think that’s probably the most bummed out I was…. Everybody just did such a great job on putting that all together.” But for the 20-year-old, little did she know that one day she’d get a call from Paul Feig to reprise the role in the 2016 reboot Ghostbusters: Answer the Call, this time providing the voice for “Lady Slimer.”


A Legacy of Experimentation

The experience of working on Ghostbusters 2 was always about the spirit of experimentation. Slimer’s evolution from the chaotic “ugly little spud” in the original Ghostbusters to a more cartoonish, mechanized character in Ghostbusters 2 stands as a testament to ILM’s relentless pursuit of innovation. Despite the technological limitations of the time, Slimer’s creation helped pave the way for future advancements in animatronics and practical effects. As Siegel concludes, “Every project has its challenges, but the lessons you learn set the stage for the next big breakthrough.”

While ILM pushed the envelope with cutting-edge animatronics, the process also highlighted the enduring importance of human performers. As Coulter reflects, “We overreached a bit. The software itself was very rudimentary. Everything was so experimental back then.” He highlighted that, despite the ability to program precise facial movements, human performers remained more adaptable and agile in responding to the creative needs of a scene. “At one point they brought the director and he looked at it and kind of went, ‘Could you make him incredulous at this one point?’ Er…. We don’t have an incredulous button here. It’s like turn the computer off, bring the puppeteers back in, and off we go again. A computer is not going to have any idea how to convey anger or emotion,” Coulter remarks, noting that even today, animators still rely heavily on human actors for motion capture, using them as the source for animation.

An Ongoing Partnership Between Practical and Digital

ILM’s current director of research and development Cary Phillips explains that physical puppets still hold a vital role in modern productions. “We often get called on to build digital models of physical puppets that perform on set, to execute performances that the physical models can’t. Grogu [from The Mandalorian] is a recent example. Physical models are an inspiration for the actors and everyone on set, as well as for animators who bring the digital version to life.”

He adds that some directors also prefer digital puppets that retain the movement style of their physical counterparts. “I think our human eyes are attuned to certain qualities of movement that we find appealing and comfortable because they suggest a physical medium at work. But that’s done by hand; there’s usually no automatic connection between the physical model and the digital.”

The challenge remains how to make a puppet, digital or physical, feel alive. “A frequent criticism of computer animation, sometimes legit and sometimes not, is that it can look too polished and smooth,” says Phillips, “lacking the spontaneity of a live performance, the unintentional quirks that make a character seem alive. Great animators can create this, but it’s hard. That’s one of the lasting appeals of motion capture, although it also introduces an entirely new set of technical challenges and limitations. Ideally, capture devices are simply an alternative to the keyboard and mouse as a way of describing movement, for use when appropriate.”

Phillips further reflects on the legacy of those who came before him and the evolving boundaries, or lack thereof, in modern visual effects. “Discovery is a vital part of the creative process. Something might feel like a mistake while it’s happening but turn out afterwards to have an appealing quality. The best tools let artists experiment quickly and work iteratively. One of the benefits of a computer graphics model is that it can do things that a physical model can’t, and we often get asked to make models and characters move in ways that violate the laws of physics. Leap tall buildings in a single bound. Cheat to get the action in the frame. So, there are no absolute boundaries—you can make it do anything. Even move in a way that would rip a real person apart. It’s an awesome power, but it takes real artistry to keep it looking plausible and appealing, even if it doesn’t look technically ‘real.’”

At Lucasfilm and ILM’s headquarters at the Presidio in San Francisco are halls lined with artifacts from the company’s rich history—matte paintings, spaceship models, and optical effects equipment. And around one corner, encased in acrylic, lies Slimer from Ghostbusters 2. His still vibrant green latex skin, now shrunken with age, reveals the servos and pneumatic cylinders beneath. It serves as a poignant reminder to all who pass by that character animation has deep roots in the physical world.

Jamie Benning is a filmmaker, author, podcaster and life-long fan of sci-fi and fantasy movies. Visit Filmumentaries.com and listen to The Filmumentaries podcast for twice-monthly interviews with behind the scenes artists. Find Jamie on X @jamieswb and as @filmumentaries on Threads, Instagram and Facebook.

Make room for some stellar content and southern hospitality from Industrial Light & Magic in the heart of Texas at SXSW 2025! Find us on the schedule from March 7-15, where we are teaming up with the minds behind ABBA Voyage, alongside the Dead & Company and U2:UV’s Las Vegas Sphere experiences. We’ll explore how performing artists can leverage cinematic and filmed entertainment to drive forward the artform and meet evolving audience expectations. You’ll learn about the creative and practical challenges posed by different physical spaces as well as the crucial role of cross-functional team collaboration in crafting extraordinary communal experiences. It’s an opportunity for music fans to ask questions about the future of live entertainment.

This morning the Hollywood Professional Association unveiled the HPA Award Creative Category nominees and ILM received seven nominations across three categories. In the Outstanding Visual Effects – Live Action Feature category, ILM received four of the five nominations.

For the Live Action Feature category, ILM was nominated for Alien: Romulus (Nelson Sepulveda-Fauser, Ale’ Melendez, Sebastian Ravagnani, Nicolas Caillier, Steven Denyer), The Creator (James Clyne, Trevor Hazel, Keith Anthony-Brown, Danielle Legovich, David Dally), Deadpool & Wolverine (Vincent Papaix, Georg Kaltenbrunner, Alexander Poei, Ziad Shureih, Russell Lum), and A Quiet Place: Day One (Malcolm Humphreys, Jordan Harding, Charmaine Chan, Michael Lum, Steve Hardy).

For Outstanding Visual Effects – Animated Feature, ILM received a nomination for Ultraman: Rising (Hayden Jones, Stefan Drury, Sean M. Murphy, Mathieu Vig, Kyle Winkelman). This nomination was one of just two in this category; our friends at Pixar received the other nomination for Inside Out 2.

In the Outstanding Visual Effects – Live Action Episode or Series Season category, ILM received two of the five nominations for: Loki – Season 2 (Steve Moncur, Christian Waite, Jeremy Sawyer, Ben Aickin, Pieter Warmington) and Percy Jackson and the Olympians – Season 1 (Erik Henry, Matt Robken, Jeff White, Jose Burgos, Donny Rausch).

The HPA Awards Gala will take place on November 7, 2024, at the Television Academy’s Wolf Theater in Hollywood.

Take a deep dive into the history and lore behind the starship designs created by ILM and introduced 40 years ago in The Search for Spock.

By Jay Stobie

Written and produced by Harve Bennett, Star Trek III: The Search for Spock (1984) afforded actor Leonard Nimoy his first opportunity to direct a Star Trek feature. With Ken Ralston as visual effects supervisor, the film also supplied Industrial Light & Magic with the chance to leave its own indelible legacy on the Star Trek franchise. ILM’s work on Star Trek II: The Wrath of Khan (1982) had included a collaboration with the Lucasfilm Computer Division which yielded the first all-CG sequence in a feature film, yet the company had an even greater impact on the film series’ third installment.

Among its many contributions to Star Trek III, ILM tackled the monumental task of designing and building five major starship and space station models that were introduced in the film. Though crafted specifically for this project, those steadfast exterior designs became staples in the Star Trek universe and appeared in prominent scenes across numerous films and television series. As we celebrate The Search for Spock’s 40th anniversary, let’s examine the long-lasting nature of ILM’s iconic creations and explore the circumstances in which they were employed in later Star Trek productions.

The Merchantman starship flies through space in Star Trek III.
The Merchantman in Star Trek III: The Search for Spock. (Credit: Paramount Pictures)

The Merchantman: A Criminal Craft

A small, boxy vessel with a curved forward section lurked in deep space during the first act of Star Trek III, referred to as a merchantman by the film’s script. The ship carried a Klingon passenger (Cathie Shirriff) who had purchased intelligence related to the terraforming device known as Genesis. A much larger Klingon ship (more on that in a moment) lowered its cloaking device, becoming visible long enough to receive the data. Unfortunately for the merchantman, the Klingon operative had glanced at the information herself, prompting the warship to swoop around and obliterate the smaller vessel with its weaponry.

From the earliest stages of pre-production on Star Trek III, the team at ILM — including Ralston, visual effects art directors Nilo Rodis and David Carson, supervising modelmaker Steve Gawley, and modelmaker Bill George — presented their creations to Nimoy and Bennett, who suggested alterations before final approval. Rodis and Carson generated concepts, while Gawley and George offered input and spearheaded model construction. The meticulous process was adaptable to each model’s role in the script, as the merchantman’s brief appearance meant it was fabricated in a relatively short amount of time. “The merchant ship was a design we threw together in a couple of weeks from a bunch of model parts,” visual effects cameraman Donald Dow told writer Brad Munson in Cinefex. “It was going to be blown up right at the very start, so there was no sense putting a lot of time into it.”

Camera operator Selwyn Eddy photographs the Merchantman miniature using ILM’s “Rama” motion-control camera.
Camera operator Selwyn Eddy photographs the Merchantman miniature using ILM’s “Rama” motion-control camera. (Credit: Industrial Light & Magic)

Yet, for a vessel not expected to see much screen time, the merchantman ultimately proved to be a testament to ILM’s dedication to quality, as the ship fulfilled its purpose in the film and went on to enjoy a second life in future productions. Boasting slight modifications in each instance, the merchantman reappeared as different vessels on six occasions. From a Sheliak transport carrying colonists in Star Trek: The Next Generation’s (1987) “The Ensigns of Command” to a Cardassian freighter targeted by saboteurs in Star Trek: Deep Space Nine’s (1993) “The Maquis, Part I,” the merchantman turned into a reliable resource for both series, as well as for Star Trek: Voyager (1995). In an intriguing twist, the merchantman — best known for being destroyed by a Klingon Bird-of-Prey in The Search for Spock — was even reconfigured to become a Klingon vessel in Deep Space Nine’s “Rules of Engagement.”

The Klingon Bird-of-Prey in Star Trek III: The Search for Spock.
The Klingon Bird-of-Prey in Star Trek III: The Search for Spock. (Credit: Paramount Pictures)

The Klingon Bird-of-Prey: A Fearsome Fighter

An imposing warship with a head-like bridge section and angled wings, the Klingon Bird-of-Prey easily outmatched the merchantman. Commanded by a Klingon named Kruge (Christopher Lloyd), the Bird-of-Prey was armed with a cloaking device that concealed it from its enemy’s scanners. Kruge sought the power of the Genesis device, traveling to the Genesis Planet and making quick work of the U.S.S. Grissom. Despite its swift victories over lesser foes, the Bird-of-Prey soon found itself squared off against the legendary U.S.S. Enterprise. Of course, unbeknownst to Kruge, James T. Kirk’s famed vessel had been severely damaged in Star Trek II and only maintained a skeleton crew on its bridge.

Modelmaker Bill George at work on the Bird-of-Prey miniature.
Modelmaker Bill George at work on the Bird-of-Prey miniature. (Credit: Industrial Light & Magic)

Perhaps the most distinctive starship ILM assembled for Star Trek III, the Klingon Bird-of-Prey model featured an intimidating green color scheme and motorized wings that could be raised above its primary hull. On top of bringing the vessel’s exterior to life, ILM pioneered the visual effect that permitted the Bird-of-Prey to decloak and become visible. “[Optical photography supervisor] Ken Smith came up with the optical effect,” Ralston shared with Nora Lee in American Cinematographer. “By using a ripple glass he threw the color sync off on each separation, so that everything is just a little out of whack. Then it all gets in sync and forms the ship.” The design impressed creatives to such a degree that, following the U.S.S. Enterprise’s destruction (yet another visual effect executed by ILM) in The Search for Spock, Kruge’s captured Bird-of-Prey — playfully renamed the H.M.S. Bounty by Kirk’s defiant crew — inherited the role of hero ship in the film’s Nimoy-directed sequel, Star Trek IV: The Voyage Home (1986).
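Ralston’s description of the ripple-glass decloak — each color separation thrown out of sync, then brought back into register so the ship “forms” — has a simple digital analogue. The Python sketch below is hypothetical and only illustrates the idea of per-channel offsets converging to zero; the original effect was achieved optically on film, not in software.

```python
def separation_offsets(t, max_offset=8.0):
    """Illustrative model of the decloak: at t = 0.0 each color
    separation is displaced in its own direction (standing in for the
    ripple-glass misalignment); as t approaches 1.0 the displacements
    shrink to zero and the separations line up, 'forming' the ship.
    Offsets are (x, y) shifts in arbitrary units."""
    amount = max_offset * (1.0 - t)
    return {
        "red":   (amount, 0.0),   # red record pushed right
        "green": (0.0, amount),   # green record pushed down
        "blue":  (-amount, 0.0),  # blue record pushed left
    }

# Start of the decloak: separations fully out of register.
start = separation_offsets(0.0)
# End of the decloak: all three separations aligned.
end = separation_offsets(1.0)
```

Compositing each channel with its offset at successive values of t would reproduce the fringed, out-of-phase look that resolves into a solid image — the digital cousin of what Ken Smith achieved on the optical printer.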

Camera operator Selwyn Eddy shoots the Bird-of-Prey miniature while camera operator Ray Gilberti looks on.
Camera operator Selwyn Eddy (right) shoots the Bird-of-Prey miniature while camera operator Ray Gilberti (left) looks on. (Credit: Industrial Light & Magic)

However, the Bird-of-Prey’s prolific career was only just beginning. The ship’s signature profile played key parts as other nefarious Klingon vessels across the next three Star Trek films — Star Trek V: The Final Frontier (1989), Star Trek VI: The Undiscovered Country (1991), and Star Trek Generations (1994) — and popped up in numerous The Next Generation, Deep Space Nine, and Voyager episodes. As with many starships that began as physical models, the Bird-of-Prey was ultimately supplemented with a CG build in the latter stages of Deep Space Nine’s seven-season run. The craft even ended up in animated configurations for Star Trek: Lower Decks (2020) and Star Trek: Prodigy (2021). Nevertheless, all the Bird-of-Prey models that followed were based on the look established by ILM’s initial build. Furthermore, the 22nd century iterations of the Bird-of-Prey and Klingon D5-class variants which debuted in Star Trek: Enterprise (2001), a prequel series set over 100 years before The Search for Spock, were tailored to reflect their lineage as in-universe predecessors to ILM’s original Bird-of-Prey from Star Trek III.

Earth Spacedock in Star Trek III: The Search for Spock.
Earth Spacedock in Star Trek III: The Search for Spock. (Credit: Paramount Pictures)

Earth Spacedock: A Safe Haven in Space

As the U.S.S. Enterprise glided through the solar system on its way to a much-deserved respite from action, it was greeted by the sight of Earth Spacedock. With a mushroom-shaped upper section atop a stem extending downward, the gargantuan space station permitted entire starships to enter its massive superstructure and dock at a central core complete with repair facilities. Abuzz with ships and various shuttles, the lively starbase watched over Earth and kept the Federation’s fleet ready to serve missions of exploration and defense.

ILM’s Spacedock assignment necessitated three separate builds: the station’s illuminated exterior, its cavernous interior docking bay, and an interior view through the windows of a small, lounge-type set. Approximately five feet tall and three-and-a-half feet in diameter, the exterior model relied on a complex lighting system, which Ralston described in American Cinematographer. “[The Spacedock exterior] had lights inside after the door opens up and running lights that go inside. Sometimes it is hard to sync up all those functions with the motion control system. But I think it worked nicely.”

The issue of conveying the sheer size of a docking area able to house a multitude of starships received ILM’s innovative attention and expertise. “We found that the interior demanded some degree of atmospheric haze, even though there probably wouldn’t be any in outer space. It just needed help to look slightly degraded — not so crisp and clean,” visual effects cameraman Scott Farrar shared in Cinefex. “We ended up using blue gels on the lights and shooting in smoke for the basic fill look. Then, when we went to the light passes, we used a diffusion filter.”

ILM modelmakers work on the lighting components of the Earth Spacedock miniature.
ILM modelmakers work on the lighting components of the Earth Spacedock miniature. (Credit: Industrial Light & Magic)

As timeless as Earth Spacedock’s inaugural performance turned out to be, the station’s unveiling soon led to its return to the big-screen. In addition to being featured in the three Star Trek films which followed immediately after The Search for Spock, Earth Spacedock appeared as several other Federation starbases — Starbase 74, Lya Station Alpha, Starbase 133, and Starbase 84 — in The Next Generation via the use of stock footage. A version of Earth Spacedock seemed to be in the midst of orbital construction in the Star Trek: Discovery (2017) episode “Will You Take My Hand?,” while the design was translated into animated form to represent Douglas Station in Lower Decks. According to in-universe lore, Earth Spacedock was retired from service and transported to Athan Prime, where it was last seen as the central hub of the Fleet Museum in Star Trek: Picard’s (2020) third season.

The U.S.S. Excelsior in Star Trek III: The Search for Spock.
The U.S.S. Excelsior in Star Trek III: The Search for Spock. (Credit: Paramount Pictures)

U.S.S. Excelsior: The Transwarp Testbed

Dubbed “The Great Experiment,” the U.S.S. Excelsior acted as a testbed for an advanced faster-than-light propulsion system known as the transwarp drive. The Excelsior was spotted while berthed in Earth Spacedock, though the starship soon found itself attempting to engage its experimental engines as it pursued Admiral Kirk’s unauthorized departure aboard the Enterprise. Unfortunately for the Excelsior, Montgomery Scott (James Doohan) — the Enterprise’s chief engineer — had sabotaged the transwarp system, causing the vessel’s trial run to stall out in an abrupt and unflattering fashion.

As outlined in Star Trek: The Official Starships Collection, early U.S.S. Excelsior concepts devised by Nilo Rodis and David Carson led to Bill George’s own distinctive study model and a 7 ½-foot studio model constructed with the oversight of Steve Gawley. Our first encounter with the starship coincided with the Enterprise’s arrival at Earth Spacedock, resulting in an arduous challenge for ILM — Excelsior needed to appear stationary within the confines of the station’s interior. “[The Excelsior] was shot separately from everything else. [Visual effects cameraman] Sel Eddy shot that stuff,” Ralston told American Cinematographer. “We had to match the moves so that it looked like it was locked right into the space dock. It was a pain. We had to cheat on some of the shots where there was so much trouble with the moves.” Their diligence paid off, as the majestic sequence endures as one of The Search for Spock’s most awe-inducing visuals.

The Excelsior returned in The Voyage Home and The Final Frontier, but it received its biggest chance to shine in The Undiscovered Country, which also featured visual effects by ILM. Now captained by Hikaru Sulu (George Takei), the U.S.S. Excelsior rescued the U.S.S. Enterprise-A during a crucial battle against a rogue Klingon Bird-of-Prey. The model was heavily modified for fresh cinematic escapades in Star Trek Generations, then bearing the legendary registry of the U.S.S. Enterprise-B. The Enterprise-B variant was also utilized as the U.S.S. Lakota, an upgraded Excelsior-class vessel, in Deep Space Nine’s “Paradise Lost.”

ILM’s Excelsior design prevailed via cameos in The Next Generation, as exterior shots of the vessel — now deployed to represent an entire line of Excelsior-class starships — debuted in the show’s first and second season premieres, “Encounter at Farpoint” and “The Child.” These views were subsequently reused as stock footage to depict various Excelsior-class ships in no less than ten additional episodes of the series. As with the Klingon Bird-of-Prey, ILM’s original Excelsior model served as the basis from which all future Excelsior-class physical and CGI builds stemmed. Deep Space Nine aficionados will point to the abundance of Excelsior-class vessels dispersed throughout Dominion War-era battles in “Sacrifice of Angels,” “Tears of the Prophets,” and the series’ finale, “What You Leave Behind,” as evidence that the starships were an integral part of Starfleet’s defense armada. In fact, at least three Excelsior-class vessels stayed in active service long enough to have been prepared to confront the vaunted Borg Collective in Voyager’s own season finale, “Endgame.”

The U.S.S. Grissom in Star Trek III: The Search for Spock.
The U.S.S. Grissom in Star Trek III: The Search for Spock. (Credit: Paramount Pictures)

U.S.S. Grissom: A Scientific Scout

On a research mission to study the Genesis Planet, the U.S.S. Grissom was classified as a relatively small science vessel. After detecting an anomalous lifeform on the planet’s surface and beaming down a landing party consisting of Lieutenant Saavik (Robin Curtis) and Doctor David Marcus (Merritt Butrick), the Grissom remained tragically unaware as Kruge’s Klingon Bird-of-Prey approached under cloak and jammed all outgoing transmissions. The Bird-of-Prey dropped its invisibility field and coalesced into view, pouncing on the Grissom and destroying the Starfleet ship with a single blast.

The Roddenberry Archive notes the U.S.S. Grissom was yet another Star Trek III design conceived of by Nilo Rodis and David Carson and built by Steve Gawley and Bill George. The Grissom stood as a departure from the traditional Starfleet aesthetic in which a ship’s primary saucer was affixed to its secondary hull by a neck-like connection. A gap separated the two elements on the Grissom, with the only structures linking them being thin pylons extending from the vessel’s warp nacelles. The ship’s tragic fate didn’t merely come down to creating the biggest explosion, as plot considerations factored into ILM’s take on the Grissom’s destruction. “I didn’t think we should do something flamboyant at that point,” Ralston pointed out in Cinefex. “If we played all our best cards at the start, we’d have nothing left to show when it came time to blow up the Enterprise.”

The Grissom’s grisly demise did not spell the end for the distinctive vessel, as the model functioned as the template for what would become known as the Oberth-class starship line. The design reemerged as a different ship of the same class berthed within Earth Spacedock in Star Trek IV before earning a recurring spot as a variety of Oberth-class ships that encountered the U.S.S. Enterprise-D in seven episodes of The Next Generation. The design garnered a great deal of attention in “The Pegasus,” an episode in which it was presented as the U.S.S. Pegasus, a testbed for an illegal Federation cloaking device. One Oberth-class ship assisted in the rescue of the Enterprise-D’s surviving crew at Veridian III in Star Trek Generations, while others could be found in the background at the Battle of Wolf 359 in Deep Space Nine’s “Emissary” and the ILM-orchestrated Battle of Sector 001 in Star Trek: First Contact. Like Earth Spacedock and the Klingon Bird-of-Prey, the Oberth-class design found itself turned into animated form for Lower Decks, this time in the episode “First First Contact.”

Director Leonard Nimoy (center) confers with visual effects supervisor Ken Ralston (left) and visual effects art director David Carson (right) during a visit to ILM’s Kerner facility. (Credit: Industrial Light & Magic)

The Search for Spock’s Legacy

Crafting memorable starships and space stations for any production is a tremendous responsibility, yet Industrial Light & Magic’s contributions to Star Trek III: The Search for Spock accomplished this lofty goal and so much more. Having not one, but five major designs go on to resurface in significant roles is an achievement beyond all expectations. A recent scene in Star Trek: Picard’s third season exemplified ILM’s incredible feat, as Kruge’s Klingon Bird-of-Prey and the U.S.S. Excelsior were both positioned around Earth Spacedock as part of the Fleet Museum’s honorary assemblage of classic starships. The everlasting nature of the designs speaks to the eternal appeal of ILM’s work. Whether the new studio models that ILM designed and built for Star Trek III were reused as they were originally constructed, recreated by other visual effects companies at a later date, or called upon by future artists to inspire their own takes on starships, the original models’ extensive influence on the Star Trek universe cannot be overstated.


Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

The ILM veteran worked on projects like Jurassic Park and The Mask, and helped create a special Halloween poster that inspired the moniker for ILM’s new podcast.

By Lucas O. Seastrom

Listeners of the premiere episode of Lighter Darker: The ILM Podcast may be curious where the show found inspiration for its name.

Twenty-seven years ago, in 1997, Industrial Light & Magic was in the midst of the digital renaissance in visual effects, with projects as diverse as Men in Black, Contact, and Titanic being released that year, and others like Deep Impact (1998), Saving Private Ryan (1998), and Star Wars: The Phantom Menace (1999) well underway. ILM’s then-home, the Kerner facility in San Rafael, California, was a bustling center of creativity across both digital and practical disciplines, and arguably the most exciting time of year came during Halloween season, when hundreds of employees and their families gathered for an annual costume party.

Benton Jew had been with the ILM Art Department since 1988, working as a storyboard and conceptual artist on everything from Memoirs of an Invisible Man (1992) to the fabled television commercial where basketball star Charles Barkley plays one-on-one with Godzilla. Later, his responsibilities grew as he became a visual effects art director on projects like The Mask (1994). But there was always time for side projects in between work for clients, and in 1997, Jew was asked to illustrate that year’s invitation and poster for the ILM Halloween Party.

The 1997 ILM Halloween Party poster, featuring artwork by concept artist Benton Jew. An artist is seen clutching his computer while horror characters tell him "lighter" and "darker."
The poster for the 1997 ILM Halloween Party, designed by Mark Malabuyo and illustrated by Benton Jew.

Designed by the late Mark Malabuyo, an in-house graphic designer at the time, the poster was envisioned in the style of the classic EC Comics of the mid-20th century, known for their horror and science fiction stories. This “issue,” dated October 1997, is titled “Tales of Terror: Attack of the Nitpickers,” with the subtitle, “Producers! FX Supervisors! Art Directors! Nitpickers all!” Jew’s illustration below depicts a visual effects artist surrounded by figures caricatured in the horror style. He frightfully grasps his computer as the onlookers share their feedback about his work. “…Lighter…” one says, while someone else counters with “…Darker…” Yet another recommends, “…Split the difference…”

This tongue-in-cheek bit of satire about the collaborative process of visual effects would have inspired a chuckle from just about everyone at the company, and two of its word bubbles have now become the namesake of ILM’s new podcast. “I just wanted to capture that look of someone when a person comes by their workstation and points something out, or someone’s expression in dailies,” Benton Jew tells ILM.com. “It seemed like that kind of situation came up a lot in CG. Okay, no one can make a decision, let’s split the difference.

“Especially as an art director, I think I’d been seen as someone who would pick out little things,” Jew continues. “I’d be on The Mask or something, and would tell an artist, ‘Oh those icicles, they’re not quite right,’ or whatever it is. I’m sure that anybody who’s worked in CG can relate to that. People are always pointing and giving you backseat directions.”

Artist Benton Jew.

Known within the department for his versatile and prolific output, Jew was also a lifelong comics fan, an attribute that earned him the Halloween Party assignment. “I was sort of the resident comic book geek,” he explains, “and obviously a Halloween piece would have an EC Comics theme to it. I tried to be in the spirit of artists like Jack Davis, Jack Kamen, and Graham Ingels. Mark Malabuyo was the graphic designer on it. He was a wonderful guy, so easy to work with. He was really jovial and friendly. We all miss him. He was set on making sure that the graphics had a fidelity to the old EC stuff. He made it as close as he could, with obviously some differences.”

Growing up, Jew had first aspired to be a comics artist. Then, as he puts it, “Star Wars happened.” The 1977 feature film launched the cinematic dreams of many younger viewers at the time, including Jew and his twin brother (who also became an illustrator). “We saw all the books on the making of Star Wars with Joe Johnston’s storyboards and Ralph McQuarrie’s drawings, and got hooked into amateur filmmaking. For people who grew up in that era when Star Wars came out, it really sparked a craze for people to want to be filmmakers.”

While studying at the Academy of Art in San Francisco with teachers like celebrated poster artist Drew Struzan, Jew was recruited into ILM’s ranks courtesy of storyboard artist Stan Fleming, who’d contributed to projects like Indiana Jones and the Temple of Doom (1984). Jew loved cinema, but never lost his passion for comics and illustration. “When I started working there, most people were from the car design world,” he explains. “They weren’t necessarily drawing figurative work. They were doing architectural or vehicle-driven stuff. As things became more creature-based in visual effects, being a general illustrator worked well for me. I can’t draw a vehicle to save my life.”

From the beginning, Jew worked as a storyboard artist, directly applying his knowledge of comics to another mode of visual storytelling. Among others, he’d eventually board for director George Lucas on The Phantom Menace. “With George, all of us would sit and do thumbnails with him. But I’ve worked with plenty of directors like that where I’ll sit with them and draw lots of tiny thumbnails really quickly, and then I’ll go back and flesh those boards out later. With George, we met with him twice a week for quick little meetings. He’d basically tell us the story, and we’d all draw out different ideas and he’d make suggestions. Then we’d have this huge stack of thumbnails, and we’d get them in correct order, and someone like me or Ed Natividad or Iain McCaig would make finished drawings from those.”

Concept art of Milo the dog from The Mask, by Benton Jew.
An example of Jew’s concept art, a sketch of Milo the dog from 1994’s The Mask.

The digital renaissance led to a surge in projects requiring CG creature development, from early entries like The Abyss (1989) and Jurassic Park (1993), to even more ambitious projects like Dragonheart (1996) and The Mummy (1999). Jew had a front-row seat during this storied period that introduced new tools and tumultuous change. “My first real film was Ghostbusters 2,” he recalls, “and that was still done with foam and rubber and stuff like that. I got a pretty good idea of what that was like. I could see CG slowly coming into view. It was really a magical time and everything was changing by leaps and bounds.

“I would go down to ‘The Pit’ and watch Spaz [Steve Williams] creating those dinosaurs that he would later show to Spielberg and company,” Jew continues. “It was so weird when Jurassic Park was being made because you had to sit on this and not tell anybody, and you knew it was going to change the world. As the technology kept improving, it wasn’t replacing the artists and filmmakers; it was helping them. It’s about giving them the tools to make something that they couldn’t make with traditional means…. John [Knoll] would come by and ask us what we wanted to see in Photoshop. He meant for it to be a tool for us, not a replacement. Our palette was growing larger.”

Departing ILM in 2001 after 13 years with the company, Jew headed for Los Angeles where he continues to work on feature films as a storyboard and concept artist. He’s also self-published comic books of his own, as well as contributed to comics for Marvel, among others. Jew still gets questions about the memorable “Tales of Terror” poster (and remains adamant that the terrified artist clutching his machine is not based on anyone in particular). Looking back on his ILM days, Jew values the artistic lessons granted him by the experience of working on so many different assignments.

“Just the idea of having to do a lot of stuff very quickly impacts how you draw,” he concludes. “You learn to do more shortcuts, what to leave in, what to take out, and things like that. Early on, I didn’t do a lot of paintings. Most of my stuff was black and white, but I learned to do more color stuff when they asked me to do it. The volume, speed, and needs always change, so you just stay flexible. As an artist or an art director, the most important thing is not your eyes or your hands, but your ears. To understand what the director or effects supervisor wants, you need to develop your ear more than anything. It’s learning what they want and how to do it correctly. It may not be your own taste, but you need to be able to talk to them and know where they’re trying to go with it.”

Lucas O. Seastrom is a writer and historian at Lucasfilm.

We are pleased to announce Lighter Darker: The ILM Podcast, where we focus on the creative process of filmmaking and the art of visual storytelling. Hosted by ILM Chief Creative Officer Rob Bredow and ILM Compositing Supervisor Todd Vaziri, and produced by Jenny Ely, we share behind-the-scenes stories that illustrate the many crafts that come together to create a motion picture, TV series, or special venue project.

Subscribe Now: Spotify | Apple Podcasts | YouTube | Amazon Music | iHeart Radio | SiriusXM | Pandora | RSS

Whether you’re a seasoned professional, an aspiring filmmaker, or a fan of immersive experiences, Lighter Darker provides valuable insights, inspiration, and a deeper appreciation for the artists behind the projects we undertake at ILM in visual effects, animation, and immersive entertainment. We have a terrific lineup of special guest filmmakers who join the team for upcoming episodes to discuss the creative process of filmmaking and the art of visual storytelling.

The premiere episode drops on Tuesday, September 3, 2024, and will be available wherever finer podcasts are offered.


For the first time, ILM’s groundbreaking virtual production technology transports fans inside the Star Wars galaxy.

By Clayton Sandell

Patricia Burns gets ready for her closeup on the ILM StageCraft volume at D23.

Patricia Burns steps up to her mark.

Dressed in the sleek all-black uniform worn by the Third Sister Reva Sevander from Obi-Wan Kenobi (2022), she ignites her double-bladed red lightsaber and waits for her cue.

A nearby stagehand counts her down and calls “Action!”

As a crane-mounted camera swoops in, Burns crouches next to R5-D4, a red and white astromech droid, swinging her lightsaber with a fierceness only a Jedi-hunting Inquisitor could conjure. Behind her, a massive wall of LED screens displays the pristine moving image of a busy Rebel hangar.

Monitors around the stage show what the camera sees in real-time: an epic, trailer-worthy shot that makes Burns the star of her own Star Wars story.

“Oh, it was awesome,” Burns tells ILM.com as she walks off the stage, grinning. “A chance of a lifetime.”

At D23: The Ultimate Disney Fan Event, Burns and hundreds of others had the unique chance to perform on a StageCraft volume — Industrial Light & Magic’s cutting-edge virtual production technology used on dozens of projects including The Mandalorian (2019 – present), Percy Jackson and the Olympians (2023 – present), and The Batman (2022).

For the first time ever, the ILM crew assembled a volume — something normally sequestered on an off-limits studio soundstage — inside the Anaheim Convention Center just for fans attending D23.

ILM’s chief creative officer Rob Bredow and virtual production supervisor Sonia Contreras host a StageCraft workshop.

“I think everybody is blown away by the scale of this, and how immersive it actually is when you get to see it here on the show floor,” says Rob Bredow, senior vice president, creative innovation for Lucasfilm and chief creative officer of ILM.

During the three-day event, a rotating trio of scenes appeared on the volume’s giant LED panels: an Imperial hangar created for The Mandalorian, a Rebel hangar from Ahsoka (2023) and a vibrant city street on the planet Daiyu seen in Obi-Wan Kenobi.

“You’re looking at over 18-and-a-half million pixels of LED wall and a live-tracked camera,” Bredow tells ILM.com. “Wherever the camera looks, we get a high-fidelity version with exactly the right perspective for the illusion of creating an immersive environment. It looks impressive enough here at the convention center but when we collaborate with the production designer and the art department on one of our productions that’s when the technology really sings. It’s a powerful tool in the filmmaker’s toolbox that we can deploy when building standing sets on a stage or traveling the cast and crew to a far-flung location isn’t feasible.”

For D23, ILM wanted to demonstrate a fully functioning StageCraft volume exactly like the ones used on a real set.

“It’s very fun to not be faking it,” Bredow quips.

Attendees at D23 take in ILM’s StageCraft volume.

ILM virtual production supervisor Ian Milham says transporting the volume from a studio lot to the convention center took a herculean scheduling and logistical effort involving a busy team of artists, engineers, and crew members. And several large trucks.

“Everybody agreed, ‘yes, we’re really going to do it’,” Milham explains. “But that meant we had to get our real gear and our real crew here. It also meant we couldn’t be making a movie with it at that time.”

The challenge was worth it, Milham says, because it finally gave the filmmakers a chance to show off StageCraft to a wider audience.

“Film sets are amazing places,” says Milham. “But it’s not like there’s a lot of chances to really share our success. So we’re really happy to be able to show the public for the first time the cool results, but also what it takes to pull off something like this and how much teamwork and technology it takes to do it.”

ILM virtual production supervisor Ian Milham demonstrates the volume.

ILM virtual production supervisor Sonia Contreras co-hosted several StageCraft presentations with Bredow. The pair challenged the D23 audience to look at several scenes and guess which elements were created with practical set pieces and props, and which ones were generated by the volume.

“I got about a third of them right,” laughs Ryan Schwartz, who watched the demonstration with his wife Katie and sons Zachary and Jonathan. Katie says she fared slightly better, guessing about half correct.

“I’ve been following ILM for a long time, and I still try and figure it out,” Ryan tells ILM.com. “They’re so amazing in their craft that it’s so hard to really piece together what is real and what is digitally done.”

Contreras says the D23 StageCraft experience is extremely special because even some ILM employees still haven’t been able to see the volume work in person.

“I would hope that people take away that there’s a lot of brains that go into making this happen,” Contreras says, pointing to the setup’s real-time rendering, camera tracking, processing power, and an aptly named “Brain Bar” crew working behind the scenes to help make the scenery so seamlessly realistic.

“The ‘wow’ factor is when you get to see what’s actually happening, all the different things that are getting coordinated in order to make that image work,” Contreras says. “It’s really cool to be able to show it to everybody.”

Lucasfilm senior vice president and executive design director Doug Chiang made a special appearance in front of a packed crowd on Saturday to talk about StageCraft’s contribution to the long history of visual effects filmmaking.

“We rarely get to share it or talk about it, because it’s an evolving technology, and it is just a tool,” says Chiang. “But at an event like this, where we can actually finally get under the hood and share the magic with the audience, it’s just terrific.”

Lucasfilm executive design director Doug Chiang and Lucasfilm Art Department associate producer Michelle Thieme in the volume.

Frequent ILM collaborator Legacy Effects also pulled the curtain back to show how their crew helps create a Star Wars galaxy full of creatures, aliens, and droids.

“When you’ve got leaders like Jon Favreau and Dave Filoni, who just embrace everyone’s contributions, it inspires you to do the best work that you can,” says Legacy Effects co-founder and special effects veteran Alan Scott.

At D23, Scott and the Legacy team explained how they bring life to characters like the silver professor droid Huyang (voiced by David Tennant) and Murley the Loth-cat for Ahsoka. The production relies on a combination of practical puppets along with digital versions inserted later, depending on the requirements of each shot.

“There are things that I think practical can do very well, especially when it comes to the interaction with the performers,” Scott tells ILM.com. “Then there’s a responsibility that says, ‘that would be better if it was done with visual effects.’”

Legacy Effects co-founder Alan Scott (left) demonstrates a character prop with colleagues Dawn Dininger and David Covarrubias.

Bredow hopes that revealing how some of the Star Wars magic is made might inspire others, especially kids, to consider working in visual effects.

“Many people don’t even realize there are these very artistic and very technical and very creative jobs that have to do with working behind the scenes of film and television production,” Bredow explains. “So this is one of the fun things to do. To connect with fans, to connect with people who might want to make this a career.”

Cosplaying as Bastila Shan from the Knights of the Old Republic (2003) video game, Star Wars fan Carly King says she was most impressed by StageCraft’s powerful mix of creativity and engineering.

“It just looked so good on the screen. It’s so interesting to see how this whole conglomeration of electronics and technology comes together. It’s an incredible thing,” King says. “It’s one thing to watch Star Wars, but it’s another thing to be in it.”

Clayton Sandell is a television news correspondent, a Star Wars author and longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design.

After 25 years at ILM, Cooper has earned a reputation for seeking out the most efficient solutions to creative problems.

By Lucas O. Seastrom

Back in 2002, Industrial Light & Magic’s Jay Cooper was a compositing sequence supervisor on Master and Commander: The Far Side of the World (2003). For a time, director Peter Weir joined the ILM crew at their offices on Kerner Boulevard in San Rafael, California. “We had a shot when the mast of one of the ships falls over,” Cooper tells ILM.com. “There’s all this gunfire. It’s completely enshrouded in smoke. As I’m working on it, Weir comes to my desk and he says, ‘I want it to look like a beautiful nightmare.’ I was like, ‘Wow, that’s cool. Now what does that look like?’ [laughs]”

Over the past two decades, Cooper has moved into the visual effects supervisor role, working on projects as varied as Eternals (2021) and Babylon (2022). Most recently, he partnered with writer/director Gareth Edwards on The Creator (2023), a science-fiction tale with an unconventional visual effects methodology. As he and the ILM crew navigated the challenges of integrating effects into location photography with minimal reference data, Cooper managed to connect with Edwards in a way that reminded him of his experience with Peter Weir.

“Normally, as a visual effects supervisor, you’re being much more granular in your notes, lots of technical conversations,” Cooper says. “You don’t usually engage with artists in an emotional way. That’s what is really wonderful when you’re exposed to working with directors. That’s my favorite part of being a supervisor: you’re not always in the weeds talking about those details, you’re trying to engage with it at a story level. That’s the part that artists love. Gareth partnered with us in that way, and people got really excited about the project. Fun things happen when people get excited. They sneak in extra takes. They devote themselves in a huge way. We asked people to do really hard stuff without all of the support materials. If they know what we’re trying to achieve and we’re all pulling together, it can help make up for those shortcomings.”

At the beginning of the project, ILM’s chief creative officer Rob Bredow asked Cooper to meet with Edwards and producer Kiri Hart. “Gareth said, ‘Hey, I’ve got this movie and I hear you’re the guy who likes to cheat,’” Cooper says with a laugh. “He said that probably in the most affectionate way. I’m not really a devotee of any sort of process. I worship at the altar of whatever we can do as quickly and as simply as we can do it. As an artist, that was my forte. I did lighting and compositing, and I would try to navigate as many shortcuts as I could. I guess my reputation as a visual effects supervisor was that I’d work on shows with really small budgets and we’d try to wring out whatever production value we could. I think that’s why Rob put us together.”

Director Gareth Edwards operating the camera on location in Asia during production on The Creator.

Edwards’ vision and Cooper’s style were in tandem. In terms of workload, The Creator would be Cooper’s biggest project to date as a visual effects supervisor. “One of the best pieces of advice that [ILM executive creative director] John Knoll ever gave me,” Cooper notes, “was that you take big problems, break them into smaller problems, and smaller and smaller. So we created teams to hit different problems. We knew that we were going to be behind the 8-ball. We knew that Gareth had a smaller-than-desired budget, and he came to us wanting to partner in a different way.”

Edwards had been a visual effects artist himself before taking the director’s chair full-time. In his 2010 feature directorial debut, Monsters, he famously created many of the visual effects on his own. For Cooper, this practical experience helped define ILM’s approach to crafting visual effects with a “scrappy” sensibility. Shooting primarily on location in Thailand, Edwards focused on capturing his actors and the dramatic landscapes where they played out their scenes. Traditional effects tools like bluescreens and tracking markers would be almost completely avoided, and ILM would need to integrate their CG elements without the normal reference tools.

Looking into the ILM StageCraft volume during production on The Creator.

“Most of the time doing visual effects work, it’s very much a spreadsheet problem. You have seven robots at this amount of money, or fifteen environments at this scale at this amount of money. Even at the bidding stage for The Creator, we were instead asking what we could do for a certain amount of money. Just as a scrappy filmmaker, Gareth wanted to know what was possible in visual effects if we used different techniques and structured the show differently. 

“If we take a whole sequence,” Cooper continues, “Gareth would explain how there’s only so much information you can take in during one shot, so let’s put everything together, bring it all up, and water the one element that’s dying. If you didn’t feel like there were enough robots here, how much do you need to add? Where’s your eye going to go? If a frame feels empty, what can we add? Is there a way to add something that avoids a roto-nightmare? Can we structure it so we don’t see the element in one shot but we do see it in the next two shots so that you sort of complete what the image is? Loosely, that’s how we went off and did the work.”

Much of that questioning and analysis was open to the larger visual effects crew. Initially, Edwards had planned to embed himself within ILM’s studio to personally oversee the work. Although pandemic concerns ultimately scratched that idea, he still welcomed artists from deeper in the ranks to present their work directly and share ideas. 

Gareth Edwards discusses a scene with John David Washington in the ILM StageCraft volume.

“It takes a rare person to be comfortable enough to share your feedback openly with artists on the production,” Cooper notes. “It’s really wonderful. You get a level of engagement that you may not always find. Sometimes working on blockbusters, you can feel like you’re just punching numbers. But if you expose the artists to the reasoning behind something, the filmmaking intent, you get a huge level of engagement.”

As visual effects supervisor for the entire production, Cooper was busy overseeing work not only at ILM’s studios in San Francisco, London, Sydney, and Vancouver but also at the assortment of smaller vendor studios enlisted to assist on the project. The initial shot count estimate had more than doubled by the time Edwards shared his first cut. As Cooper points out, ILM contributed “about 95% of the asset work and the lion’s share of the shot work” with the support of the vendors.

“As a supervisor, I’m sort of tapping the boat,” Cooper says. “You can’t be in every single file to model the rivets. You can’t go into every composite to add the elements. You’re asking for degrees of one thing or another, and there are a lot of places where people are volunteering an idea. They’re doing it in a way that they understand what the stylistic or aesthetic goal is.”

Overall, Cooper’s experience on The Creator felt like a return to an earlier era in visual effects, one that speaks directly to ILM’s can-do spirit. “ILM tries to find projects that are outside of the comfort zone of what has happened previously. It must have been wonderful in the late ‘80s or early ‘90s when the question wasn’t ‘can you do this?’ It was, ‘is this even possible?’ Those times have ended in many different ways. You do it enough times, and there’s a cost structure around it. So it’s interesting to be on a project where you chuck a lot of that away and get back to the basest level. We have a pot of money and a director with some big ideas. That’s the launching point. It’s cool and exciting to be in that world again.”


Read more about ILM’s work on The Creator, featuring additional insights from Cooper and his team.

Lucas O. Seastrom is a writer and historian at Lucasfilm.

On Wednesday evening, the Visual Effects Society (VES), the industry’s global professional honorary society, held the 22nd Annual VES Awards, the prestigious yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games, and special venues.

ILM’s work on The Creator earned six nominations and won four awards, including the coveted top prize, Outstanding Visual Effects in a Photoreal Feature, as well as Outstanding Created Environment in a Photoreal Feature, Outstanding Model in a Photoreal or Animated Project, and Outstanding Effects Simulations in a Photoreal Feature, with a partner company winning a fifth award for the film for Outstanding Compositing and Lighting in a Feature.

ILM’s work on Darren Aronofsky’s groundbreaking film Postcard from Earth won for Outstanding Visual Effects in a Special Venue Project while The Mandalorian (Season 3) won for Outstanding Effects Simulations in an Episode, Commercial, Game, Cinematic or Real-Time Project.

“It’s a true testament to our amazing global teams that our work was honored by our industry colleagues on the winning shows noted above as well as the ILM shows that were nominated, including Indiana Jones and the Dial of Destiny, Ahsoka, Willow, Mission: Impossible – Dead Reckoning Part One, Dungeons & Dragons: Honor Among Thieves, Napoleon, Killers of the Flower Moon, and Guardians of the Galaxy Vol. 3,” said Janet Lewin, ILM general manager. “I couldn’t be more proud of our teams.”

Visual effects supervisors Jay Cooper, Andrew Roberts, Charmaine Chan, and Ian Comley take us behind-the-scenes of an unusual visual effects challenge.

By Lucas O. Seastrom

Ever since George Lucas and John Dykstra sat down in 1975 to discuss Lucas’ vision of capturing dynamic aerial dogfights between miniature spaceships for Star Wars: A New Hope (1977), Industrial Light & Magic (ILM) has made an art of solving creative problems in close partnership with filmmakers. Just as Lucas’ vision challenged ILM’s capabilities nearly 50 years ago, The Creator (2023) writer/director Gareth Edwards proposed an unconventional approach to filmmaking that would keep the visual effects crew on their toes.

Proof of Concept

Edwards first collaborated with ILM on Rogue One: A Star Wars Story (2016), channeling the same rebel spirit of Lucas’ A New Hope. Envisioning his own science-fiction tale in The Creator, he would also channel Lucas’ audacity for pushing the limits of ILM’s capabilities. It began some years ago, when he asked ILM’s executive creative director John Knoll (who supervised the visual effects for Rogue One) whether the company could assist with a test reel demonstrating Edwards’ vision for a movie about a futuristic Earth where humans and artificial intelligence lived side by side.

“Gareth and his producer [Jim Spencer] went to Asia on what he described as a scout, but he also brought a camera along,” explains Jay Cooper, who would become The Creator’s overall visual effects supervisor for ILM. “He shot in a number of different locations to create a sort of think-piece, very documentary-style footage. Then he came to us asking to put some 50 shots together, which John supervised.”

Edwards provided his footage only. There was no accompanying data, no lidar scans, no HDRI captures of environments, none of the usual resources that visual effects artists rely upon. The challenge was to integrate digital elements – characters, vehicles, and locations – into the existing footage, including the replacement of real people, or components thereof, with robotic technology. “We tried to create rapid prototypes of what shots could look like by doing them in a more heavily 2D way,” Cooper explains. “We’d take frames, do a draw-over with James Clyne, who became the film’s production designer, and with a bit of fast projection work, get them into shots. We got a really convincing look with a modest amount of effort. Gareth explained that shots that usually take two or three months of work could be seen in three or four days.”

The proof-of-concept not only sold Edwards’ backers on making the film, but gave ILM a model for developing effects on a feature-length scale in this unusual, after-the-fact method. The Creator would be shot primarily on location in Thailand with a small crew and fewer resources. “Gareth wanted to shoot this ambitious movie,” says Cooper. “The artwork was phenomenal, but the catch was that we’d be really uncomfortable because we wouldn’t be given the things we were used to. We wouldn’t stop for a clean pass every time. We wouldn’t always know what all the shots were going to be because those would be determined in the edit. There were enormous designs for the scope of the movie. This was a big swing, it was sink or swim. So off we went.”

On the Ground in Thailand

Edwards remained committed to maintaining a fast, improvisational shooting style, often handling the camera himself. He would not inhibit his ability to engage in the moment with his actors, who included John David Washington as the protagonist Joshua, a world-weary soldier in search of his lost love, and Madeleine Yuna Voyles as Alphie, an artificial simulant in the form of a young girl who acts as both the story’s heroine and MacGuffin. Instead of the usual small team of visual effects personnel, ILM would send just one representative to Thailand, visual effects supervisor Andrew Roberts.

Roberts would be responsible for both consulting with Edwards and crew, including cinematographers Greig Fraser and Oren Soffer, as well as capturing as much data for each respective shot as he possibly could. “I was there to help make sure that things were filmed in a way that would give ILM the best chance of producing great, photoreal work,” Roberts says. “I wasn’t going to get in Gareth’s way.

“Early on, we had scenes with robots and humans existing together,” Roberts continues. “I asked Gareth which of the actors would be made into robots so I could mark them. Even if we’re not putting them in the motion-capture suits, I could take measurements and make a turntable, all to give the team information. Gareth looked at me and said, ‘Don’t know.’ It wasn’t something he wanted to focus on. He would pick actors to make into robots later. I wasn’t sure how aggressively he was going to create negative space with these characters. It turned out that their bodies were more or less the same, and you’d mainly see the mechanism when it came to their arms and their heads. But I still didn’t know at the time, so I recorded the information about where Gareth was pointing the camera and determined what backgrounds I needed to capture to reconstruct a clean plate.”

Another major challenge involved the simulants, A.I. characters who appear human, save for the aft portion of their heads, which feature a bold mechanical structure. Edwards and Clyne had created initial concept art, but it was left to Roberts and Cooper to determine the best means of tracking the live actors’ facial movements on set in order to integrate digital components during post-production.

“I think the movie doesn’t work at all if you can’t get a convincing Alphie,” says Cooper. “It’s where your eye is looking. There are 400 shots of her. In prep, I pitched the idea of putting a sock over her head and dressing the edges of where the contours are so that we know exactly how to define the delineation point between where her mechanical components connect to her skin. I asked about doing makeup to address the edges, and we could fill in the rest. Gareth said, ‘No, we’re not going to do that because I need her.’ When you’re working with a child actress, there’s only so many hours you can work. He wanted her on set for every minute she could be.

“Then we had to figure out what we could do in terms of tracking dots that were low impact and didn’t interfere with the acting,” Cooper says. “I explained the ask to [layout supervisors] John Levin and Tim Dobbert, and said that I didn’t know exactly what the designs were going to be, but they said, ‘Well, let’s put some tracking dots on the bridge of her nose, one on the temple, a couple on her neck, and we think we can figure that out.’ So that’s what we did! [laughs] It’s a leap of faith.” Roberts then collaborated daily with the makeup department to place tracking dots on the simulant actors, each of whom required a unique arrangement because of their varying physiques.

“The benefit of having someone like Gareth is that he used to be a visual effects artist and he has a clear idea of what the end result will be,” explains Roberts. As an example, he explains how Edwards shot an early moment in the film when Joshua watches the suborbital ship NOMAD launch a missile at a group of small vessels just offshore. “Gareth knows that he wants NOMAD to be in frame, so he’ll frame for it and then tilt down to Joshua watching from the beach. Another director might be focused on the action in front of them, and in post they’ll ask if we can extend the frame and create a digital move. The majority of directors don’t think about those things in advance. So when I’m observing a shot like that where Gareth is tilting the camera, I’ll wait for the cut and ask him, ‘During that tilt, what are you seeing?’ Then I’ll make notes.”

After crisscrossing Thailand, often covering multiple locations in a single day, cast and crew traveled to Pinewood Studios in the United Kingdom, where ILM had constructed a StageCraft volume as part of its virtual production toolkit. There, two major sequences were captured for the end of the film, when Joshua and Alphie board NOMAD. “It takes a lot of work to do StageCraft correctly,” notes Cooper, who used the tool for the first time on this show (as was the case for Edwards). “You have to be very careful that it’s the right fit. If I think about our goal as a movie, which was to always find real locations, there were only a couple of places where there was no equivalent location, and that is space. It made a lot of sense to use StageCraft for the NOMAD’s Biosphere environment and the Air Lock, where either the scope is so large that it would be cost-prohibitive to build a physical set, or the aesthetic goals would push you into doing a full bluescreen shot.”

A few smaller scenes were shot on an adjacent Pinewood stage equipped for traditional bluescreen or greenscreen, but as Roberts points out, the crew took the chance to innovate some distinct techniques. “We had a scale portion of the missile that Joshua climbs on,” he explains. “We created interactive lighting for that by taking portions of the real-time NOMAD model from Gareth’s virtual production scouts, and animated them to enable these mechanisms pushing missiles into place. I had this little animated sequence, which I then rendered out as a black-and-white texture that had different layers of structure moving past, which imagined that the sun was out in space and these things were casting shadows. We connected that from my laptop to a 12K projector that was mounted on the set. So when John David is hanging on the exterior of the missile, we have real light interacting with him in the close-ups. That evolved quite organically.”

Altogether, principal photography in 2022 lasted some 80 days, not including an additional round of element shoots and pick-ups led by Edwards with an even smaller crew across multiple Asian countries.

A Global Collaboration

ILM’s studios in San Francisco, London, and Sydney would each make significant contributions to The Creator, with additional support from the Vancouver studio and an array of vendors. In October of 2022, Edwards came to ILM San Francisco to screen a three-hour cut of the film. “Everyone came out recognizing that it was something different and special,” recalls London-based visual effects supervisor Charmaine Chan. “It was a lot more than we thought it was going to be. Originally, we estimated around 700 or 800 shots. Watching that cut, we knew there were more than twice as many. So the question was how to handle that and deliver on time, on budget, and at the quality we always want at ILM. We had to set guidelines with Gareth about how we’ll be able to get this film across the finish line, and he was very receptive to it.”

Cooper’s proposal was an unusual “three-strike system,” where Edwards would be given three opportunities across the life of a given shot to provide notes, allowing ILM to iterate with as much focus as possible on the key elements of that shot. “That’s the optimal structure to ensure that all the money goes into getting a clear direction for the shot,” Cooper notes. “We were probably only successful doing that about 70% of the time, but there were a healthy number of shots where, after we solved the questions about the simulants, for example, Gareth knew that if we kept to those standards, we wouldn’t be chasing really small details.”

The design evolution for the simulant head mechanics resulted in an elegant approach that felt almost human. Building on techniques first employed by ILM for The Irishman, the team was able to seamlessly blend the movements of the actor’s skin with the rigidity of the rear components. “We’re trying to empathize with these simulants and understand what they’re going through,” explains Chan. “When you first see one, it’s just another human being, then they turn to profile and you realize it’s something else. Because the performances are so good, whether it be Madeleine or Ken Watanabe [Harun], you’re focused on them and feeling their emotions and you forget about all that gear.”

Many subtleties were incorporated into the headgear to complement the performances, including character-specific details, such as the battle-worn tech of Harun’s components. The animation team were responsible for creating tiers of almost subliminal movements that reflected each simulant’s emotional state. “When Alphie stops the bomb robot, for example, that’s full pelt as the mechanics whir up, which includes wonderful sound effects,” explains Ian Comley, also a London-based visual effects supervisor. “For everything else, it’s a kind of Swiss watch, cogs and gears ticking, something always active, but in a more gentle way.”

Throughout post, the ILM crew enjoyed an extraordinary level of direct access to the director and production designer. “It can be pretty rare to feel like you’re a core member of the filmmaking team,” says Comley. “I can’t think of another film where the production designer has stayed on until the very last shot. To build every robot, simulant, vehicle, prop, and structure required a lot of design. Gareth knew what he wanted and had a great relationship with James, and we were a part of that. We took initial concepts from James, tried to riff off them, and fleshed them out into assets. We could then share back directly with James, who could go in and do paintovers. With direct access, there’s no diffusion of ideas. Instead, it’s collaborative filmmaking.”

To create the full-body A.I. characters as replacements for select live actors, ILM developed seven distinct robot designs following Edwards and Clyne’s visual methodology, which combined a 1980s technology aesthetic with organic, natural influences. Each design could be made unique with specific flairs, often informed by the individual character, such as with Amar Chadha-Patel’s performance as Satra.

“Amar is very expressive in his face,” explains Chan. “When he’s talking or thinking, his eyes and eyebrows say a lot. We captured that in the actor, but how do we present that in a robot? [Animation supervisor] Chris Potter was brilliant in suggesting all of these fine details, like in the eyes, which are very tiny on Satra. You can see a slight pupil and see eye darts when he’s thinking. The mouth was also slightly hinged, so all these little characteristics of Amar’s performance can come into this robot to show his emotions.”

Comley points out the “masterstroke” of Edwards’ decision to not decide on the robot characters while filming. “Even when it came to background characters, a typical film would decide who would be a robot and kit them up in mo-cap pajamas,” he explains. “None of that on this show. Gareth got naturalistic performances because people were just moving as people. If it was a scary scene, they acted scared with those fluid motions. No one had been told they were a robot and then acted twitchy or jittered, the kinds of things you might do.

“It also gave ILM license to switch out anyone,” Comley continues. “If he had picked someone on set to be the robot, it might turn out that that person isn’t located where your eye goes in the shot. The real person you want to be a robot is on the other side. We had the freedom to do that, which was a real challenge, but we could decide with Gareth after the fact which ones to choose. As shots changed, we could keep adjusting. The matchmove and paint teams did a fantastic job. The performances were so grounded, and we did very little to change that. The last thing Gareth wanted was for us to take a brilliant natural performance and turn it into a stereotypical robot. It was mostly heads and arms. There are instances with full-body robots, but by and large, they were additions instead of replacements.”

The London crew under Chan and Comley’s supervision spent considerable time on act three aboard the NOMAD, where the key challenge was to create fully-CG assets and environments that felt akin to Edwards’ naturalistic shots on real-world locations. On a typical show, ILM often incorporates grain or lens flares to match the source photography, but for The Creator, those choices also informed creative decisions that helped bridge the divide between Earth and space, including moving the NOMAD into lower orbit, where more diverse colors and atmospheric elements could be incorporated. “It helped marry the story points where people on the ground are able to see NOMAD above,” as Comley notes.

Even in a traditional CG scenario, ILM found ways to empower Edwards’ freeform shooting style. “Besides the real-time rendering and LED walls, the StageCraft suite also includes virtual cam sessions,” explains Chan. “The whole exterior of NOMAD was pure CG, so Gareth was able to hold an iPad and look around to see the different sections of the ship and frame his shots, from the wings to the central section that we called the ‘bunny teeth.’ We saved so much time with Gareth being able to do that, rather than having us propose specific framing ideas. With Gareth being a visual effects artist, he just grabbed it and started making choices.”

At times, Edwards even embraced the most ordinary of methods to convey his vision. For the sequence when Joshua attempts to climb onto one of NOMAD’s towering missile silos, the director provided reference footage by “taking a wastepaper bin with a water bottle inside for the missile and a little LEGO figure taped on,” as Comley explains. “He shot it all with his iPhone. It had the same principles of photography that he’d applied in the v-cam. You have to feel like there is an operator discovering the events as they unfold. Gareth’s philosophy was often to think that the operator was hanging out of a fast-moving plane because the NOMAD is so big, that’s the only way you could do it.”

By the spring of 2023, ILM had completed some 1,700 shots for The Creator (a handful of which came from Edwards’ original test reel). “We made some good choices in terms of how to build this whole train set,” explains Cooper. “Maybe the most important one was that James Clyne had a concept team all through post-production. In visual effects, the expensive area is when you don’t know what you want, and you iterate multiple times and change directions. Normally there’s a bunch of concept art and you spend your time chasing that. We had existing concepts, but once the movie was shot, James kept reinterpreting it. When we’d land on an idea, we already knew the shot, the camera work, and we could deploy our resources accordingly. Sometimes it’s a 3D asset that we build because it’s going to be in 40 shots. Other times we take the art model from James’ team, put it into the shot, they paint on top of it, put it back in the shot once more, and it’s done. Not standard procedure at all. It’s all about looking for those opportunities.”

Looking for the Next Challenge

The Creator’s unconventional production methods were successful not only in terms of the efficiency of its budget and resources, but in the ability of the artists on every level to make genuine contributions to the story. That came from Edwards’ example and leadership. “Everyone wanted to be on this project to the point where someone would roll off the show and keep asking if they could do one more thing on a shot, just to make it a little better,” says Chan. “Sometimes you can feel like a cog in a machine, just pushing buttons, but this was the opposite. Everyone on every level felt that they could be creative and suggest ideas.”

ILM was established to create solutions that respect the integrity of a filmmaker’s original vision. For an artist like Comley, the willingness of the filmmaker to include ILM in that visionary process is much more important than the actual problems to be solved. “One way or another, we can paint out that thing, track that thing, come up with a creative solution,” he notes. “Throw us anything you have. I’d rather have that and the vision and richness of photography than a clinical greenscreen and a question mark.”

It was a refreshing experience for everyone, but one critically dependent on the filmmaker. “You have to be willing and able to take this gamble, and it’s hard to find both things together,” says Cooper. “There are a lot of filmmakers that are willing but because of the studio constraints around them, they’re not able. And there are others who have the money and space to do it, but don’t necessarily have the amount of knowledge required. So if you can clone Gareth, you’re in a great place! [laughs] I think there will be opportunities to work like this again. Filmmakers will come to us and say, ‘I know what my movie is, I have so many dollars, and we don’t have to hit everything that I want, but I want to hit as many as I can – can we work together?’ As a company, we’d respond well to that.”

Lucas O. Seastrom is a writer and historian at Lucasfilm.

On February 23, 2024, the Academy of Motion Picture Arts and Sciences will recognize 16 technologies for their impact on filmmaking. Two technologies that ILM played a key role in helping to develop will be among those recognized.

SciTech Awards committee chair Barbara Ford Grant said, “This year, we honor 16 technologies for their exceptional contributions to how we craft and enhance the movie experience, from the safe execution of on-set special effects to new levels of image presentation fidelity and immersive sound to open frameworks that enable artists to share their digital creations across different software and studios seamlessly.”

Former ILM engineers Christopher Horvath and Joe Ardent are being recognized alongside Lucas Miller and Steve LaVietes for the Alembic Caching and Interchange system. Alembic began as a collaborative effort between ILM and Sony Pictures Imageworks to develop algorithms for storing and retrieving baked, time-sampled data, enabling high-efficiency caching across the digital production pipeline and the sharing of scenes between facilities. The two companies open-sourced the project and its interchange library in 2011. Since then, Alembic has seen widespread adoption by major software vendors and production studios.

ILM’s Dan Bailey joins Jeff Lait and Nick Avramoussis in being recognized for the continued evolution and expansion of the feature set of OpenVDB. Core engineering developments contributed by OpenVDB’s open-source community have led to its ongoing success as an enabling platform for representing and manipulating volumetric data for natural phenomena. These additions have helped solidify OpenVDB as an industry standard that drives continued innovation in visual effects.

Unlike other Academy Awards® to be presented this year, achievements receiving Scientific and Technical Awards need not have been developed and introduced during a specified period. Instead, the achievements must demonstrate a proven record of contributing significant value to the process of making motion pictures.

Before this announcement, 34 ILM technological achievements had been recognized with Scientific and Technical Achievement Awards. This latest recognition continues a legacy of technical innovation dating back to the mid-1970s.

Earlier today BAFTA announced the nominations for the 2024 EE BAFTA Film Awards, celebrating the very best in film over the past year. ILM contributed to four of the five films recognized with a nomination in the Special Visual Effects category. 

Jay Cooper, Charmaine Chan, Ian Comley, and Jonathan Bullock were each nominated for Gareth Edwards’ sci-fi thriller The Creator, while Alex Wuttke, Simone Coco, Jeff Sutherland, and Neil Corbould received nominations for Mission: Impossible – Dead Reckoning Part One. ILM also contributed effects work to Ridley Scott’s historical epic Napoleon and James Gunn’s Guardians of the Galaxy Vol. 3.

Anna Higgs, Chair of BAFTA Film Committee said, “It has been an outstanding year for filmmaking as represented by the 38 films nominated today. They showcase ambitious, creative, and hugely impressive voices from independent British debuts to global blockbusters. From complex moral issues through to joyful journeys of self-discovery, they all ultimately explore human connection. Which is why we go to the cinema: to be transported into new worlds, to laugh, cry, to be entertained and to be challenged. The films nominated today deliver all that and more – we hope people up and down the country, and around the world, are inspired to watch them. Congratulations to all the nominees.”

The winners will be announced on 18 February from the Southbank Centre’s Royal Festival Hall in London, as part of an unmissable celebration of film hosted by David Tennant.     

The EE BAFTA Film Awards will be broadcast on BBC One and iPlayer in the UK, on BritBox International in the USA, Australia, Canada, Denmark, Finland, Norway, Sweden and South Africa, as well as BBC Australia in Australia and New Zealand, NOVA Bulgaria, NOVA Greece, Turner Spain, and Canal Plus, with more territories to be confirmed.

The EE BAFTA Film Awards voting takes place over three rounds: Longlisting, Nominations, and Winners, by BAFTA’s global voting membership, comprising over 7,800 creatives and film industry practitioners.

Gareth Edwards’ The Creator leads the feature film field with seven nominations, six of which are for ILM work.

In all, ILM visual effects artistry was recognized with 19 nominations, including those for The Creator, with Indiana Jones and the Dial of Destiny and Dungeons & Dragons: Honor Among Thieves joining it in the top category, Outstanding Visual Effects in a Photoreal Feature. Napoleon and Killers of the Flower Moon were each nominated for Outstanding Supporting Visual Effects, and Ahsoka and The Mandalorian were each nominated for Outstanding Visual Effects in a Photoreal Episode. Darren Aronofsky’s Postcard from Earth received a nomination for Outstanding Visual Effects in a Special Venue Project, while Willow, Indiana Jones and the Dial of Destiny, The Creator, Napoleon, Loki, and The Mandalorian also received craft category nominations.

“We are seeing best-in-class work that elevates the art of storytelling and exemplifies the spirit of innovation. The VES Awards is the only venue that showcases and honors these outstanding artists across a wide range of disciplines, and we are extremely proud of our nominees,” said VES chair Kim Davidson.

The VES is a global honorary society dedicated to “advancing the arts, sciences and applications of visual effects and to upholding the highest standards and procedures for the visual effects profession.”

Awards will be presented at the 22nd Annual VES Awards on Feb. 21 at The Beverly Hilton Hotel in Los Angeles. 

ASIFA-Hollywood announced nominations today for its 51st Annie Awards™ recognizing the year’s best in the field of animation. The ILM team which included Rick O’Connor, Mike Beaulieu, Stewart Alves, Kevin Reuter, and Wai Kit Wan received a nomination for Best Character Animation – Live Action for its work on Lucasfilm’s hit Disney+ series, Ahsoka.

The Annie Awards™ cover 36 categories and include Best Animated Feature, Best Animated Feature-Independent, Special Productions, Sponsored Films, Short Subjects, Student Films, and Outstanding Individual Achievements, as well as the honorary Juried Awards. Created in 1972 by veteran voice talent June Foray, the Annie Awards™ have grown in scope and stature for five decades.

The awards will be presented on Saturday, February 17, 2024 at UCLA’s Royce Hall.

ASIFA-Hollywood is the world’s first and foremost professional organization dedicated to promoting the Art of Animation and celebrating the people who create it. Today, ASIFA-Hollywood, the largest chapter of the international organization ASIFA, supports a wide range of animation activities and preservation efforts through its membership. Current initiatives include the Animation Archive, Animation Aid Foundation, animated film preservation, open-source software support, special events, classes, and screenings.              

The end is only the beginning. KISS have been immortalized and reborn as avatars to rock forever. Created by Industrial Light & Magic (ILM) in collaboration with the band and Pophouse Entertainment Group, the avatars portray each of the four band members in an idealized, and at times superhuman, form. Months before the supergroup’s final show, which would take place on December 2, 2023, KISS joined ILM’s visual effects team at its San Francisco headquarters to get measured, scanned, and photographed before slipping into sleek motion capture suits so the crew could record every nuance of their final performance. ILM’s StageCraft virtual production team would then simultaneously capture each band member’s performance, from their facial expressions to their fingertips, as they played “God Gave Rock ’N’ Roll to You II” in unison.

The KISS Avatars

The KISS avatars showcase ILM’s unique creative expertise and artistry using its advanced performance-capture technology. The team was led by Academy Award®-nominated Visual Effects Supervisor Grady Cofer, who has over 20 years of experience supervising groundbreaking visual effects projects. He is currently nominated for an Emmy Award for Outstanding Special Visual Effects in a Season for his work on The Mandalorian. Prior to that, he served as overall visual effects supervisor on Space Jam: A New Legacy and earned Academy Award and BAFTA nominations for his contributions to Steven Spielberg’s Ready Player One. His three-year collaboration with Spielberg utilized cutting-edge virtual production tools to bring the OASIS, the project’s vast virtual world, to the big screen.

“This is the sneak peek as the band crosses over from the physical world to the digital. We want to give fans a sense of the many forms this band could take in the future.”

Grady Cofer, ILM visual effects supervisor

Cofer’s ILM team leveraged the company’s decades-long experience to push the capabilities of performance capture, gathering every nuance of the KISS band members’ face and body performances in exacting detail. This data would in turn become the basis for the motion of the band’s virtual avatars. The raw facial capture data was processed in real time via ILM’s advanced machine learning algorithms for instantaneous feedback on stage, then later passed through the ILM pipeline to be augmented by the artists, ensuring the resulting performances were exactly as the band intended for their new digital personas and enabling KISS’s creative output to continue to enthrall audiences well into the future.

As the band’s final concert drew to a close, lead singer Paul Stanley’s avatar proudly exclaimed, “KISS Army, your love, your power has made us immortal! A new KISS era starts now.” The digital group then performed its hit single “God Gave Rock ’N’ Roll to You II” to the delight of the concertgoers who filled the sold-out Madison Square Garden.

We are proud to announce Guardians of the Galaxy: Cosmic Rewind will be honored with a prestigious Thea Award for Outstanding Achievement – Attraction in 2024 by the Themed Entertainment Association (TEA). Internationally recognized, the Thea Awards acknowledge exceptional achievements in the themed entertainment industry and celebrate the creative teams who bring immersive experiences to life.

Under the guidance of Walt Disney Imagineering, Industrial Light & Magic created the immersive visuals that guests are treated to as they experience the attraction. As Disney’s first Omnicoaster ride system, Cosmic Rewind keeps guests immersed in the action as the vehicles make controlled rotations. “It’s always exciting to push the bounds of storytelling and technology, and that’s what both ILM and Imagineering are known for,” said Jeanie King, VP, Production at ILM, adding, “We are thrilled to continue our amazing partnership with Imagineering that began back in the 1980s and continues to flourish today.”

The filmmaker and Lucasfilm legend talks to ILM.com to reflect on what drew him to tell the story of the hit Disney+ series, “Light & Magic”.

Screenwriter and director Lawrence Kasdan.

How did you get involved with Light & Magic?
Several years ago my wife and I made a short documentary about a little diner that we used to eat at all the time that suddenly closed. Making that documentary with her, and cutting it with terrific people, made me realize how much I liked the documentary format. I had never done that. We set out to meet some documentary people and I met Justin Wilkes at Imagine Entertainment. He asked me what I was interested in doing and I suggested a history of visual effects, because even though I had been around visual effects throughout my career, it occurred to me that I didn’t know much about them. The second thing that interested me was the people of Industrial Light & Magic that I had been working around for over forty years. So we both agreed that that would be a great story to tell: the history of visual effects, and the personal stories of these people. What drove these people, what was their life like, what made them want to stay at ILM as long as they did? Everyone loved the idea, so we went to work.

Lawrence Kasdan, center, on the set of Star Wars: The Empire Strikes Back.

What was your vision for the documentary?
From my very first film until today, I’ve always considered myself a humanist filmmaker. I’m interested in what happens between people, and why people make certain decisions in their lives. What chance is involved? What fate? What luck? So from the very beginning of this I was interested in learning what brought these people to this work. What were the relationships that they made when they arrived? Why did they continue to work there much longer than they expected, some for nearly half a century? What has all that meant to these amazing advancements in technology? It’s about people, and their gifts, and out of those gifts came technological advancements that boggle the mind.

Dennis Muren, left, and Phil Tippett, right, review images with Joe Johnston.

Why did you think this story should be told?
Because it’s great to see artists at work. The commitment of great craftsmen. I love to see people that have mastered a skill, and try to make it better, and don’t settle. I think it’s great to see expertise and this pure devotion to discipline, and that is always a good story to see.

John Dykstra and a fleet of miniature TIE, X-wing, and Y-wing starfighters.

How did you approach the research, and what resources did you use?
We had a fabulous team that Imagine Documentaries put together, some internal to the company, and some that were freelancers. They really knew their stuff, so it was a great luxury for me as a director. There were so many things that I wanted to ask during interviews, but the input from this incredible group of producers and writers and editors stimulated me all of the time to go in different directions during interviews.

ILM’s Paul Huston and Larry Tan on the set of Star Wars: Return of the Jedi.

For those that have yet to watch it, can you tell readers what the timeline of the series is?
Over the six hours we see the very birth of ILM, what happened as it came together during the production of Star Wars: A New Hope, and then off of the success of that film, how it was launched into a nearly fifty-year enterprise. We mainly follow it chronologically, but we do jump around a bit to serve the story. Part of the kick for me was that we had such a trove of archival footage, so these people might be talking about something from forty or fifty years ago, and we had stills from that moment in their career. It was incredible to be able to cut from one to the other across time, to hear them talking about a problem, and then see footage of them finding a solution. A huge part of ILM’s legacy is finding solutions to problems.

Peter Kuran, Rose Duignan, and George Lucas review effects shots for Star Wars: A New Hope.

How did you select the filmmakers that were featured in the documentary?
They are all giants, and they have all used ILM in the most expressive and innovative ways. They put pressure on themselves and then turned to ILM and said, “can you do this? Can you create something for me that I have never seen before?” ILM would always say yes. And sometimes it might be a struggle, and sometimes it might be a long process, and sometimes it might be an instantaneous solution where one of these genius people that work there would say, “I know what we could do”. These are major filmmakers that have contributed to the zeitgeist. Jim Cameron, Steven Spielberg, Bob Zemeckis, J.J. Abrams, and at the heart of it, of course, is George Lucas.

Lawrence Kasdan and J.J. Abrams on the set of Star Wars: The Force Awakens.

What was the most interesting thing you learned throughout the process of creating Light & Magic?
I think I learned what goes into creating something new, working with people you respect and depend on, and how this personal relationship then impacts the professional work. There is something beautiful about the generosity of the people that work at ILM, and through that generosity they are able to discover new frontiers and break new grounds that no one has ever been able to do.

All episodes of Light & Magic are streaming now on Disney+.

ILM | A legacy of innovative and iconic storytelling.

“I wanted to make sure we brought the magic back to the ILM logo,” noted John Knoll, Industrial Light & Magic’s Executive Creative Director, participating in one of the dozens of interviews completed over the course of the fourteen-month rebranding project for the renowned visual effects and animation studio founded by George Lucas. 

“We wanted our new branding to pair closely with ILM’s mission statement: ‘We are visual storytellers who create iconic moments to inspire the imagination,’” explained Janet Lewin, SVP and General Manager, ILM. “At ILM, we prioritize our culture of collaboration and community, and we truly value innovation and quality. These core ideals allow us to confidently take risks and embrace the unknown on the challenging projects we seek out.”

Knoll was one of over a thousand ILM employees who provided input to twin sisters Amy and Jen Hood who own the Southern California brand identity and type design studio, Hoodzpah. As part of their exploration of ILM, its employees, its legacy, and its values, the interviews revealed fascinating insights into the company and how it has managed to keep both its creative team inspired and its technology on the cutting edge of innovation for nearly five decades. Knoll’s sentiment struck a chord, and it became an oft-repeated phrase as the Hoodzpah team collaborated with a core group of ILM leaders on how best to capture the company’s incredible legacy while building a unique identity system that would serve it well into the future.

“ILM has had several different logos over the past forty-eight years,” explained Rob Bredow, SVP and Chief Creative Officer for ILM, “and all have incorporated the core elements from the company’s original logo, the famous wand-wielding magician framed by a large gear with the letters ‘ILM’ originally illustrated by Michael Pangrazio in the late 1970s and later finalized in a painting by renowned artist, Drew Struzan. The company and the industry have evolved substantially in the past eighteen years and we felt the time was right to develop a new brand identity that captured the global studio we’ve become.”

Now with six global studios—San Francisco, Singapore, Vancouver, London, Sydney, and Mumbai—ILM has not only revolutionized the field of visual effects with groundbreaking innovations in digital effects, performance capture, previsualization, and digital humans, but has most recently innovated in areas as diverse as real-time rendering, immersive entertainment, and virtual production with its Emmy Award-winning StageCraft platform.

Hoodzpah began the assignment by getting to know the company through individual and group interviews with key members of ILM’s leadership team and representatives of the employee base across all strata of the studio and each of ILM’s global locations. Then came the task of distilling the information into key learnings. “It was remarkable given the sheer number of people we interviewed that there was such cohesion in terms of what the employees felt the brand represented and where they aspired to be,” said Jen Hood.

The new dynamic glyph and custom wordmark combine to draw from the company’s illustrious legacy while carrying it into the future. Amy Hood explained, “The mark utilizes negative space within the silhouette of a gear, giving the impression of a lightbulb contained within; both elements existed in the company’s original logo by Pangrazio and Struzan. We incorporated a swoosh trailing a spark of magic in the new mark, which represents the global nature of ILM’s talent base and studios. Paired with the mark is the Industrial Light & Magic wordmark, designed and set in a bold customized serif face evocative of the abbreviated type in the company’s original logo.” The team also developed updated logos for sub-brands ILM Art, ILM StageCraft, ILM Technoprops, and ILM Immersive (formerly ILMxLAB). The supporting visual identity uses cinematic colors inspired by ILM projects over the years, as well as bold type and stark, minimal layouts. The rebrand scope spanned deck templates, social media assets, a homepage redesign, logo animations, swag, and more.

Now in its 48th year of existence, ILM continues to be a creative partner to storytellers and filmmakers alike. The talented artists, technicians, and production teams ensure that the company remains on the cutting edge as they continue to develop new techniques and technologies that allow audiences the world over to be immersed in the visuals and experiences the company helps to create.

Everyone at Industrial Light & Magic (ILM) has been honored to call Jean Bolte a colleague, and after 35 years she’s now leaving to pursue her own work as an artist.

Jean first arrived here in 1987 when she joined the hallowed Model Shop at ILM’s former Kerner facility in San Rafael, California. With prior experience working on puppets, models, costumes, and make-up, she was soon contributing to the fantasy adventure Willow (1988). Jean helped construct the animal puppets used in the fabled transformation sequence, where ILM employed one of its first groundbreaking computer graphics techniques, the “morph.”

After advancing to the role of project supervisor in the Model Shop, Jean was among the traditional artists to make the transition to computer graphics work in the 1990s. She became one of the principal users of ILM’s in-house digital painting software, “Viewpaint,” and ever since Jean’s input has helped inform the research and development of new toolsets.

Over the decades, Jean has played an instrumental role in everything from the creation of an all-digital Yoda for Star Wars: Attack of the Clones (2002) to the innovative, de-aging techniques in The Irishman (2019). All the while, she’s continued as a proponent of integrating traditional methods of art into her work and celebrated the value of taking inspiration from the natural world.

Jean’s passion has made her both a mentor to her colleagues and an advocate in the wider visual effects industry. In particular, she has spoken up for the continued growth of women in the field. She won a Visual Effects Society Award for her work on Deepwater Horizon (2016) along with earning many other nominations, and she has been an active member of the Visual Effects Branch of the Academy of Motion Picture Arts and Sciences.

We join the entire industry in celebrating Jean Bolte’s career. Her legacy as a friend, mentor, collaborator, and leader will continue to influence the company in countless ways. Congratulations, Jean!

Today, teams across Industrial Light & Magic (ILM) and Lucasfilm mourn the loss of our former colleague Richard Miller, who recently passed away at the age of 80.

Before he was hired at ILM in 1981, Miller worked as a freelance sculptor and jeweler, developing a unique style of sculpting characters that fused modern flair and classical elegance. Yet another in a succession of Long Beach State University alumni to join ILM, Miller was first tasked with sculpting an elaborate metal bikini worn by Princess Leia in Star Wars: Return of the Jedi (1983). The almost inflexible piece had to carefully fit actress Carrie Fisher. That unique assignment grew into an ILM career spanning nearly 30 years.

“I worked with Richard on a great many projects over the years,” says visual effects supervisor and executive creative director John Knoll, “and his warm and gentle disposition combined with his terrific talent and artistry made him always a joy to work with. I’ll miss that easy smile.”

Miller found his place in the company’s hallowed model shop, where some of the world’s most devoted artisans plied their craft on hundreds of visual effects projects. As sculptor, he contributed to dozens of films, helping make countless figures and characters.

Just some examples are serene whales in Star Trek IV: The Voyage Home (1986), the Statue of Liberty for Ghostbusters 2 (1989), the iconic helmet in The Rocketeer (1990), a rhinoceros and elephant in Jumanji (1995), and Davy Jones in Pirates of the Caribbean: Dead Man’s Chest (2006). Among many Star Wars creations, Miller sculpted towering statues for the Jedi Temple and a frieze depicting an ancient battle between the Jedi and Sith visible in Chancellor Palpatine’s office.

“Richard was the ultimate collaborator,” says creative director David Nakabayashi. “Every day on A.I. Artificial Intelligence he would say, ‘What are we doing today!’ You would have very little to say as his work spoke for itself. He always made things better than you imagined. He was a true artist and loved to teach others. His workshops were always full of students who loved to learn from him and hear his stories.”

As revered as he was for his artistry, Miller was also beloved as a teacher and mentor. For years he led workshops and classes at ILM, sharing his knowledge and experience with new generations of artists. A number of Miller’s works remain in ILM’s collection, and have been admired by countless visitors to Letterman Digital Arts Center.

With an ILM career paralleling changes in visual effects as practical techniques evolved into digital ones, Miller is a shining example of how timeless artistic principles remain at the core of any artist’s work, no matter the tools or the medium.

“Richard was one of my closest coworkers, my teacher and an endearing and unique part of the model shop family,” says texture supervisor Jean Bolte. “He taught me a lot, about sculpting, about living well, and occasionally about patience, as good friends do. Farewell Richard. I’ll miss you.”

New Mumbai studio to provide full visual effects and animation services for film & television

Cassian Andor (Diego Luna) in Lucasfilm’s ANDOR, exclusively on Disney+. ©2022 Lucasfilm Ltd. & TM. All Rights Reserved.

Industrial Light & Magic (ILM), the award-winning visual effects division of Lucasfilm Ltd., announced today that the company is expanding its global operations. The studio, which is headquartered in San Francisco and has existing studios in Vancouver, London, Singapore and Sydney, will open a new full pipeline studio in Mumbai to gain access to the incredible talent base in the region. The Mumbai studio will be led by Kiran ‘KP’ Prasad, who was formerly head of studio at DNEG Bangalore & Chennai. Prasad will report to ILM SVP and General Manager, Janet Lewin.

Kiran Prasad, Executive in charge, Mumbai studio.

“With five global studios consistently operating at capacity and continuing to grow, the time was right for ILM to expand once again to meet the industry’s increasing demand for high-caliber visual effects,” explained Lewin. “This new full-fledged visual effects studio in India will allow us to offer even greater capacity while ensuring that we always meet the high-quality bar that our clients expect of us.” 

Rob Bredow, SVP and chief creative officer of ILM noted, “We’re excited to be building our ILM Studio in India where we can recruit the top artistic and technical talent from the visual effects industry now in India. This is the perfect time for ILM to form our sixth studio where artists will leverage our full pipeline of disciplines working across a wide variety of exciting shows – at the top quality and reliability our creative partners have come to expect from ILM.”

“ILM has always been at the forefront of technological and creative innovation in the visual effects industry and there is no better time to start our studio in India than now, as the Indian VFX industry is poised for spectacular growth in the coming years,” said Prasad. “It is exciting and an honor to be part of the ILM team at such a key moment in the VFX industry with technological developments pushing the boundaries of visual storytelling. I look forward to working with the executive team to set up the studio from the ground up, building a world-class facility, and bringing the best of the diverse Indian talent together for  an opportunity with endless possibilities.”

ILM’s last expansion effort came in 2019 with the company’s Sydney studio. That studio is currently 400 people strong and growing. Combined, the ILM global studios will grow to over 2,500 artists and will continue to offer award-winning visual effects and animation as well as concept design and development, and virtual production, with the artistry, innovation, and creative problem-solving that is the hallmark of the company. As with the other studios, ILM’s Mumbai studio will work on projects of all shapes and sizes, including live-action and animated feature films, television, streaming, and themed attractions.

ILM will be hiring leadership, technology, support, production, and artist roles over the coming months; openings will be posted on https://www.ilm.com/careers.

The Television Academy announced its winners for the 74th Annual Primetime Creative Arts Emmy® Awards over the weekend, celebrating today’s talent and their groundbreaking work. ILM’s creative teams were honored with an award for Outstanding Special Visual Effects in a Season or a Movie for The Book of Boba Fett, alongside nominations for their work on The Witcher. This is the third win for a Lucasfilm series in this category, a testament to the cutting edge work that ILM is known for.

Six-Part Docuseries Debuts Exclusively on Disney+ July 27

Disney+ released the trailer and key art for Lucasfilm and Imagine Documentaries’ “Light & Magic,” an immersive series that chronicles the untold history of world-renowned Industrial Light & Magic (ILM), the special visual effects, animation and virtual production division of Lucasfilm.

Granted unparalleled access, Academy Award®-nominated filmmaker Lawrence Kasdan takes viewers on an adventure behind the curtain of Industrial Light & Magic. Learn about the pioneers of modern filmmaking as we go on a journey to bring George Lucas’ vision to life. These filmmakers would then go on to inspire the entire industry of visual effects.  

The series is directed by Lawrence Kasdan, and the executive producers are Ron Howard, Brian Grazer, Justin Wilkes, Lawrence Kasdan, Kathleen Kennedy and Michelle Rejwan. 

All six episodes of “Light & Magic” premiere on July 27, exclusively on Disney+.

Twitter: @DisneyPlus, @ILMVFX
Instagram: @DisneyPlus, @ILMVFX
Facebook: @DisneyPlus, @ILMVFX
TikTok: @DisneyPlus
Hashtag: #DisneyPlus

ABOUT DISNEY+

Disney+ is the dedicated streaming home for movies and shows from Disney, Pixar, Marvel, Star Wars, and National Geographic, along with The Simpsons and much more. In select international markets, it also includes the new general entertainment content brand, Star. The flagship direct-to-consumer streaming service from The Walt Disney Company, Disney+ is part of the Disney Media & Entertainment Distribution segment. The service offers commercial-free streaming alongside an ever-growing collection of exclusive originals, including feature-length films, documentaries, live-action and animated series, and short-form content. With unprecedented access to Disney’s long history of incredible film and television entertainment, Disney+ is also the exclusive streaming home for the newest releases from The Walt Disney Studios. Disney+ is available as a standalone streaming service or as part of The Disney Bundle that gives subscribers access to Disney+, Hulu, and ESPN+. For more, visit disneyplus.com, or find the Disney+ app on most mobile and connected TV devices.

MEDIA CONTACTS

Disney+ Media Relations
Shelby Cotten
Shelby.b.cotten@disney.com

Walt Disney Studios Global Publicity
Global Publicity (NY)
Derek Del Rossi        
derek.del.rossi@disney.com

Lucasfilm Publicity
Ian Kintzle 
ikintzle@ilm.com

San Francisco and Vancouver–Production is underway in Vancouver on the ambitious upcoming Disney+ Original series Percy Jackson and the Olympians, based on Rick Riordan’s best-selling novels, on a newly built, state-of-the-art StageCraft LED stage, the first of its kind in Canada. The stage was built through a partnership with Industrial Light & Magic (ILM) and 20th Television, which is producing the eagerly anticipated Disney Branded Television series for Disney+.

“The story of Percy Jackson has such an epic scope, I was crossing my fingers we would be able to partner with Industrial Light & Magic,” explained executive producer and author Rick Riordan. “That was really the only way to do the adaptation justice and bring our visions to life. I am over the moon that we have forged such a great relationship to give this show such a cutting-edge look and feel. I’m sure the Olympian gods would expect nothing less!”
 
“The 20th Television team and the series producers clearly saw the value that ILM StageCraft brings to a production and understood it to be a perfect fit for a series like Percy,” said Chris Bannister, executive producer, ILM StageCraft. Jeff White, creative director for ILM’s Vancouver studio, added, “With ILM’s StageCraft technology we allow filmmakers to design, light, and shoot the digital world as they would in the practical world, all integrated in front of the cast and crew on stage. It opens up an amazing range of possibilities right before their eyes.”
 
“Working with the team at ILM has been a dream,” said 20th executive vice president of Production Nissa Diederich. “The fans of this franchise have high expectations for the series and we knew that we needed the most advanced production technology available, and who better to partner with than Industrial Light & Magic? The stage we have built will be home to Percy and potentially dozens more of our most ambitious series. It really says to our creators, the sky’s the limit – if you can dream it, we can shoot it.”
 
Based on Disney Hyperion’s best-selling book series by award-winning author Rick Riordan, “Percy Jackson and the Olympians” tells the fantastical story of a 12-year-old modern demigod, Percy Jackson, who’s just coming to terms with his newfound divine powers when the sky god Zeus accuses him of stealing his master lightning bolt. With help from his friends Grover and Annabeth, Percy must embark on an adventure of a lifetime to find it and restore order to Olympus.
 
The series will star Walker Scobell as Percy Jackson, Aryan Simhadri as Grover Underwood and Leah Sava Jeffries as Annabeth Chase. Previously announced guest stars include Virginia Kull as Sally Jackson, Glynn Turman as Chiron aka Mr. Brunner, Jason Mantzoukas as Dionysus aka Mr. D, Timm Sharp as Gabe Ugliano and Megan Mullally as Alecto aka Mrs. Dodds.
 
Riordan and Jon Steinberg serve as writers of the pilot, and James Bobin directs. Steinberg oversees the series with his producing partner Dan Shotz. Steinberg and Shotz also serve as executive producers alongside Bobin, Riordan, Rebecca Riordan, Bert Salke, Monica Owusu-Breen, Jim Rowe, Anders Engström, Jet Wilkinson and The Gotham Group’s Ellen Goldsmith-Vein, Jeremy Bell and D.J. Goldberg.

Granted unparalleled access, Academy Award®-nominated filmmaker Lawrence Kasdan takes viewers on an adventure behind the curtains of Industrial Light & Magic, the special visual effects, animation and virtual production division of Lucasfilm. Learn what inspired some of the most legendary filmmakers in Hollywood history, and follow their stories from their earliest personal films to bringing George Lucas’ vision to life. From Imagine Documentaries and Lucasfilm, and executive produced by Brian Grazer and Ron Howard, the six-part documentary series premieres exclusively on Disney+ July 27.

Phil Tippett puts the finishing touches on the Rancor from Star Wars: Episode VI – Return of the Jedi.

On Friday, May 27, attendees of Star Wars Celebration will be among the first in the world to get a sneak peek at “Light & Magic” with an “illuminating” discussion panel featuring Lawrence Kasdan and Ron Howard, joined by VFX titans Dennis Muren, Phil Tippett, Joe Johnston and Rose Duignan, and Lynwen Brennan, Lucasfilm executive vice president and general manager.

Join employees across Industrial Light & Magic, Lucasfilm, ILMxLAB, and Skywalker Sound as they share stories on discovering their passions, beginning their careers, and the challenges and satisfactions of working in their current roles and departments. We hope their personal stories and recommendations inspire the next generation of young artists to break into and make their impact on the entertainment industry.

To learn more about Get in the Door, visit GetInTheDoorProject.com and watch the trailer below.

Gareth Edwards on the set of Rogue One: A Star Wars Story.

Join Gareth Edwards and the Publicity Group at Industrial Light & Magic as we look back at his time directing Rogue One: A Star Wars Story. Gareth discusses the cutting-edge virtual production used for the film, and the ways in which George Lucas inspired him as a filmmaker.

Tell me about the freedom you found in the virtual production aspects of Rogue One.
John Knoll was very crucial for this, because he and the team at ILM devised a virtual environment where we could go in and look for shots. My entryway into filmmaking was through visual effects, so I understand it a bit, but a lot of VFX is kind of dark arts, which causes clients to come to visual effects companies and see VFX as magic, because no one understands what they do. The downside to that is that they can ask for things or approach scenarios in such a way that is really back-to-front, and doesn’t produce the best result. I find that storyboarding shots is really useful, but at a certain point it becomes somewhat limiting, because you’re having to invent every single detail about that shot. Whereas, in the real world, what you tend to do is you have a space, because it already exists. The light hits objects in this space a certain way, and going in, you knew you’d do a close up of someone’s face, but if you were to have them look down a little bit, and maybe move to the right, suddenly you have this beautiful composition that you wouldn’t have found with storyboarding. The trick with VFX is having that opportunity, and going, “this was the plan, but now that the ingredients are here in front of us, doing this would actually be better.” So figuring out a seamless way to do that without it being painful for the artists is important. There’s lots of ways to achieve that, but when you’re in space with spaceships, the only real way to do it—unless you’re doing what George did, which was taking footage of WWII aerial combat that would represent the final shot—is what John Knoll and Industrial Light & Magic were pushing for.

John Knoll and Alan Tudyk, in his mocap costume, on the set of Rogue One: A Star Wars Story.

And what was John pushing for?
It was pre-viz animation of each section of the battle sequence. They then figured out a set up where they had an Apple iPad, with a game controller attached to it. When you moved the iPad, it could tell where it was in 3D space. They would then just loop these twenty or thirty-second chunks of animation, and I would get to hang out in these spots and just film it again and again, generating hours of footage. Then I’d go home on my MacBook and select my favorite takes, and then try to cut something together. It would be very jittery, and handheld, and not perfect. For each one we’d smooth it out by filming from another spaceship, and for another we’d keep some of that handheld-look. It felt like the process of getting those virtual shots was how we were getting the live action shots, which was, “light a space and find the shot,” versus, “tell us the shot, and we’ll invent all the pieces to create it.” It always feels more real with the first approach.
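Edwards describes recording jittery handheld virtual-camera takes and then smoothing some of them out. As a hypothetical illustration (not ILM's actual tooling; the function and parameters below are invented for the example), a recorded camera path can be stabilized with a simple moving average while a `blend` control preserves as much of the handheld feel as desired:

```python
# Hypothetical sketch of virtual-camera path smoothing: average each
# frame's (x, y, z) position with its neighbors, then blend the result
# back toward the original motion to keep some handheld character.

def smooth_path(path, window=5, blend=1.0):
    """Return a smoothed copy of a list of (x, y, z) camera positions.

    window: number of neighboring frames averaged on each side.
    blend:  1.0 = fully smoothed, 0.0 = original handheld motion.
    """
    smoothed = []
    n = len(path)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        neighborhood = path[lo:hi]
        # Per-axis mean over the neighborhood of frames.
        avg = tuple(sum(axis) / len(neighborhood) for axis in zip(*neighborhood))
        # Blend between the averaged position and the raw handheld one.
        smoothed.append(tuple(a * blend + p * (1 - blend)
                              for a, p in zip(avg, path[i])))
    return smoothed

# A jittery straight dolly move along x (alternating +/- 0.2 wobble):
jittery = [(float(i) + (0.2 if i % 2 else -0.2), 0.0, 0.0) for i in range(10)]
stabilized = smooth_path(jittery, window=2, blend=1.0)
```

Setting `blend` below 1.0 is one way to express the "keep some of that handheld-look" choice Edwards mentions: the camera still drifts, just less violently.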

The Death Star’s Mk I Superlaser is set into place.

Like that shot of the Star Destroyer emerging from the shadow of the Death Star.
Exactly. That was a great example of John Knoll and ILM pushing for this new technology. If I remember correctly, we needed a shot of the Death Star for the trailer, but they needed it in only a few days. I remember that John got the iPad, and set up a model of the Death Star and some Star Destroyers. The important thing when devising a shot like that is that the idea of scale is only relevant to something bigger in the shot. A typical thing you do in matte paintings is when something needs to look really big, you paint a little human in there. It’s a trick that they used to great effect in The Empire Strikes Back. When you want something to feel big, you need to set something up to feel really big, and then show a new thing that’s even bigger than that. The idea was to have a ship that you know the scale of, like a TIE fighter, and then reveal the Star Destroyer, which feels huge, and then you reveal the Death Star which feels impossibly massive. I remember asking them, “can you do real-time shadows on this?” Once I learned that that was possible, it became so fun to reveal and conceal the ships in shadow, and find that moment where the dish slides into place. Within a few hours, we had the shots that went into the trailer, and that never would have been possible without the real-time technology that ILM was using.

An Imperial I-class Star Destroyer emerges from the shadow of the Death Star’s Mk I Superlaser.

Since Rogue One, you’ve gotten to visit ILM’s StageCraft volume in person. Having that experience, did it make you think about how you may have captured any shots differently?
I think that’s always true of technological advances in filmmaking, so yes. For sure. It feels like filmmaking in general is an archaic process. It’s over a century old, and in some ways, it has hardly changed at all – and yet this digital revolution, which is happening all around us, should drastically change the ways we make movies, but it’s been a slow process. There is so much we could do to utilize the technology we have to be more creative, and allow us to do things we couldn’t have done before. StageCraft, though, is a massive leap. It’s game-changing. It’s moving the industry forward.

ILM’s StageCraft in use on The Mandalorian Season Two.

What do you think will stay the same?
Storytelling, regardless of the medium. From the time of early humans, a million years ago until today, we have an innate need to sit around a campfire and listen to a story. Whether it’s a story about something interesting that happened that day, or theorizing about why the world is the way it is. It’s absolutely hardwired into us. There’s this little glowing light, and the need to paint a picture or sing a song about another person or another place. I don’t believe that that is going away anytime soon. We’ll always have an appetite for storytelling. Strangely, what I find funny is that a movie is around two hours long, and that’s about the length of time it takes for a campfire to burn out. It’s so embedded in us. That’s what cinema is, it’s us being able to dream out loud, or watch another person’s dream in real-time. I hope that stays with us. I hope in five-hundred years, people are still watching Star Wars.

The DS-1 Orbital Battle Station prepares to fire on Jedha City.

In getting into the making of film and working with Lucasfilm and Industrial Light & Magic, were there ever moments where you needed to pinch yourself, because you were given the ability to create nearly anything you could dream up?
Funny enough, it almost felt like we had too much power, so we needed to be careful that we limit ourselves so it felt like the original trilogy. One of the early things that we did with John Knoll and ILM was the kitbashing, using the original models just like the original films. We went and bought some of the original model kits, lots of WWII, vintage-collector stuff. We started scanning those model parts in so that we could stick them onto the models we were building. What’s interesting is the subsurface scattering that goes on with those model pieces. It’s not like metal. So we were trying to recreate that model kit feeling on our ships. What’s funny though, when you get into that scenario, you realize that your memory is a little bit better than reality with some of the sets and props. The golden rule became, “let’s not do it how it was, let’s do it how we remember.” We wanted it to look and feel like how you “thought” those models looked. Also, going through the Lucasfilm Archives, the models are everywhere, and there were some designs in there that looked really cool. Those were good keys for us, because we were making something before A New Hope, but what did that mean, stylistically? Doug Chiang had a really hard task before all of us while doing Star Wars: Episode I – The Phantom Menace. They took a real leap on that film from both a timeline and stylistic standpoint. It was much more Art Deco. The streamlined nature of the ships felt much more like The Rocketeer and Flash Gordon; those types of serials that inspired George Lucas to make Star Wars in the first place. And that makes sense because it takes place over 30 years before A New Hope. But for us, aesthetically, it was just before A New Hope, yet we still wanted it to feel distinct. What we did for inspiration, is we looked at the Ralph McQuarrie artwork, and the early models that had been made for A New Hope; things that had been either abandoned or improved. 
We sort of “reversed the car” back into that space, and where they went left, we went right. One example is that really slender, aspirational Stormtrooper that Ralph painted. He made that without a care in the world about how it would be actually realized by the costume department. We kept pointing at that, because traditionally when you put someone in armor, it can start to feel a bit bulky. We wanted something that looks like it could sprint and cause some serious damage. We tried to make armor that was slightly bendable, so it could sit just over top of the skin. We tried to cast towards that look of the troopers, someone tall and lanky. What that eventually became was the Death Trooper. That’s where the Death Troopers started. To answer your original question, we tried not to be kids in a candy store, we tried to temper that and work off the design language that existed. “With all of those limitations, what would they have done?” 

The UT-60D U-wing, ‘LMTR-20’, heads for Eadu during Operation Fracture.

Did you and Greig Fraser try to match some of the old-school camera moves seen in the original trilogy?
We did. We tried to keep the vocabulary of any shots featuring Krennic and the Empire the same as those old school cinematographers. The way, when someone walks by, they would push in again to recompose based on the new positions of the actors. All of these little things that would happen that were common in the late 1970s, things we don’t do so much now.

Darth Vader (voiced by James Earl Jones) from a scene in the trailer for Rogue One: A Star Wars Story.

This year being the 50th anniversary of Lucasfilm, I wanted to know what George Lucas means to you?
A lot of things come to mind. As crazy as it sounds, I think he’s underrated. I know that sounds crazy with all of his accolades. When I was a little kid, I didn’t ever really watch THX 1138 or American Graffiti. But as I got older, I would revisit those films constantly. THX is an incredible debut. It’s just an absolutely fantastic film, and one of the strongest films that came out of a first-time filmmaker in all of cinema, in terms of how bold and completely brand new it was. So many things made their way to Star Wars too. That fuzzy comms chatter. The clinical corridors. The look and feel. There was no Ralph McQuarrie, but it felt so much like George. He has such a great aesthetic and an amazing eye. It took a lot for him to make that film. Then he goes and makes the films that have inspired you, and me, and everyone like us. Even if that was all he ever did, it would have been enough. But then he goes and pushes harder, and pushes the digital technology further. Because of him, I was able to make my first film on a digital camera. I wouldn’t have been able to make my first film if it required film stock. I wouldn’t have been able to make my first film without George. Him pushing HD, and all the work he did with the technology used for the prequels and with digital cameras. I got into digital effects because of that, and I wouldn’t have been able to if it wasn’t for George building up everything at Industrial Light & Magic. He inspired me to want to become a filmmaker, and he gave me the tools to do it. At the end of that journey, I got to make a Star Wars film. He gave me Rogue One.

George Lucas on the set of Star Wars: Episode IV – A New Hope filming a scene aboard the ‘Tantive IV’.

Have you gotten to spend time with him?
A handful of times. This next story I say with the utmost love and admiration for the entirety of the Star Wars catalogue. But when I had my office at Pinewood, I was putting a lot of pressure on myself to make this film. Everyone had The Empire Strikes Back and A New Hope posters in their offices. To help alleviate that, and to remind myself that this was Star Wars, and that I was making a Star Wars spinoff film, and I needed to have fun, I put up framed prints from The Holiday Special and Caravan of Courage: An Ewok Adventure. Those were the first spinoff films. [laughs]. Well, one day, George came to Pinewood, and he was sweet enough to come up to my office. I worked really hard to distract him while we spoke so he wouldn’t see the posters. I was really animated, and tried to lead him through the back. It was like a comedy skit as I tried to keep him away from the posters. I didn’t want him to get the wrong idea about Rogue One. [laughs]. We got to hang out for a few hours that day, and I got to tour him around. I got to spend time with my hero. It was a surreal experience. He’s the Paul McCartney of film.

Gareth Edwards on location for Rogue One: A Star Wars Story. Photo courtesy of Greig Fraser. All Rights Reserved.

And you got to spend time at Skywalker Ranch, yeah?
Yes. When I was there, the projectionist was so sweet. They said, “When we would project the Star Wars reels, George would sit right there.” And they pointed over at a seat. “Would you like to sit there?” I got to sit there throughout the sound mix on Rogue One. It felt like I was sitting on the throne of the film world. The funny thing is, if you’re so intimidated by it, it can paralyze you. You have to let that fall away. But let me tell you, that was the best job in the world. That beautiful drive through the trees and hills on the way to Skywalker Ranch. Past Lake Ewok. It was so utopian. We were making a Star Wars movie. It was everything I’ve ever dreamed of. It’s surreal to think it even happened to begin with. You dream about this stuff as a kid, but it shouldn’t actually happen. What’s funny is, when it comes to Industrial Light & Magic, and Lucasfilm, and the team at Skywalker Sound, you see it in everyone that works there. We all have the same story. You and me, we grew up with the same story. The trinkets on your desk are the same ones I have at home. Those Ralph McQuarrie prints behind you. I feel like we all have a lot in common. I feel like if I was going to hang out with people outside of work, it would be with the people at ILM. Everyone is a mini-filmmaker, and even though we grew up in different places all around the world, if we went to the same school as kids, we’d be mates – and then suddenly all of these people wound up at Skywalker Ranch and Industrial Light & Magic. When Covid is through, I hope everyone can come together and see each other again.

Gareth Edwards on location for Rogue One: A Star Wars Story. Photo courtesy of Greig Fraser. All Rights Reserved.

I love that. Last question. John Knoll and Hal Hickel wanted me to ask you about Area 51?
[Laughs]. So I was in Las Vegas watching John Knoll, Hal Hickel, and Matthew Wood, from Skywalker Sound, during a Star Wars panel at NAB. After it was through, I told them all, “We are only a few hours away from Area 51. We will never get this chance again.” [Laughs]. We drove several hours in the dead of night, through Rachel, Nevada, and walked right up to that fence where you couldn’t go any further. We went to Area 51. We stayed just long enough to scare ourselves, and then we got out of there. [Laughs].

Greig Fraser on the set of Rogue One: A Star Wars Story. All Rights Reserved.

The cinematographer for Denis Villeneuve’s Dune, and Matt Reeves’ The Batman, joins Industrial Light & Magic’s Publicity Group to discuss his work on Rogue One: A Star Wars Story. Greig shares how the early Kenner action figures inspired his love of Star Wars, and the influences he found in 1970s cinema, the works of Andrei Tarkovsky, and the film The French Connection.

What was your introduction to Star Wars?
If I think back about how I was first introduced to Star Wars, I think it had to be through the toys. I genuinely think it was the toys that got me going there. I was two years old when Star Wars came out, and five when The Empire Strikes Back premiered. You couldn’t really call me a “film fan” at that point, but the franchise definitely existed in my universe. I read some of the comics later on, but the thing I loved the most back then were the toys. A few years after, I think ’82, Star Wars came to Betamax and VHS, and then the year after that, in 1983, I finally saw Return of the Jedi in theaters. It was mind-blowing, because the visual effects that ILM did for it were so revolutionary and groundbreaking. Then over the course of the next ten or fifteen years, I think I watched A New Hope, The Empire Strikes Back, and Return of the Jedi literally hundreds of times.

A selection of Star Wars Kenner action figures available in the early 1980s.

How did the experience of watching the original trilogy influence your work on Rogue One?
The funny thing is, when it comes to Star Wars, there is a very particular visual language with the way the films are made. From the way they climb aboard the Millennium Falcon, to the wide shots of the Millennium Falcon going past the camera. There is a visual language that exists that, unless you’re studying it, you don’t really notice it. That occurred to me when we started Rogue One, when Gareth basically told me, “we’re not remaking Star Wars. We’ll make this movie the way we would want to make this movie.” But the thing is, what was great about that, is that we could channel Star Wars. Normally you try to hide your influences; you don’t wear them on your sleeve when you make a movie. You try to become a little more nuanced, a little more “clever” about sort of fooling people into what your influences are. “No, I didn’t actually watch Steven Spielberg films to make this ‘Spielbergian’ movie.” Those sorts of things. But what was great about Rogue One is that we were making a film that actually connected directly into Star Wars: Episode IV – A New Hope, by design. So if we wanted to reference anything from Episode IV, Episode V, or Episode VI, we could. We were actively encouraging ourselves to do it. For me that was a huge revelation, because normally, on any other film, you wouldn’t do that. For example, when we went back and watched Obi-Wan’s sequences aboard the Death Star, we would study how Sir Alec Guinness would move throughout the corridors, and it was very influential in the way that we did some of our movement through the Imperial security complex on Scarif. We took for granted that it was such a big place, and that the Imperials would be minding their own business and doing their own thing, and that you could have these Rebel spies, and have them actively infiltrate this heavily-fortified complex.

Obi-Wan Kenobi uses a Jedi trick to distract a pair of TK Stormtroopers aboard the DS-1 Orbital Battle Station.

Were there a lot of conversations around trying to match the aesthetic of A New Hope?
There was. Growing up, you got used to watching Star Wars on Betamax and VHS, on a home television format. For research for this film, I was able to watch a 4K scan of one of the earlier films, and the conversation turned to, “is that our North Star? Do we make it look exactly like that? Do we shoot it on film, with those same lenses?” Sometimes your memory of something can be slightly different from reality, so what we did for Rogue One, is we tried to match it to the aesthetic of our “mind’s eye”, and what we remember from Star Wars growing up. For us, thinking about that look – it wasn’t super sharp, but it had depth and clarity. It was soft at times, but not defunct. That is why we chose the format that we did, the ARRI ALEXA 65, paired with these old lenses. For Gareth and I, it felt like it was showing us the film that we remembered as kids.

Director Krennic is confronted by Darth Vader at Fortress Vader on Mustafar.

Did you find other advantages to shooting digital? Was there ever a conversation of shooting it on film?
There were a number of factors. The look we were trying to achieve was one factor, but the other thing we had to balance was the fact that Gareth Edwards is a very hands-on filmmaker. He loves to operate the camera. Watch his film Monsters, which, coincidentally, was the whole reason I wanted to meet Gareth in the first place. When I was called up to do the interview for Rogue One—and of course, I was so excited for the opportunity—I thought, “even if I don’t get this job, I will get to meet the guy that made Monsters. I’ll get to shake his hand, and I’ll get to tell him about the mad respect I have for him and his film.” So when he explained to me that he wanted to make Rogue One with the same spirit that he used to make Monsters, I got really excited. That decision was also part of the reason we chose the ALEXA 65. It had all the film qualities of a much bigger camera, but it was in this bite-size package that you could throw around, and put in cockpits, without having to destroy too many things to get the shot you needed. It was a series of factors, but it all worked in our favor.

A shot from Gareth Edwards’ film, Monsters. Photo courtesy of Magnet.

Gareth has a unique style of shooting, where he’ll go from one take to the next without slating. How did your style integrate with that?
I found it very exciting. In some ways, even though Gareth was my director, he was also my camera operator. I loved helping him build a world where he could achieve anything that he wanted to achieve; be that handheld shots, or very specific tracking shots. That’s what I loved about Rogue One, and how Gareth wanted to make it. There were considerations, of course, but there were moments of freedom – both in freedom of movement, and freedom of camera. It kept everyone on their toes. He would pick up these small moments, maybe something an actor was doing, and he would get the camera in there and capture it. 

Gareth Edwards shoots a scene of Jyn Erso (Felicity Jones) on the set of Rogue One: A Star Wars Story.

Greig, your photography has such a distinct style. What influences did you pull from in designing the palette of Rogue One?
I’m a big fan of world cinema, and I’m a big fan of ’70s cinema. I love Andrei Tarkovsky. I think the way that he makes movies is so beautiful, and so strong. But I also love the way that Kathryn Bigelow shoots her films. I love The French Connection, and the way that it was shot. For Rogue One, we mined the depths of our interests, and the types of films that we loved to watch. Lawrence of Arabia was another influence. These massive, David Lean-style battles. These big frames, and tracking shots, and static shots. Then you combine that with modern-day filmmaking, which, if you look at the evolution of cameras, has changed drastically. Back in the 1950s and ’60s, the cameras were much larger than they are today, and harder to move around. Therefore, films looked a certain way. When you get into the 1970s, when George Lucas was shooting Star Wars, there was not a lot of handheld in that film either. The cameras were not really malleable, and, stylistically, that wasn’t really what he was after anyway. What was good for us though is that we were able to combine our interests and influences. Gareth and I clearly love Star Wars, but that is not the only thing we’re influenced by. French cinema, documentaries, all of that played a part for us.

An image of Baze Malbus (Jiang Wen). Photo courtesy of Greig Fraser. All Rights Reserved.

Tell me about the early conversations around virtual production and LED walls on Rogue One, and how that got us to today with ILM’s StageCraft?
This is where having amazing partners, like Industrial Light & Magic and John Knoll, was very integral. What we were pitching was not a common thing. Emmanuel “Chivo” Lubezki had played around with something similar on the film Gravity, putting actors in an LED box, but we were talking about putting people into ships and big environments. It all stemmed from a lighting problem, and the problem goes like this: “You’ve got somebody in an X-wing above a planet. We’ll use Earth as our stand-in for Scarif. You’ve got a sun source, you’ve got ambient light bounce from Earth, and then you have black space. When you’re in the atmosphere, you have all of this beautiful light coming from above, and below, and from your sun source. That type of scenario is really easy to light. But what happens when you’ve got no ambience above, some ambience below, and then a sun source? Now, imagine those lighting conditions, and pretend you’re in the cockpit of that X-wing, and you do a barrel roll. As you spin around, it’ll transition from light to shadow on your face and around the cockpit. To try and do that in a studio environment, with the lighting we have, is very difficult. You have to put diffusion on all sides to make it nice and soft, so when you sequence the lights over the top, you get the illusion of camera and lighting movement. But what happens when you push light through the diffusion? It bounces back from the other side. With that said, I needed a black side and a light side, but then, of course, that wouldn’t have worked for the barrel rolls, because the light would have needed to move. The one thing we had at the time that could account for all of this was LED screens. When the light turns off on an LED screen, it’s pitch black. It’s the perfect lighting tool for that type of thing. That then progressed into the next question, “if we’re going to use that tool, for that one instance, can it work for other scenarios?
Like flying across Jedha, or soaring through the atmosphere of Scarif?” That’s where this tool, this LED volume, became immensely helpful. People like John Knoll, and the people at ILM, are extremely integral to getting the quality right for something like this. Good VFX can live or die by bad lighting. That’s why ILM’s StageCraft is such a powerful tool for DPs. Because DPs know, if you can get the lighting right, you’re halfway there to getting a good final image.

The partial hull of a T-65B X-wing starfighter used for shooting on the set of Rogue One: A Star Wars Story.

That must have been exciting to figure out?
It was such a great project, because it really upheld the vision that George Lucas had for the future of filmmaking, the “stage of the future”. George theorized that, years down the road, there might come a time when a filmmaker could walk onto a stage, and they could project whatever they wanted up onto the walls, or that those walls could have color-changeable light. They wouldn’t have to light for it, they’d only need to flick a switch. That was the hopeful future that George was thinking about, and now, years later, ILM made that a reality with StageCraft. Filmmakers now have the ability to put any high fidelity, real-time image up on the LED volume. Rogue One was the proof-of-concept for lighting, and that evolved into what ILM, Jon Favreau, and the Lucasfilm team are doing on The Mandalorian, along with so many other exciting projects.

An early LED volume used on the set of Rogue One: A Star Wars Story.

George referenced a lot of things for his aerial combat, including old WWII gun camera footage. How did you approach the ships flying in Rogue One?
While we were shooting, it became obvious where the camera could be, and where it couldn’t be. In Star Wars, there were never any mid-shots of people sitting in cockpits. You don’t have Han Solo in a mid-shot, shooting from outside of the cockpit. You never had a camera floating in space for a shot like that. The camera was always fixed inside the cockpit, or super-wide. There was no in-between. It would never go from a super-wide, into a mid-shot, into a closeup. The only example of that might be the final shot of the Millennium Falcon, just before Lando departs the Medical Frigate, at the end of The Empire Strikes Back. With that said though, we tried to maintain those parameters for Rogue One, and we didn’t want the audiences to have to think about it. I haven’t spoken to George Lucas about it personally, and maybe if he would have had infinite resources he might have shot it differently, but we wanted our film to match A New Hope, and we loved the look. It built our visual understanding of what a Star Wars film should be.

Jon Vander’s “Gold Squadron” forms up as they prepare for their assault on the Shield Gate during the Battle of Scarif.

There’s something intimate about it. When I think about old WWII air combat movies, they did the same thing.
Exactly. And they were forced to shoot like that. You either had a camera in the cockpit, or a camera on another plane. You couldn’t get a plane in close enough to get a reaction from a pilot, or you’d have planes crashing into each other. It was either super-wide, or close. It was purely pragmatic. 

Red Twelve (Richard Glover) participates in the Battle of Scarif.

You did have a unique shot that I loved, used a few times: the camera fixed on the X-wings and Y-wings, directly behind the astromech droid.
Gareth was clever, because even though we had these rules on how we would shoot the ships, we would work off moments from the earlier films to devise new things. There’s that shot of R2-D2 getting blown up in A New Hope by Vader in the Death Star’s meridian trench, and this was kind of an evolution of that shot, while still keeping one foot planted in that A New Hope aesthetic.

A T-65C-A2 X-wing starfighter drops out of lightspeed at the Battle of Scarif.

How did it feel with The Force Awakens shooting alongside your film, and to a degree, The Last Jedi too, when you were shooting pickups?
It was fun. We were all sharing buildings and in each other’s worlds. I’m such a big fan of Star Wars, and I could have walked on set and spoiled everything for myself, but I chose not to. I just wanted to enjoy them as a fan. I did have one thing spoiled for me… someone walked up and told me the scene regarding Han Solo, and my first reaction was, “how dare you do that to me! I wanted to see that in theaters!” [laughs]. We shared some crew from time to time, but we generally had blinders on for Rogue One. While they were making their films in the Skywalker Saga, decades in the future, we were leading right into A New Hope, so ours was almost the equivalent of a period film, in our language. I found that to be very exciting.

Greig Fraser on the set of Rogue One: A Star Wars Story.

What’s your favorite shot, moment, or sequence in the film?
One of my favorites is that wide tracking shot of Jyn Erso (Felicity Jones) making her way through the Massassi outpost on Yavin 4 after she’s “rescued” from the Wobani Labor Camp. I also love the final sequence with Vader aboard the ‘Tantive IV’. When Gareth rang me to tell me we were going to do that, I was ecstatic. It’s such a wonderful sequence. We had the time to prepare it properly. We had the time to rehearse all the action, and to do the lighting tests. We also spent a lot of time figuring out how best to light Vader. As a kid in a grown man’s body, that blew me away. Vader, this dark “shape”, terrified us as kids. It was a dream come true to add to his iconography. I felt very honored and very blessed. Another moment I loved was seeing the full-sized X-wing props in person for the first time. I was transported back to being a kid again, playing with my toy X-wings, but then, of course, my filmmaker brain would kick on, and let me tell you, moving full-sized X-wings around on a set is pretty difficult [laughs].

Vader ignites his lightsaber in an attempt to capture the stolen plans to the Death Star aboard Admiral Raddus’ star cruiser.

I love the sequence you shot in Iceland of Orson Krennic and the Death Troopers making the long trek up to the Erso homestead from the shuttle. His cape flapping in the wind, it was incredible.
I love that shot too. An interesting thing about that sequence is how we found that location. In that part of Iceland, there’s all of this black sand, so they plant this weed to prevent it from blowing onto the roads and destroying the cars. It’s basically useless outside of keeping the sand from blowing about. We found that location on Google Earth while we were driving around, location scouting. I thought it looked so unusual and interesting. As soon as we dropped the moisture vaporators in, those weeds started looking like crops that the Ersos were farming, and it instantly became Star Wars.

Director Krennic and his personal attachment of Death Troopers storm the Erso homestead on Lah’mu.

In a new “Behind the Magic” video released on YouTube and Instagram, enjoy a glimpse behind the virtual production of NBC Sports’ Sunday Night Football show opening, featuring country music star Carrie Underwood. 

“This was yet another successful demonstration of the end-to-end services available through ILM’s virtual production platform, ‘StageCraft™’,” says Chris Bannister, Executive Producer of Virtual Production at Industrial Light & Magic. “Partnering with the creative team from art concept all the way through principal photography, we were able to offer both the creative resources and real-world virtual production experience that maximized the scope and results for the project in a way that only ILM StageCraft can deliver.”

Shot on the StageCraft LED volume by Industrial Light & Magic.

The show opening is the key introduction each week to NBC’s flagship sports broadcast, and each year the creative team at NBC looks for innovative ways to top itself. 2021 was no exception, as they ideated ways to push the boundaries beyond the green screen and inject a new layer of authenticity and integration into the opener. That’s where ILM StageCraft came in.

“It was particularly important to Tripp Dixon and his creative team at NBC Sports to celebrate NFL fans coming back together,” notes Jonathan Howard, Associate Virtual Production Manager at ILM. “This unique opportunity allowed ILM to showcase both the agility, and production-hardened scalability of StageCraft 2.0, evident in the team’s ability to adapt the platform to the compressed schedule of a broadcast package.”

Shot on the StageCraft LED volume by Industrial Light & Magic.

Across the entire production, ILM was able to find unique ways to match the energy and excitement that Sunday Night Football fans are used to, while also expanding upon it in distinct ways. “This was such a rare creative project for me, because I’m typically working with creatures, droids, and spaceships,” said Hal Hickel, Animation Supervisor at ILM. “It was fun in that way though, because it got me out of my wheelhouse, while also allowing me to craft some exciting elements in a grounded production.” 

What makes StageCraft’s application for the Sunday Night Football show opening different from previous applications of the technology is that this project is designed to look both indistinguishable from the real world, and also fantastical in its execution. Hayden Landis, Visual Effects Supervisor at ILM, explains, “We had some incredible streaming elements like fireworks, along with dynamic moving components that we’ve never used before on the volume. Between the creative and technical wizardry that the StageCraft crew conjured up on the day, and the passionate support of the NBC Sports team, I think we really created something special.”

Shot on the StageCraft LED volume by Industrial Light & Magic.

Even with all the magic happening on screen, it can be easy for viewers to miss StageCraft’s sleight of hand because it is so convincing. Hal Hickel elaborates, “To let them backstage in a creative way, we came up with the idea to have Carrie enter the studio in one take, walk up onto the set, and then have the entire StageCraft Volume power-up around her. That small addition really drove home the magic of StageCraft.”

Check out the “Behind the Magic” video below, and don’t miss the show opening this Sunday, October 17 at 5:20pm PST on NBC as the Seattle Seahawks face off against the Pittsburgh Steelers.

Behind the Magic – Sunday Night Football

The Hollywood Professional Association announced the nominees for its annual HPA Awards for post-production, an honor that promotes outstanding creative artistry, and recognizes the achievement of talent, innovation, and engineering excellence. ILM is thrilled to have contributed to three shows nominated in the Outstanding Visual Effects category this year. Nominees include Richard Bluff, Hal Hickel, Jeff Capogreco, Abbigail Keller, and Joe Bauer for The Mandalorian – “Chapter 9: The Marshal”, David Seager, Alexandra Greene, George Kuruvilla, Dan Mayer, and Dan DeLeeuw for Loki – “Journey Into Mystery”, and Chad Wiebe for his work on Jungle Cruise.

“It is an absolute honour to have been nominated for our work,” said Alexandra Greene, Visual Effects Producer at ILM. “It’s hard to put into words the gratitude I have for all the ILM artists and production crews who poured their hearts and souls into bringing the ‘Void’ to life on Loki, along with all of the larger-than-life creatures that reside there. Every day I find myself amazed by both the innovation and creativity that comes from our teams here at ILM, including the work by our fellow ILM nominees for The Mandalorian, and Jungle Cruise. Congratulations are in order!”

Janet Lewin, ILM’s General Manager and Senior Vice President notes, “I could not be more proud of the nominees and their teams that worked on these incredible shows,” adding, “I’m continually in awe of our team’s technical ingenuity, imagination, and relentless spirit, and I’m so pleased to see their hard work recognized by the HPA.”

The annual HPA Awards are returning as an in-person event this year, presented at a live gala on Thursday, November 18th at the historic Hollywood Legion Theater. Tickets are on sale now.

In a new video released by ILM on our YouTube channel, join Visual Effects Supervisor, Richard Bluff, as he shares a peek behind the curtain of the effects of The Mandalorian: Season 2, winner of 7 Emmy® Awards including Special Visual Effects, Sound Mixing, Cinematography, Prosthetic Makeup, Stunt Coordination, Stunt Performance, and Music Composition.

For its sophomore outing, Lucasfilm’s hit Disney+ series built upon the groundbreaking technical and artistic achievements accomplished during season one, combining traditional methodologies with ever-advancing new technologies. The team also increased the physical size of the ILM StageCraft™ LED Volume, which would again be used for over half of all scenes. This season also marked the debut of ILM’s state-of-the-art real-time cinema render engine, Helios. The high-resolution, high-fidelity engine was used for all final pixel rendering displayed on the LED screens and offers unmatched performance for the types of complex scenes prevalent in today’s episodic and feature film production.

Practical creature effects have been a vital part of the aesthetic and charm of the Star Wars universe since 1977, and for season two, the effects team realized over 100 puppeteered creatures, droids, and animatronic masks, which included the beloved Tatooine Bantha, realized as a ten-foot-high puppeteered rideable creature. 

Practical miniatures and motion control photography were used once again for scale model ships, as well as miniature set extensions built for use in ILM’s StageCraft LED volume. Stop-motion animation was also utilized for the Scrap Walker at the Karthon Chop Fields. The greater Krayt dragon on Tatooine was realized as a six-hundred-foot computer-generated creature that would swim shark-like through the sand environment by way of a liquefaction effect, wherein the sand would behave like water. 

We would like to acknowledge the care and dedication that the team here at ILM put into the show, along with our partners at Legacy Effects, Hybride, Image Engine, Important Looking Pirates, Ghost VFX, Lola, Stereo D, Tippett Studio, Base FX, Raynault, Virtuos, and Yannix.

We hope you enjoy this look inside The Mandalorian: Season 2.

The Jedi Academy is a unified, global, 12-week paid internship and trainee program for junior talent at Lucasfilm, Industrial Light & Magic, and ILMxLAB, created for students and graduates. The program is a once-in-a-lifetime experience to learn in a dynamic and creative production environment, focused on developing the next generation of diverse talent across art, public relations, and technology.

“After playing Vader Immortal, I knew that I wanted to help make those kinds of games and tell those kinds of stories,” said Gary Walker, intern at ILMxLAB. “So if you want to do something, go for it. Ask how you can get there because there are people willing to help you if you’re willing to go out and you’re willing to do it.”

Jedi Academy interns are able to gain valuable, real-world experience through hands-on training and mentorship across day-to-day production work. Trainees also gain valuable skills through intensive classes and immersive learning modules taught by industry experts from a variety of disciplines. The trainees are exposed to fundamental artistic concepts as well as key business skills that support their transition into the industry. 

“Coming into this I was very interested in a lot of things; VR, animation, video production, film production,” said Jared Tan, Video Production Intern at Lucasfilm. “And now coming out of the internship, I know what skills I need to polish so hopefully one day I can come back here to work and help this ecosystem of filmmakers and creative people at this amazing company.”

Lucasfilm is committed to improving the diversity of our studios, and programs like our Jedi Academy help us provide opportunities to a broad range of applicants at the start of their careers. The experience is perhaps best described by Alexandria Frank, Studio Talent Group Intern at Lucasfilm, “Just the sheer intention and passion that comes with everyone working here, it radiates through everything.”

The most recent Jedi Academy interns at ILM focused on virtual production and the company’s StageCraft technology, an ever-growing part of its business. ILM is preparing to launch another Jedi Academy, focused on the San Francisco and Vancouver studios, soon.

Would you like to Join the Force? Keep your eyes peeled on our Careers page for when we announce our next Jedi Academy.


The Television Academy announced its winners for the 73rd Annual Primetime Creative Arts Emmy® Awards over the weekend, celebrating a diverse group of talent from across television. ILM’s creative teams were honored with an award for Outstanding Special Visual Effects in a Season or a Movie for The Mandalorian, alongside nominations for their contributions to The Boys, WandaVision, and The Falcon and the Winter Soldier. This is the second year in a row that the hit Lucasfilm series has received the Emmy Award for Special Visual Effects, a testament to the groundbreaking work that the show is known for.

Animation Supervisor Hal Hickel, VFX Producer Abbigail Keller, VFX Supervisor Joseph Kasparian (Hybride), and Environments Supervisor Enrico Damm in attendance at the 73rd Annual Primetime Creative Arts Emmy® Awards.

The visual effects team representing this win included Joe Bauer, Richard Bluff, Abbigail Keller, Hal Hickel, Roy K. Cancino, John Knoll, Enrico Damm, John Rosengrant, and Joseph Kasparian. Special recognition is also in order for ILM Producer Stacy Bissell, ILM Animation Supervisor Paul Kavanagh, and the entire StageCraft team for their incredible contributions.

Richard Bluff, Visual Effects Supervisor on The Mandalorian, added, “I want to take this opportunity on behalf of the visual effects team to congratulate all the artists, production, and the technical support staff who contributed to the Visual and Special Effects on season two of The Mandalorian. We continue to be in awe of the spectacular work and the effortless partnerships we enjoy with all of our vendor partners. The time and effort invested in the visuals by ILM, Hybride, Image Engine, Important Looking Pirates, Ghost VFX, Lola, Stereo D, Tippett Studios, Base FX, Raynault, Virtuous, and Yannix has been exceptional and this recognition is fully deserved. Everyone associated with the show couldn’t be happier or more grateful for what we all achieved together.”

In addition, The Mandalorian was also recognized with Emmys in the following categories: Sound Mixing, Cinematography, Prosthetic Makeup, Stunt Coordination, Stunt Performance, and Music Composition.

The 73rd Emmy Awards will be hosted by Cedric the Entertainer at L.A. Live in Downtown Los Angeles, California. Executive Producers Reginald Hudlin and Ian Stewart and Director Hamish Hamilton have been selected to helm the show for production companies Done+Dusted and Hudlin Entertainment. Highlights from the 73rd Annual Creative Arts Emmy Awards will be broadcast on Saturday, Sept. 18 (8:00 PM ET/PT) on FXX. The 73rd Primetime Emmy Awards will be broadcast live on Sunday, Sept. 19 (5:00-8:00 PM PST) on the CBS Television Network, as well as streaming live and on-demand on Paramount+.

On this day, 55 years ago, Star Trek was born: a franchise that represented the hope of what space—the final frontier—could mean for all of humanity. ILM has played a significant part throughout Star Trek history, including the creation of the first completely computer-generated cinematic image sequence in a motion picture. Read on to learn about other exciting work we’ve brought to life for Star Trek.

Nearly a year before Return of the Jedi wrapped production, Industrial Light & Magic began work on another famous space film: Star Trek II: The Wrath of Khan. ILM crafted many of the effects for the motion picture and even built scale models of both the ‘Enterprise’ and the ‘Reliant,’ the first non-Constitution-class Federation starship ever seen in the series. As the script called for the Reliant and Enterprise to inflict significant damage on one another, ILM developed techniques to convincingly simulate the destruction without physically damaging the delicate models. Rather than moving the models on blue screen during shooting, the team panned and tracked the VistaVision camera to give the illusion of movement, a technique ILM pioneered for the Star Wars trilogy and further refined during seasons one and two of The Mandalorian. One of the most groundbreaking sequences of the film was ILM’s animation for the demonstration of the Genesis Device on a barren planet. The first concept for the shot took the form of a laboratory demonstration in which a rock would be placed in a chamber and turned into a delicate flower. Effects supervisor Jim Veilleux wanted the sequence’s size and scope expanded to show the Genesis effect taking over an entire planet, a challenge that ILM’s Computer Graphics group was up for. The team introduced the novel technique of “particle systems” for the sixty-second sequence, going so far as to ensure that the stars visible in the background matched the view from a real star light-years from Earth. The animators hoped it would serve as a calling card for the studio’s talents. Their hard work paid off, as the group would later be spun off to become the foundation of Pixar Animation Studios.

The first completely computer-generated cinematic image sequence in a motion picture. The Genesis Device terraforming a barren planet in Star Trek II: The Wrath of Khan (1982) directed by Nicholas Meyer.
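The core idea behind the “particle systems” technique pioneered for the Genesis sequence — rendering fuzzy phenomena like fire as clouds of short-lived moving points rather than as surfaces — can be sketched in a few lines. The toy below is illustrative only, not ILM’s production code, and every parameter (spawn count, lifetimes, gravity) is made up for the example.

```python
import random

GRAVITY = (0.0, -9.8, 0.0)
DT = 1.0 / 24.0  # one film frame at 24 fps

def spawn(n, origin=(0.0, 0.0, 0.0)):
    """Create n particles bursting outward from an emitter point."""
    particles = []
    for _ in range(n):
        vel = tuple(random.uniform(-1.0, 1.0) for _ in range(3))
        particles.append({"pos": origin, "vel": vel,
                          "life": random.uniform(1.5, 3.0)})
    return particles

def step(particles):
    """Advance every particle one frame; drop the ones whose life expired."""
    alive = []
    for p in particles:
        pos = tuple(x + v * DT for x, v in zip(p["pos"], p["vel"]))
        vel = tuple(v + g * DT for v, g in zip(p["vel"], GRAVITY))
        life = p["life"] - DT
        if life > 0:
            alive.append({"pos": pos, "vel": vel, "life": life})
    return alive

# Simulate one second of a small burst.
system = spawn(100)
for _ in range(24):
    system = step(system)
```

In production, each frame’s surviving particles would be shaded and drawn (as streaks of light, in the Genesis case), with thousands of emitters covering the planet’s surface.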

For Star Trek III: The Search for Spock, ILM was tapped early in the production, allowing visual effects supervisor Ken Ralston and his team additional time to plan their shots. Leonard Nimoy credited the early involvement of ILM with expanding the creative input into the film’s design and execution. It became apparent to ILM early on that The Search for Spock required far more design and model work than had been required for The Wrath of Khan. The Earth Spacedock, for example, was a design intended to expand the scope of Star Trek. After approving a small three-dimensional maquette of the final design, ILM created a spacedock model that was over six feet tall. Rather than embarking on the painstaking process of wiring thousands of fiber optic lights, ILM had the innovative idea to construct the model out of clear plexiglass and then paint it. Once complete, the painted finish was etched off in sections, creating the illusion of windows, with an inner core of neon light illuminating the resulting holes. The interior of the dock was simulated by an additional model that measured 20 feet in length, with a center section that could be removed. The interior illumination was generated by fiber optics and powerful lighting elements.

VFX Supervisor Ken Ralston and the team at Industrial Light & Magic ready the Klingon Bird-of-Prey for Star Trek III: The Search for Spock (1984) directed by Leonard Nimoy.

Finding himself in the director’s chair a second time for Star Trek IV: The Voyage Home, and building on the visual effects success of The Search for Spock, Leonard Nimoy and the production team once again approached ILM early in development to help create storyboards for the complex optical effects sequences required. Many shots throughout the film were brought to life through matte paintings, both to extend backgrounds and for establishing shots, which greatly cut down on cost compared to building sets from scratch. Matte supervisor Chris Evans attempted to create paintings that felt less contrived and more real, in stark contrast to the natural instinct of filmmaking, which is to place important elements in an orderly fashion. Evans’ reasoning was that photographers tend to “shoot things that were odd in some way,” so the final result would end up looking far more natural. The task of establishing the location and atmosphere of Starfleet Headquarters also fell to Evans and his team, along with famed artist Ralph McQuarrie, who had the daunting task of making San Francisco feel both teeming and futuristic, while still “of-our-world.” The scenes of the Klingon Bird-of-Prey on Vulcan were created through combinations of live-action footage (actors on a clay-covered set in the Paramount parking lot, with backdrops) and matte paintings for the ship itself, as well as the rocky background terrain. The scene of the ship’s departure from Vulcan for Earth was more difficult to accomplish: the camera pans behind live-action characters to follow the ship as it leaves the atmosphere, while other elements like flaming pillars and a blazing sun had to be integrated into the shot.

ILM Miniatures Camera Operator Pat Sweeney takes a light meter reading for the USS Enterprise on the stage during production on Star Trek IV: The Voyage Home (1986) directed by Leonard Nimoy.

With Nicholas Meyer back at the helm for Star Trek VI: The Undiscovered Country, the majority of the visual effects again fell to the pioneering team at ILM, this time under the supervision of Scott Farrar (who had previously served as a visual effects cameraman on the first three Star Trek films) as well as animator Wes Takahashi. ILM’s computer graphics division was called up again for the creation of three sequences, including the explosion of Praxis. Meyer looked to the immense wave that strikes the ship in The Poseidon Adventure to inform the scale of the shock wave. To accomplish this, the team at ILM built a lens flare simulation of a plasma burst composed of two expanding disc shapes, then layered in swirling details that were texture-mapped to the surface. Farrar settled on the preliminary look of the wave, and graphics supervisor Jay Riddle used Adobe Photoshop on his Apple Macintosh to build the final color scheme for the effect. For the wave that hit the ‘Excelsior,’ the ILM team pulled out all the stops, because in Riddle’s words, “this thing had to look really enormous.” The team manipulated two curved pieces of computer geometry, expanding them as they approached the camera’s view. Textures that changed every frame were added to the main wave body and then looped over the top of it to create the sensation of blistering speed. Motion control footage of the Excelsior was then scanned into the computer system and made to interact with the digital wave. The results were extraordinary. In fact, ILM’s ring-shaped “Praxis Effect” shockwave has gone on to become a commonplace feature in science fiction films depicting the destruction of massive objects.

The USS Enterprise-A departs Earth Spacedock in Star Trek VI: The Undiscovered Country (1991) directed by Nicholas Meyer.

As the franchise shifted back to television, ILM was once again counted on to deliver on its model-making and visual effects expertise, including the design and construction of the ‘USS Enterprise-D’ for Star Trek: The Next Generation. The models were exceptionally detailed and included both two-foot and six-foot versions. ILM was also integral to the development of the “jump to warp” special effect, which was a cornerstone of the series throughout its nearly seven-year run.

Anticipation was high for the legendary team-up of William Shatner and Patrick Stewart on the motion picture Star Trek: Generations, and ILM was up for the task. CG supervisor John Schlag recalled that it was easy to enlist ILM staff members who wanted to work on Star Trek: “it gave me a chance to be a part of the whole Trek thing, not to mention ILM is practically an entire company filled with Trek geeks.” Meanwhile, effects supervisor John Knoll’s team was charged with storyboarding the elaborate effects sequences. Previous Star Trek films had used conventional motion control techniques to film multiple passes of the starship models and miniatures on tracks. For Generations, the ILM artists began using computer-generated imagery and models for certain shots, a methodology that they were becoming well-versed in. No physical shooting models were built for the refugee ships, and other memorable CG elements included the solar collapses and the Veridian III planet. John Knoll and his team used a digital version of the ‘Enterprise-D’ for the warp effect, allowing them to keep consistent lighting throughout. While digital techniques were used extensively throughout the picture, ILM kept one foot firmly planted in its roots by way of the scale miniature of the observatory, built by model shop foreman John Goodson.

Industrial Light & Magic’s John Knoll works on a sequence featuring the USS Enterprise-D for Star Trek: Generations (1994) directed by David Carson.

As Jonathan Frakes took up the reins on Star Trek: First Contact, ILM was brought on to develop the new Sovereign-class ‘Enterprise-E,’ designed to be leaner, sleeker, and mean enough to answer any Borg threat the Federation might encounter. ILM fabricated a nearly eleven-foot miniature over a five-month period. Hull patterns were carved out of wood, then cast and assembled over an aluminum armature. The model’s panels were painted in an alternating matte and gloss scheme to add texture. The team also cut the windows using a laser, and then inserted slides of the sets behind the window frames to make the interior seem more three-dimensional when the camera tracked past the ship. In previous films, Starfleet’s range of capital ships had been predominantly represented by the Constitution-class ‘Enterprise’ and just five other ship classes, but for First Contact, the team created five new ship classes. ILM VFX supervisor John Knoll insisted that First Contact’s space battle show the full armament of Starfleet’s ship configurations. He reasoned, “Starfleet would probably throw everything they could at the Borg, including ships we’ve never seen before.” ILM was also tasked with imagining what the Borg assimilation of a Starfleet crew member might look like. Visual effects art director Alex Jaeger came up with a set of cables that sprang from the Borg’s knuckles and buried themselves in the crewmember’s neck. Wormlike tubes would burrow through the victim’s body and mechanical devices would break the flesh. The entire transformation was created using computer-generated imagery. The wormlike geometry was animated over the actor’s face, then blended in with the addition of a skin texture that was layered over the animation. The gradual change in skin tone was then simulated with shaders.

Painter Kim Smith applies finishing touches to the USS Enterprise-E miniature for Star Trek: First Contact (1996) directed by Jonathan Frakes.

J.J. Abrams’ Star Trek marked the first Trek film ILM worked on that was composed entirely of digital ships. Modeled by ILM’s Bruce Holcomb, the ‘USS Enterprise’ for this film was intended by Abrams to be a merging of its design in the original series and the refitted version from the original film. Abrams had fond memories of the revelation of the Enterprise’s refit in Star Trek: The Motion Picture, because it was the first time the ship felt tangible and real to him. The iridescent pattern on the ship from The Motion Picture was maintained to give the ship depth, while ILM texture artist John Goodson artfully applied the “Aztec” interference pattern from The Next Generation. Goodson recalled Abrams also wanted to bring a “hot rod” aesthetic to the new Enterprise, which greatly influenced the final design. Effects supervisor Roger Guyett also added more moving parts to the ship, including a new dish that would expand and move, as well as fins on the nacelle engines that would split when the ship jumped to warp. Carolyn Porco of NASA was consulted by ILM on planetary science and imagery. ILM’s technical team developed a new digital pyro tool allowing animators to realistically recreate what an explosion might look like in space: short blasts that suck inward, leaving debris from a ship floating. For the elaborate sequence of the imploding planet Vulcan, ILM used the same explosion tool to simulate its breakup, allowing the animators to manually add layers of rock debris and wind swirling into the planet.

The USS Enterprise departs Starbase 1 and prepares for its warp to Vulcan in Star Trek (2009) directed by J.J. Abrams.

For J.J. Abrams’ return on Star Trek: Into Darkness, ILM contributed over 700 of the motion picture’s visual effects shots, including the death-defying scenes of the volcanic planet Nibiru erupting in the opening sequence, the secret Federation ship ‘Vengeance,’ and its cataclysmic destruction as it crashes into the San Francisco Bay and tears through the city. ILM also created the memorable chase between Spock and Khan through the streets of San Francisco and the flying truck sequence, along with further refinements to the iconic ‘Starship Enterprise.’ Guided by visual effects art director Alex Jaeger, ILM’s model, texture, and lighting team created hundreds of buildings to fill out San Francisco, creating a living, breathing metropolis with a striking sense of design and vision for the future. As a nod to the importance that the team at ILM placed on Star Trek, the location of Starfleet’s San Francisco headquarters in the film was situated in the exact real-world location of ILM’s headquarters at the Presidio.

James T. Kirk (Chris Pine) and Leonard McCoy (Karl Urban) escape the volcanic eruption on Nibiru in Star Trek: Into Darkness (2013) directed by J.J. Abrams.

As we celebrate 55 years of Star Trek, it has been our honor here at Industrial Light & Magic to help filmmakers from across the franchise explore strange new worlds. To seek out new life and new civilizations. To boldly go where no man has gone before. So from all of us here at ILM, we extend a heartfelt Vulcan salute to the fans around the world: live long, prosper, and happy Star Trek Day!

The Academy Software Foundation (ASWF), the preeminent foundation for open source software development in the motion picture and media industries, today announced that MaterialX has been accepted by the Academy Software Foundation’s Technical Advisory Council (TAC) as the seventh Foundation-hosted project.

MaterialX originated at Lucasfilm in 2012, and it has grown into the central format for material description at Industrial Light & Magic (ILM) since the production of Star Wars: The Force Awakens. The project was released as open source in 2017, with companies including Sony Pictures Imageworks, Pixar, Autodesk, Adobe, and SideFX contributing to its ongoing development. In recent years, MaterialX has been incorporated into widely used applications and standards including Maya, 3ds Max, Substance Designer, Arnold, RenderMan, Autodesk Standard Surface, and Universal Scene Description.

“We initially developed MaterialX to solve a need we had across Lucasfilm and Industrial Light & Magic to have truly portable assets, with look-development information that could be shared across applications, both for real-time and offline, but it’s been even more exciting to see how it’s been embraced by software vendors and pipeline developers alike. With MaterialX becoming an official ASWF project, we look forward to seeing that momentum continue to grow and help solve one of our biggest industry challenges,” said Francois Chardavoine, VP of Technology at Lucasfilm and ILM.

“Integration into the Academy Software Foundation marks an important step forward for the MaterialX project, broadening the lines of communication with closely-related standards such as OpenColorIO and Open Shading Language, and providing a strong platform for new studios, companies, and teams to contribute to MaterialX in the future,” said Jonathan Stone, Senior Software Engineer at Lucasfilm’s Advanced Development Group and the lead developer of MaterialX.

“MaterialX is a crucial piece of technology as it addresses an industry pain point of smoothing the transfer of look development information between various applications and renderers. MaterialX solves the need for a common, open standard in this space and represents an enormous value to end-users within animation studios, visual effects studios, and third-party vendors,” said David Morin, Executive Director of the Academy Software Foundation. “With the support of the broader Academy Software Foundation community, we hope the ecosystem that supports MaterialX will grow, further validating the open standard within the industry.”

MaterialX is a key technology in the representation of materials in content pipelines for computer graphics. Its capabilities for expressing physically based shading models and generating shading code have strong synergy with the ASWF’s Open Shading Language, and its interpretation of color spaces is closely aligned with the approach in OpenColorIO and ACES.
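As a rough illustration of what a MaterialX document contains, the sketch below uses only Python’s standard library to assemble a minimal material: a standard_surface shader wired into a surfacematerial node. The element and input names follow the public MaterialX specification, but the “brass” values and node names are hypothetical, and real pipelines would use the MaterialX library itself rather than hand-built XML.

```python
import xml.etree.ElementTree as ET

# Root <materialx> element; 1.38 is a published spec version.
root = ET.Element("materialx", version="1.38")

# A physically based shader node with a couple of illustrative inputs.
shader = ET.SubElement(root, "standard_surface",
                       name="SR_brass", type="surfaceshader")
ET.SubElement(shader, "input", name="base_color",
              type="color3", value="0.9, 0.8, 0.5")
ET.SubElement(shader, "input", name="metalness",
              type="float", value="1.0")

# A material node that binds the shader to geometry.
material = ET.SubElement(root, "surfacematerial",
                         name="M_brass", type="material")
ET.SubElement(material, "input", name="surfaceshader",
              type="surfaceshader", nodename="SR_brass")

mtlx = ET.tostring(root, encoding="unicode")
print(mtlx)
```

Because the format is declarative XML rather than renderer-specific shader code, a document like this can travel between applications, with each renderer generating its own shading code from the node graph.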

The Academy Software Foundation will maintain and further develop MaterialX with oversight provided by a Technical Steering Committee, including members from Lucasfilm, Pixar, Sony, Autodesk, Adobe, Foundry, SideFX, NVIDIA, and Epic Games. All newly accepted projects start in incubation while they work to meet the high standards of the Academy Software Foundation and later graduate to full adoption. This allows the Academy Software Foundation to consider and support projects at different levels of maturity and industry adoption, as long as they align with the Foundation’s mission to increase the quality and quantity of contributions to the content creation industry’s open source software base.

Learn more about MaterialX and get involved:

Project Site: http://www.materialx.org/

GitHub: https://github.com/materialx 

Twitter: https://twitter.com/MaterialXcg

Launchpad, a first-ever live-action shorts incubator program that provides the opportunity for filmmakers from underrepresented backgrounds to produce live-action short films for Disney+, enlisted the talent at ILM to provide visual effects for the films. For its part, ILM used the project as an opportunity to give up-and-coming talent from underrepresented groups a chance to expand their experience into higher-level roles.

The visual effects for American Eid, The Last of the Chupacabras, Dinner is Served, and Growing Fangs, each centering on the theme of “discover,” were supervised by Beth D’Amato. A 24-year veteran of ILM, D’Amato has previously been an artist and supervisor on dozens of films, including the Star Wars prequels, Star Trek, Lucy, Rogue One: A Star Wars Story, The Revenant, Captain Marvel, Jurassic World, and Black Widow. Also contributing to the project as visual effects producer was Shivani Jhaveri, who had previously been a production manager on Lucasfilm’s hit Disney+ series The Mandalorian and on Star Wars: The Rise of Skywalker.

“This project was special not only because it offered amazing opportunities for underrepresented filmmakers to tell their stories, but that diversity went so much farther than what you see on screen,” explained Beth D’Amato, adding “On set, there were so many women and people of diverse backgrounds on the crew as well, and the bond we felt because of it was palpable. We all knew we were a part of something truly unique.”

Shivani Jhaveri said, “Disney Launchpad was a project that showcased a unique variety of voices throughout the scope of the show. It was truly a pleasure working with filmmakers who were so closely connected to their stories to help them elevate their narratives through visual effects. We were lucky to have directors that could think outside the box and work with us to accomplish creative solutions within our time and budget considerations.”

Unlike many studio diversity initiatives, Disney Launchpad is not aimed at first-time directors, but rather experienced filmmakers poised to make the leap to studio filmmaking. Disney’s stated goal for season 1 of the Launchpad shorts incubator was to tell six deeply meaningful personal stories straight from the filmmakers’ hearts, amplified with the scale and reach that only Disney has. The program returns for a second season with some exciting new changes. For the first time ever, writers will be eligible to apply for the program as well as directors. This means filmmakers can apply as a writer, as a writer/director, as a director teamed with a script someone else wrote, or as a director alone. Applications opened May 10th and will close on June 11th, with the program starting in December of 2021.

Inside ILM: Creating the Razor Crest received the Silver Telly Award at the 42nd Annual Telly Awards in the short-form non-broadcast documentary category. The Telly Awards recognize creators who are bringing high-caliber stories to screens globally. The mini-documentary chronicles the design, development, and construction of the miniature Razor Crest ship created for season one of Lucasfilm’s hit Disney+ series The Mandalorian. The piece also delves into the return of motion control photography, a technique that hadn’t been used at the company in over a decade, despite ILM’s innovations in the space during its first two decades in business.

ILM visual effects team members (from left) Landis Fields, John Knoll, and John Goodson ready the miniature Razor Crest for a motion control shot on the ILM stage as Chris Hawkinson (operating camera) documents the action.

Directed by ILM’s Chris Hawkinson, the short features The Mandalorian creator and executive producer, Jon Favreau, and such luminaries as Doug Chiang, Rene Garcia, Ryan Church, Jay Machado, Richard Bluff, John Knoll, Landis Fields, Marissa Gomes, John Goodson, Lorne Peterson, Bill George, and Hal Hickel.

The Telly Awards annually showcase the best work created for television and across video, for all screens. Drawing over 12,000 entries from all 50 states and five continents, the Telly Awards honor work from some of the most respected advertising agencies, television stations, production companies, and publishers from around the world. The awards recognize work that has been created on behalf of a client, for a specific brand and/or company, or self-directed as a creative endeavor.

You can check out the documentary below.

We were greatly saddened to learn over the weekend that our friend and colleague David Owen, a still photographer and documentarian at ILM for nearly 25 years, had passed away. David joined ILM in 1988, having relocated from his native England. He was one of a small cadre of people who worked behind the scenes in ILM’s Still Photo Department documenting the company’s work over the years. It wasn’t so much the work he documented, but the people. He loved making images that told a story – snapshots in time of what the amazing talent gathered around him were creating.

We often hear from people who say that the books chronicling ILM greatly impacted them and inspired many to want to get into the industry. ILM Still Photographer David Owen was a tremendous contributor to our second book, Into the Digital Realm, as well as our third book, Industrial Light & Magic: The Art of Innovation, providing invaluable research assistance and, of course, much of the photo documentation for the projects featured in the books. He documented ILM’s work on a great many projects, including Ghostbusters II, Back to the Future II, Death Becomes Her, Casper, Star Trek: First Contact, Indiana Jones and the Kingdom of the Crystal Skull, The Lone Ranger, and many others. While we all work behind the scenes at ILM, some work a bit further back than others. David captured many of the iconic behind-the-scenes moments, be they in the model shop, on stage, or with the digital teams, that help tell the story of what the talented teams at ILM do.

David was often tasked with going on set to shoot reference photography of wardrobe, props, and the actors themselves, in the process of making and cataloging thousands of images that would be used by artists back at the studio to recreate the clothing, objects, and people down to the smallest detail. On set he was the consummate professional and possessed a quiet demeanor that managed to make his sense of humor all the funnier when his dry wit would come through.

Texture Supervisor and frequent collaborator of David’s, Jean Bolte, shared, “Some of us at ILM do work that ends up on the screen. We all strive to make it look as good as possible. If we succeed, it’s due in large part to those who have worked hard behind the scenes to make it happen. David Owen was one of those people,” she continued, “over the years I watched him shoot calmly and professionally, photographing reference on everything from the inside of a dumpster, to live bats, to actors such as Arnold Schwarzenegger. After he set his lights and triple-checked his equipment, there was a story to tell, keeping us entertained. Thanks, Dave, you made us look good.“

It is with great sorrow that we say goodbye to David, but we know that his work lives on in the images he made, and those images will continue to inspire for generations to come.

Long before the precision of computer-aided design and the perfection afforded by 3D printing, there was Ira Keeler. In the world of special effects miniatures and modelmaking, Ira stood out as a master craftsperson possessing the uncanny ability to sculpt even the most complex shapes from a block of solid wood with near-perfect symmetry. He could often be found at his work table in the model shop chewing on his pipe, wood shavings piled ankle-deep around him, as his hand traced the curve of an aircraft fuselage or the cockpit of a Formula One racer, seemingly able to detect the most minute fluctuation in surface detail. From his tools, he’d grab a tiny block plane, a chisel, an X-acto, and a piece of sandpaper and methodically remove any excess material until his trained hands could no longer detect a variance.

Ira contributed to many, many films during his time with ILM. Joining the company in 1982, he sculpted such iconic vehicles as Doc Brown’s DMC-12 DeLorean for the Back to the Future trilogy, the X-2 aircraft for Space Cowboys, the saucer pattern for the Enterprise-E for Star Trek: First Contact, and a host of others too numerous to mention. He contributed to the special effects magic of films such as three of the Star Trek features, The Rocketeer, Roger Rabbit, Always, the original Indiana Jones trilogy, Jurassic Park, Men in Black, and Starship Troopers, in addition to other projects such as commercials and the original Star Tours attraction at the Disney Parks.

“More than anyone I’ve ever worked with, Ira had an innate talent for creating exacting patterns with a replication of detail down to the smallest element,” noted John Goodson, who worked with Ira on numerous projects in the model shop over the years. “He had the capacity to understand three-dimensional space in a way few others do, and not only that, but he could replicate it in any scale needed.”

ILM Visual Effects Supervisor Bill George, who worked in the model shop alongside Ira as a model maker for decades, noted, “Ira was a master and all of us in the model shop were in awe of his skill. He could carve the most complex patterns out of wood and make it look effortless. Many of his patterns were kept and displayed after their initial use because they were so beautiful and impressive. He contributed so much to so many classic ILM model projects. Ira was a lovable gentleman with a sly sense of humor and a big heart.”

Ira’s artistry also left an indelible mark on a galaxy far, far away. For Star Wars: Return of the Jedi he not only sculpted but helped creatively interpret the designs for the Red Guards, numerous weapons, and some of the beloved spacecraft as well. He sculpted the Scout Trooper helmets with such precision that Lorne Peterson, longtime Model Shop Supervisor noted, “with the most basic of carving hand tools Ira could achieve sculpts and create patterns for us that would rival what others could do when given twice the time with all of our modern power tools at their disposal. It was really remarkable. He was a true artisan, and a quiet, kind soul that I’ll never forget.”

Ira’s contribution to Star Wars continued with his work on the prequel Episodes I and II, after which he retired from the model shop and dove headlong into passion projects and hobbies such as designing and hand-crafting beautiful model rockets and restoring model trains to absolute perfection for collectors the world over.

To know Ira was to love Ira. His generosity in sharing his craft and his decades of experience with any artist that showed interest was a hallmark of what made and continues to make ILM, ILM. Ira’s DNA, like so many amazingly talented individuals who have done some of their life’s best work at ILM, continues to touch every project the studio contributes to, and for that, we are honored. Ira is survived by his wife Joy, their two children Dawn and Shawn, and grandchildren Jessica and Matthew and will be missed dearly.

For the second season of Lucasfilm’s hit Disney+ series, The Mandalorian, Industrial Light & Magic reengineered their StageCraft virtual production platform, rolling out version 2.0 and introducing, among other things, Helios, Industrial Light & Magic’s first cinematic render engine designed for real-time visual effects. Engineered from the ground up with film and television production in mind, Helios offers incredible performance, high-fidelity real-time ray tracing, and the ability to rip through scenes of unparalleled complexity, all while leveraging ILM’s unrivaled color science. It was designed from the start to work seamlessly with ILM StageCraft.

The purpose-built, production-hardened platform allows filmmakers to explore new ideas, communicate concepts, and execute shots in a collaborative and flexible production environment.

Check out the featurette on our YouTube channel.

“I strive to make it easier to innovate — to create a supportive environment for groundbreaking creativity and excellence in production,” explains Janet Lewin, SVP and general manager of Industrial Light & Magic.

Formerly vice president of Visual Effects at Lucasfilm, Lewin has spent a combined 26 years at the two companies. She currently oversees the visual effects and StageCraft business at Lucasfilm as well as ILM’s five studios, focusing primarily on operations and production. Lewin is an experienced executive and producer with numerous credits to her name, most recently serving as a producer on both seasons of Lucasfilm’s ground-breaking hit series, The Mandalorian, for Disney+.
A graduate of Boston University with a degree in PR and Mass Communications, Lewin explains, “I always knew I wanted to work in film and entertainment.” She recalls being mesmerized by the visual effects work she saw in Terminator 2: Judgment Day: “I remember watching it and I just couldn’t understand how the T-1000 walked through the bars in the psychiatric hospital. I was so taken by what I had seen that I watched all the behind-the-scenes material, and that’s where I learned about ILM, Dennis Muren, and what his team had created.”

In 1994, she was hired as a temporary assistant in ILM’s purchasing department. “The job consisted mostly of filing purchase orders,” she recalls. “But it was my first real exposure to filmmaking and visual effects at the same time. All the brilliant people and incredible projects at ILM hit the sweet spot for my interest in production, innovation, and the business of filmmaking.”

Lewin spent the next two decades of her career at ILM working her way up the ranks to ultimately become Global Head of Production in 2010. In 2013, she moved to Lucasfilm to oversee Visual Effects for the newly rebooted Production studio, at the same time serving as the overall visual effects producer on all of the new Star Wars films, collaborating with directors such as JJ Abrams, Rian Johnson, Gareth Edwards, and Ron Howard over the next seven years. “It was a fantastic opportunity for me to partner directly with filmmakers and gain studio-side knowledge and empathy for that side of the coin,” she mentions.

That experience, combined with her vast tenure at ILM, positioned her well to take on this new adventure as GM at ILM, partnering with ILM Chief Creative Officer, Rob Bredow, to run the global organization. “I’m mostly excited about the incredible talent we have at ILM, the innovative StageCraft technology, our entree into episodic work with our amazing television division, ILM TV, and the diverse content on the horizon – not only from Lucasfilm, but from all of our clients. We are in a unique position to push the boundaries of what’s possible in real-time visual effects, immersive entertainment, and animated features, while we continue to innovate and grow our capabilities with regard to our traditional effects work.”

On her collaboration with Bredow, Lewin says, “We work just like a visual effects producer and supervisor but on a much larger scale. I focus more on how to execute the business and shows successfully while Rob’s focus is more on innovation and technology. We have a similar aptitude for driving projects and passion for the business that overlap in both areas, so combined with our different experiences and styles, that makes for a great partnership.”

“So often ILM is on the bleeding edge in terms of developing technologies that go on to change how stories can be told — and never has that been more true than with StageCraft,” explains Lewin. Originally developed with inspiration from Jon Favreau for The Mandalorian, ILM StageCraft is a suite of virtual production tools that encompasses all aspects of production, from design, scouting, and previsualization in the virtual art department to principal photography on ILM’s StageCraft LED volumes. The system proved to be a gamechanger on season one of The Mandalorian and has since been used on feature films, music videos, and commercials.

ILM isn’t resting on its laurels: the team took everything they learned on season one, combined it with 45 years of filmmaking and visual effects experience, and reengineered StageCraft from the ground up for season two of the series. “We identified all of the shortcomings in the system and areas where we needed more flexibility and enhanced functionality, designing StageCraft 2.0 with filmmaking and production needs at its core,” notes Lewin.
Lewin credits many of her role models and mentors, including Lynwen Brennan, General Manager, Lucasfilm, for setting great examples. “I’ve kept an eagle eye on the way Lynwen leads, how inclusive she is and how unflappable. She is always approachable and makes people feel welcome… a real creative problem-solver and I admire that.”

With a player-coach leadership style, Lewin explains, “I like to be part of solving problems, being in the trenches and supporting my teams so they can do their best. I don’t communicate a broad vision and then expect everyone to just figure it out.” She continues, “I’ve grown into someone who tries to be curious as a leader, really engage with stakeholders, and I try to inspire the people who are going to be the ones to make the change.” She makes clear, “I do have a strong point of view, but I want to also be open and allow the best idea to get elevated.”

Lewin, who feels strongly about bringing on a diverse workforce and creating an inclusive environment, is also a part of Lucasfilm and ILM’s Diversity, Inclusion, and Belonging team, helping execute initiatives both internally and externally. “The work we are doing through our employee resource groups is providing more connection points to different people within different communities. It really benefits the work we do – to have a welcoming and inclusive environment for diverse storytellers, production, creatives, and executives – it makes us bring our ‘A’ game because we all feel seen and recognized as individuals.” Lewin adds, “We care for each other and want to make sure everyone is thriving.”

One thing is clear, with Lewin and Bredow at the helm, we’re sure to see great things from the company for many years to come.  

Concept Artists Christian Alzmann, Brett Northcutt, and Stephen Todd were among the artists to receive a 2020 Concept Art Award on Saturday, September 12th. Additionally, Alzmann was honored with the LBX Concept Art Luminary Award, recognizing the impact of his work on The Child for The Mandalorian.
Check out the winning artwork alongside each honoree’s reaction.

2020 LBX CONCEPT ART LUMINARY AWARD: CHARACTER, LIVE-ACTION SERIES 
The Child by Christian Alzmann 
Star Wars: The Mandalorian

“It’s very humbling to be recognized in an industry that is producing so much amazing artwork. I did not anticipate that The Child would be loved by so many and I’m so happy that he has been a bright spot for fans over the last year. I look forward to seeing more art and artists honored at The Concept Art Awards in years to come.” – Christian Alzmann

ENVIRONMENT, LIVE ACTION FILM AWARD
Death Star Wreckage Duel by Brett Northcutt 
Star Wars: The Rise of Skywalker

“I am humbled to have won the Live-Action Film Environment award from the Concept Art Association. I have dedicated more than half of my life to imagining environments for movies and to receive an award directly from some of the top concept artists in the industry is truly an honor.” – Brett Northcutt


VR KEYFRAME AWARD
Windfall Crash by Stephen Todd 
Vader Immortal: A Star Wars VR Series

“Whoa! What an honor to even have my work voted on by my peers and judges whom I hold in high regard, let alone to receive the award. Thank you to the Concept Art Association and everyone who voted! Thank you to the ILM team who made this possible. I would not be here without all the help of my peers in the Art Department. Congrats to all the other winners!” – Stephen Todd 

Congratulations to all the 2020 Concept Art Award finalists! 

Industrial Light & Magic today announced the next phase of its global expansion plan for the company’s virtual production and StageCraft LED volume services. This expansion of services is tied to a proactive initiative for increasing diversity in the industry by combining ILM’s growth in this innovative methodology with a global trainee program geared for underrepresented VFX talent.

ILM’s existing StageCraft volume set at Manhattan Beach Studios (MBS) was used for the Emmy nominated series The Mandalorian and will soon be joined by a second permanent StageCraft volume set at the studio, servicing a variety of clients in the greater Los Angeles area. In addition, ILM is building a third permanent StageCraft volume at Pinewood Studios in London, and a fourth large-scale custom volume at Fox Studios Australia to be used for Marvel’s highly anticipated feature Thor: Love and Thunder directed by Taika Waititi. ILM will also continue to provide “pop up” custom volumes for clients as the company recently did for the Netflix production The Midnight Sky, directed by George Clooney.

An end-to-end virtual production solution, ILM StageCraft is a production-hardened technology that provides a continuous pipeline from initial exploration, scouting, and art direction, traditional and technical previsualization, lighting, and of course, real-time production filming itself, with the innovative StageCraft LED volumes. Lucasfilm’s hit Disney+ series, The Mandalorian, and a highly anticipated feature film took advantage of the full complement of ILM StageCraft virtual production services. Other projects such as Avengers: Endgame, Aquaman, Jurassic World: Fallen Kingdom, Battle at Big Rock, Rogue One: A Star Wars Story, Kong: Skull Island, Solo: A Star Wars Story, Ready Player One, and Rango, have utilized aspects of the toolset as well.

By every measure, the new stages are vast improvements over the original ground-breaking LED volume developed for the first season of The Mandalorian in 2018. Physically, the new stages are larger, utilizing substantially more LED panels than ILM’s original stage and also offering both higher resolution and smooth wall-to-ceiling transitions – this directly results in better lighting on set as well as many more in-camera finals. ILM’s proprietary solutions for achieving groundbreaking fidelity on the LED walls at scale allow for higher color fidelity, higher scene complexity, and greater control and reliability.

“With StageCraft, we have built an end-to-end virtual production service for key creatives. Directors, Production Designers, Cinematographers, Producers, and Visual Effects Supervisors can creatively collaborate, each bringing their collective expertise to the virtual aspects of production just as they do with traditional production,” explained Janet Lewin, SVP, GM ILM. Rob Bredow, CCO, ILM added “Over the past 5 years, we have made substantial investments in both our rendering technology and our virtual production toolset. When combined with Industrial Light & Magic’s expert visual effects talent, motion capture experience, facial capture via Medusa, Anyma, and Flux, and the innovative production technology developed by ILM’s newly integrated Technoprops team, we believe we have a unique offering for the industry.”

Alongside the new stages, ILM is rolling out a global talent development initiative through the company’s long-standing Jedi Academy training program. The program, which is part of the company’s larger Global Diversity & Inclusion efforts, offers paid internships and apprenticeships on productions with seasoned ILM Supervisors and Producers who serve as mentors. The program is intended to fill roles across the virtual production and VFX pipeline with those from traditionally underrepresented backgrounds; ILM has posted expressions of interest for jobs across the spectrum, from virtual art department teams and production management to engineering and artist roles. The goal with this initiative is to attract diverse junior talent and create a pipeline for them to become future Visual Effects artists, technicians, and producers who will be “ILM trained” and uniquely qualified to work in this new, innovative way of filmmaking.

“There is a widespread lack of diversity in the industry, and we are excited to leverage our global expansion in this game-changing workflow to hire and train new talent, providing viable, exciting, and rewarding jobs across many of our locations,” noted ILM VP, Operations, Jessica Teach, who oversees the company’s Diversity and Inclusion initiatives. “We believe this program can have a multiplier effect, attracting even more diverse talent to the industry and creating a pipeline for visual effects careers. We know that bringing more diversity into the industry is a critical part of strengthening and expanding our storytelling potential.”
ILM expects to have the new stages up and running for production in London in February of 2021 and in Los Angeles in March, with a mix of projects from features to commercials in line to take advantage of them. The company is currently fielding inquiries for future bookings by studios and filmmakers. For more information or to express interest in the Jedi Academy program visit our careers site.

We are thrilled to report that Rob Bredow, Executive Creative Director & Head of Industrial Light & Magic, is among the six newly elected members to join the Academy of Motion Picture Arts and Sciences’ 2020-2021 Board of Governors. Additionally, Jessica Teach, San Francisco Executive in Charge, and 12 individuals from our artist and engineering ranks have been invited to join The Academy.

“I’m supportive of many of the positive changes that have been made within the Academy recently, and I want to focus on continuing this positive momentum,” said Bredow, adding, “I believe the VFX Branch will be stronger if we are more inclusive. A number of brilliant and accomplished visual effects experts are joining the Academy, including a few from ILM who I am excited to work alongside in this capacity. There is still much to be done. I’m excited to be joining the board of governors in this time of change.”

Bredow, who helped start the Academy Software Foundation and serves as chair, is also passionate about sharing knowledge: “I think as leaders in our field, we have the honor and responsibility to share what we’ve learned for the next generation of filmmakers.”

The 2020 Academy Invitees from ILM:
MEMBERS-AT-LARGE
Jessica Teach

VISUAL EFFECTS BRANCH
Jon Alexander – Avengers: Age of Ultron, Noah
Tami Carter – Star Wars: The Rise of Skywalker, Lucy
Karin Cooper – Star Wars: The Rise of Skywalker, Kong: Skull Island
Ryan Church – Transformers: The Last Knight, Avengers: Age of Ultron
Leandro Estebecorena – The Irishman, Kong: Skull Island
Stephane Grabli – The Irishman, Jurassic World: Fallen Kingdom
Douglas Moore – 12 Strong, Ant-Man
Nick Rasmussen – Ready Player One, Star Wars: The Last Jedi
David Seager – Aladdin, Terminator: Dark Fate
Amy Shepard – Playing with Fire, Doctor Strange
James Tooley – Star Wars: The Rise of Skywalker, Teenage Mutant Ninja Turtles
Paige Warner – Terminator: Dark Fate, Pirates of the Caribbean: Dead Men Tell No Tales

For a full list of the 2020 Academy Invitees, click here and for a full list of the current 2019-2020 Academy governors, click here.

Starting May the 4th, viewers are treated to a closer look at the groundbreaking technology at work behind the scenes of The Mandalorian and more as Disney+ pulls back the curtain on the live-action Star Wars series with Disney Gallery: The Mandalorian. The eight-episode docuseries, hosted by creator and executive producer Jon Favreau, promises insightful commentary from actors including Pedro Pascal and Gina Carano, anecdotes from the directors who helmed episodes in the first season and ILM’s visual effects team, and an exploration into what it means to be a part of telling stories in the Star Wars galaxy and honoring George Lucas’s legacy.

Check out the official trailer for Disney Gallery.

Industrial Light & Magic (ILM), and Epic Games (maker of the Unreal Engine), together with production technology partners Fuse, Lux Machina, Profile Studios, NVIDIA, and ARRI unveiled a new filmmaking paradigm in collaboration with Jon Favreau’s Golem Creations to bring The Mandalorian to life. The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using real-time game engine technology and LED screens to represent dynamic photo-real digital landscapes and sets with creative flexibility previously unimaginable.

Over 50 percent of The Mandalorian Season One was filmed using this ground-breaking new methodology, eliminating the need for location shoots entirely. Instead, actors in The Mandalorian performed in an immersive and massive 20’ high by 270-degree semicircular LED video wall and ceiling with a 75’-diameter performance space, where the practical set pieces were combined with digital extensions on the screens. Digital 3D environments created by ILM played back interactively on the LED walls, edited in real-time during the shoot, which allowed for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by NVIDIA GPUs. The environments were lit and rendered from the perspective of the camera to provide parallax in real-time, as if the camera were really capturing the physical environment with accurate interactive light on the actors and practical sets. This gave showrunner Jon Favreau, executive producer and director Dave Filoni, visual effects supervisor Richard Bluff, cinematographers Greig Fraser and Barry “Baz” Idoine, and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and achieve real-time in-camera composites on set.
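The perspective-correct rendering described above rests on a simple geometric idea: each virtual set element must be drawn on the LED wall where the line from the tracked camera through that element intersects the wall plane, so the displayed image shifts with camera motion just as a real background would. The following is a hypothetical, heavily simplified sketch of that line-plane intersection in plain Python; it is an illustration of the concept only, not ILM's actual StageCraft code.

```python
def project_to_wall(point, camera, wall_origin, wall_normal):
    """Where on the LED wall plane a virtual 3D point should be drawn so it
    lines up correctly from the tracked camera's position (line-plane
    intersection). All arguments are 3D tuples, e.g. in meters."""
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    direction = sub(point, camera)  # ray from camera toward the virtual point
    t = dot(sub(wall_origin, camera), wall_normal) / dot(direction, wall_normal)
    return tuple(c + t * d for c, d in zip(camera, direction))

# A distant virtual mountain, seen from two camera positions: the spot where
# it must be drawn on the wall (the z = -3 plane) shifts as the camera moves,
# which is exactly the parallax effect described above.
wall_o, wall_n = (0.0, 0.0, -3.0), (0.0, 0.0, 1.0)
mountain = (0.0, 30.0, -200.0)
print(project_to_wall(mountain, (0.0, 1.5, 5.0), wall_o, wall_n))
print(project_to_wall(mountain, (2.0, 1.5, 5.0), wall_o, wall_n))
```

In a real volume this computation happens per pixel inside the renderer's off-axis projection, driven by the live camera-tracking data; the sketch only shows why the wall image must be re-rendered whenever the camera moves.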

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of partners such as Golem Creations, Fuse, Lux Machina, Profile Studios, and ARRI together with ILM’s StageCraft™ virtual production filmmaking platform and ultimately the real-time interactivity of the Unreal Engine platform.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of real-time, in-camera rendering,” explained Jon Favreau adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working towards using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” said Rob Bredow, Executive Creative Director and Head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real-time on stage providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Richard Bluff, Visual Effects Supervisor for The Mandalorian added, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”

Today, ILM Senior Compositor, Charmaine Chan explains her visual inspiration for the Kylo and Luke confrontation after the Battle of Crait in Star Wars: The Last Jedi. 
Kylo Ren
These days, the majority of blockbuster films are shot in front of a giant green/blue screen. Taking an actor in front of a screen and grounding them in a scene to help promote their development and storyline is always a tough challenge. On Star Wars: The Last Jedi, I was tasked with such a scenario, specifically the scene where Kylo and Luke finally confront one another after the Battle of Crait. It’s a tense moment, and I was assigned to handle the Kylo Ren shots under the direction of VFX Supervisor Eddie Pasquarello and my sequence lead Peter Demarest.
The scene we set out to create takes place right after a giant battle: the land all torn up, ash and salt flying everywhere. We needed to create a sense of intensity, passion, and rage from the duality built up between Kylo and Luke.
The first time I saw the plates, they reminded me a lot of the sparring sequence from Kurosawa’s Seven Samurai. It happens to be the 65th anniversary of Seven Samurai this year, so it’s only appropriate to give a nod to the film and cinematography that has influenced so many other films, including this one!

Toho Productions


The main thing I noticed within Seven Samurai is the simplicity and stillness of each shot and angle. Yet the way the subject is framed leaves us in anticipation of their next move. We wanted to do the same within our shots, where the backgrounds were not distracting us from focusing in on the two main actors.
We had the AT-M6’s in the back at a complete standstill waiting for orders, and a dramatic landscape that was subtly changing as time passed during their exchange. We represented this change via two major factors: the sun setting and the buildup of salt over the red kyber crystal floor.
Kylo Walking
When Kylo first approaches Luke, we set the backdrop very warm with saturated orange and reds and a strong contrast. We went through many iterations of both the sky color and FX animation of smoke/dust wisps over the floor. We then slowly transitioned the sky color to be less saturated, and used the animated smoke to create even more salt on the ground. By the time Kylo is ready to fight Luke, we’ve created a very different aesthetic that’s both cool toned and diffused.
The Last Jedi
What transpired next is probably one of my favorite saber duels, and being able to help set the mood right before that fight was a great refresher on visual storytelling.

OpenEXR, a widely-adopted HDR image file format, and OpenCue, a recently launched render manager, join the growing roster of Academy Software Foundation projects.
We’re thrilled to announce that the Academy Software Foundation (ASWF), a neutral forum for open source software development in the motion picture and media industries, today announced that OpenEXR and OpenCue have been accepted by the Technical Advisory Committee (TAC) as Academy Software Foundation projects alongside OpenVDB and OpenColorIO.
Initially developed by ILM, OpenEXR is an Academy Scientific and Technical Award winning high dynamic-range (HDR) image file format for use in computer imaging applications. It is a widely-adopted standard in computer graphics for linear and interactive media.
OpenCue is a fully featured, open source render manager for media and entertainment that can be used to break down complex jobs into individual tasks. Developed in collaboration by Google Cloud and Sony Pictures Imageworks, OpenCue is an evolution of Sony’s internal queuing system, Cue 3.
“This announcement marks a new phase for the Academy Software Foundation. We’ve achieved our initial goal of accepting OpenVDB, OpenColorIO, and OpenEXR – projects which greatly influenced the Foundation’s formation – and we are now ready to support and drive collaboration around newer projects like OpenCue,” said David Morin, Executive Director of the Academy Software Foundation. “Studios and developers are finding value in having a neutral home for the open source projects that our industry relies on, and we look forward to growing our projects and continuing to find new ways to support the broader open source community.”
L3 and Lando
OpenEXR and OpenCue join OpenVDB and OpenColorIO as projects in the incubation stage at the Academy Software Foundation. All newly accepted projects start in incubation while they work to meet the high standards of the Academy Software Foundation and later graduate to full adoption. This allows the Academy Software Foundation to consider and support projects at different levels of maturity and industry adoption, as long as they align with the Foundation’s mission to increase the quality and quantity of contributions to the content creation industry’s open source software base.
Cary Phillips, Lucasfilm Research & Development Supervisor and Academy Science and Technology Council member noted, “The Academy Software Foundation was created with OpenEXR in mind, recognizing that there’s a natural life cycle to software projects: original architects and developers move between companies, expertise spreads throughout the industry, and the entire VFX technology ecosystem rapidly evolves. The ASWF has brought together virtually every major company in the industry, and it provides a vital forum to discuss sensible, practical solutions that should ensure that OpenEXR continues to serve the industry as a stable and reliable standard.”
OpenEXR
One of the foundational technologies in computer imaging, OpenEXR is a standard HDR image file format for high-quality image processing and storage. It features higher dynamic range and color precision than existing 8- and 10-bit image file formats, and the latest version of OpenEXR supports multiple image compression algorithms, stereoscopic workflows, multi-part files and deep data.
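The practical gap between a traditional 8-bit channel and the higher dynamic range described above can be shown in a few lines of plain Python. This sketch uses only the stdlib's struct module (format code 'e' is IEEE 754 half precision, the same 16-bit float OpenEXR's HALF type is based on); it illustrates the precision difference, and is not the OpenEXR library itself.

```python
import struct

def roundtrip_half(value: float) -> float:
    """Round-trip a float through 16-bit half precision (a HALF-like channel)."""
    return struct.unpack('<e', struct.pack('<e', value))[0]

def roundtrip_8bit(value: float) -> float:
    """Round-trip through a clamped 8-bit channel, as in traditional LDR formats."""
    return max(0, min(255, round(value * 255))) / 255.0

# A bright highlight a couple of stops above diffuse white (1.0):
hdr = 4.75
print(roundtrip_8bit(hdr))  # clipped to 1.0 -- the highlight detail is gone
print(roundtrip_half(hdr))  # 4.75 survives intact in half precision
```

Because values above 1.0 survive the round trip, downstream operations like exposure changes or defocus can recover highlight detail that an 8-bit pipeline would have discarded at the first write.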
“For us, the single most important thing we create are the images that we put on screen, and we’ve all come to trust the OpenEXR format with our most precious data. ILM’s decision over 15 years ago to make EXR available as an open source project for the filmmaking community arguably set in motion an industry-wide trend that fostered collaboration and shared advancement, eventually culminating in the creation of the Academy Software Foundation. We’re proud to contribute OpenEXR to a new home to ensure it remains a robust and stable project for years to come,” said Francois Chardavoine, Head of Production Technology, Industrial Light & Magic.
Wakanda
OpenEXR was developed in 1999 by ILM in response to the demand for higher color fidelity in the visual effects industry. It was released to the public as an open source library in 2003, and it has since been widely-used and maintained through code contributions from companies including Weta Digital, Walt Disney Animation Studios, Sony Pictures Imageworks, Pixar Animation Studios, Autodesk, and DreamWorks, among others. OpenEXR was honored with an Academy Scientific and Technical Award in 2007.
OpenEXR is ILM’s main image file format and has been used in every motion picture ILM has contributed visual effects work to since 2000. The first movies to employ OpenEXR were Harry Potter and the Sorcerer’s Stone, Men in Black II, Gangs of New York, and Signs. Recent films include Solo: A Star Wars Story, Avengers: Infinity War, Black Panther, and Star Wars: The Last Jedi.
Developers interested in learning more or contributing to OpenEXR can visit the OpenEXR Github page.

Today our guest writer is Todd Vaziri, Lead Artist at ILM who chronicles how the Blockade Runner engine shot from Rogue One: A Star Wars Story went from idea to reality:

I was thrilled to get to work on this shot with my friend and frequent collaborator, ILM lighter Tom Martinek. (Leia’s Blockade Runner escapes, tying Rogue One directly to the start of Star Wars (1977)? Yes, please!) We loved bringing this moment to life. It was a thrill to be able to help create the updated look of a classic ship we haven’t seen on screen since 1977. Also, it’s fun to realize that pretty much no one agrees how to pronounce “Tantive IV.”

Our first task was to study those first fleeting glimpses of the Tantive IV from the original Star Wars. Replicating the look of the engines *precisely* from the first film would not work for our movie. This was a recurring theme for the design challenges we took on for Rogue One.

Smoke Blockade Engine - Rogue One

Blockade Engine Distance - Rogue One

I created the Blockade Runner ‘engine look’ to appear the way you *think* you remember it from Star Wars, not the way it actually appeared — honoring the spirit of the original look and updating it to fit modern sensibilities and the stylistic signature of our new film.

First, I matched the hue of the engine glow from the original film. From there, I wanted to add an organic “jet engine” texture to the inside of each engine, so I rotoscoped and stabilized some footage from a Bell 209 helicopter engine, which had a lot of built-in dynamic energy.

I placed the texture inside the engine geometry of each of the eleven engines so we could get peeks at it when looking down the tunnel, and offset and rotated the helicopter engine footage for each engine (so each engine would have a unique energy signature).

Tom developed a flickery cucoloris effect to create the interactive light from the engines cast onto the inside of the chamber; I split that into 11 passes to animate them separately. Then I had to come up with a way for the engines to ignite as if from a cold start.

Blockade Engine - Rogue One

I knew we never saw a Blockade Runner power up in any of the movies, but I asked Pablo Hidalgo (Lucasfilm) and others to see if there was any precedent set in any of the animated series. Apparently, there was none! So, I thought it would look cool if the four corner engines fired up first for stability. Then the other seven engines followed up behind. I didn’t want the shot to become a big lens flare show, so I only had a few crisp flares peek through (taking my cues from the original trilogy X-wing engine flares).

This engine look became a quick-start setup for the other Blockade Runners you see in the film. Finally for this shot, I added a hopefully-subtle camera rumble as the engines ignited.

Radar Dish Rotation Examples.

We had a lot of fun talking about the rotating dish atop the Tantive IV. Look carefully at it in the original Star Wars (1977): in shot 1, it’s not visible. In shot 2, it’s rotating counterclockwise. In shot 3, it’s rotating clockwise! For Rogue One, we animated the dish counterclockwise.

Rogue One (2016), visual effects by Industrial Light & Magic. Visual Effects Supervisor John Knoll. Full ILM credits.

ILM’s Chief Creative Officer, John Knoll, was among those honored by the Academy of Motion Picture Arts and Sciences at this year’s Sci Tech Awards.
Watch David Oyelowo present Scientific and Engineering Awards to Thomas Knoll and John Knoll for the original architecture, design and development, and to Mark Hamburg for his continued development and engineering of Adobe Photoshop:

Congratulations to John, Thomas, Mark and all of those honored this year.

Early this morning the Academy announced the nominees for the 91st Oscars and we’re so excited that ILM artists worked on three of the five films in the Achievement in Visual Effects Category.
AVENGERS: INFINITY WAR
Dan DeLeeuw
Kelly Port
Russell Earl
Dan Sudick
READY PLAYER ONE
Roger Guyett
Grady Cofer
Matthew E. Butler
David Shirk
SOLO: A STAR WARS STORY
Rob Bredow
Patrick Tubach
Neal Scanlan
Dominic Tuohy
Other nominated films:
CHRISTOPHER ROBIN
Christopher Lawrence
Michael Eames
Theo Jones
Chris Corbould
FIRST MAN
Paul Lambert
Ian Hunter
Tristan Myles
J.D. Schwalm
And here are some early reactions from a few of our newly-minted nominees:
Solo: A Star Wars Story
Rob Bredow, Overall VFX Supervisor – “We pretty much freaked out! All of us were gathered around the TV. It was pretty exciting. It’s such a great list of nominees, so it was amazing to see us on the list.”
Patrick Tubach, VFX Supervisor – “I’m incredibly excited to be nominated for the VFX work on Solo. Like many, I’ve been waiting to hear this story ever since some hot shot pilot bragged about it over a table in a seedy backwater cantina, but there’s an extra special thrill in being one of the lucky group of artists who finally got to tell it. Seriously, never tell me those odds!”
Ready Player One 
Roger Guyett, VFX Supervisor – “I’m really thrilled about the nomination – we worked so hard and had an incredible team working on the project. I’m sharing this with our huge crew who put so much love into this project. It’s an honour and great recognition from our peers for the work we did on RPO.”
Grady Cofer, VFX Supervisor – “I’m delighted to hear that Ready Player One has been nominated for Best Visual Effects. It’s a testament to all the hard work that went into it. From day one this has been a dream project. Steven’s passion for the story, fueled by his endless creative energy, made it all possible. I am honored to join my fellow nominees and represent this film at the Oscars.”
Avengers: Infinity War  
Russell Earl, VFX Supervisor – “I’m just really happy to be nominated and feel lucky to be able to represent the whole team that worked on the film. It’s been a really exciting morning.”
The Oscars are February 24th. Congratulations to all the nominees!

In today’s employee spotlight we’re highlighting Color & Imaging Scientist Carol Payne from our San Francisco studio.

The Visual Effects Society released the nominations for the 17th Annual VES Awards this morning, and we’re thrilled to have been nominated for seventeen awards for our work in 2018. We’d like to congratulate all the nominees and thank our teams for their hard work.

Our nominations include:

Outstanding Visual Effects in a Photoreal Feature
Ready Player One
Roger Guyett
Jennifer Meislohn
David Shirk
Matthew Butler
Neil Corbould

Solo: A Star Wars Story
Rob Bredow
Erin Dusseault
Matt Shumway
Patrick Tubach
Dominic Tuohy

Outstanding Supporting Visual Effects in a Photoreal Feature
12 Strong
Roger Nall
Robert Weaver
Mike Meinardus

Bird Box
Marcus Taormina
David Robinson
Mark Bakowski
Sophie Dawes
Mike Meinardus

Outstanding Animated Character in a Photoreal Feature
Jurassic World: Fallen Kingdom; Indoraptor
Jance Rubinchik
Ted Lister
Yannick Gillain
Keith Ribbons

Ready Player One; Art3mis
David Shirk
Brian Cantwell
Jung-Seung Hong
Kim Ooi

Outstanding Created Environment in a Photoreal Feature
Ant-Man and the Wasp; Journey to the Quantum Realm
Florian Witzel
Harsh Mistri
Yuri Serizawa
Can Yuksel

Aquaman; Atlantis
Quentin Marmier
Aaron Barr
Jeffrey De Guzman
Ziad Shureih

Ready Player One; The Shining, Overlook Hotel
Mert Yamak
Stanley Wong
Joana Garrido
Daniel Gagiu

Solo: A Star Wars Story; Vandor Planet
Julian Foddy
Christoph Ammann
Clement Gerard
Pontus Albrecht

Outstanding Virtual Cinematography in a Photoreal Project
Aquaman; Third Act Battle
Claus Pedersen
Mohammad Rastkar
Cedric Lo
Ryan McCoy

Jurassic World: Fallen Kingdom; Gyrosphere Escape
Pawl Fulker
Matt Perrin
Oscar Faura
David Vickery

Ready Player One; New York Race
Daniele Bigi
Edmund Kolloen
Mathieu Vig
Jean-Baptiste Noyau

Outstanding Model in a Photoreal or Animated Project
Ready Player One; DeLorean DMC-12
Giuseppe Bufalo
Kim Lindqvist
Mauro Giacomazzo
William Gallyot

Solo: A Star Wars Story; Millennium Falcon
Masa Narita
Steve Walton
David Meny
James Clyne

Outstanding Effects Simulations in a Photoreal Feature
Avengers: Infinity War; Wakanda
Florian Witzel
Adam Lee
Miguel Perez Senent
Francisco Rodriguez

Outstanding Compositing in a Photoreal Feature
Jurassic World: Fallen Kingdom
John Galloway
Enrik Pavdeja
David Nolan
Juan Espigares Enriquez

Nominees in 24 categories were selected by VES members via events hosted by 11 of our Sections, including Australia, the Bay Area, Germany, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington. The VES Awards will be held on February 5th at the Beverly Hilton.

This morning BAFTA announced the nominees for their annual awards ceremony on February 10th and we’re thrilled that ILM worked on three of the five films that made the list!

Congratulations to our artists Russell Earl (VFX Supervisor, Avengers: Infinity War), Craig Hammack (VFX Supervisor, Black Panther), Roger Guyett (Visual Effects Supervisor, Ready Player One), Grady Cofer (VFX Supervisor, Ready Player One), and David Shirk (Animation Supervisor, Ready Player One).

Films ILM contributed to:

AVENGERS: INFINITY WAR
Dan DeLeeuw, Russell Earl, Kelly Port, Dan Sudick

BLACK PANTHER
Geoffrey Baumann, Jesse James Chisholm, Craig Hammack, Dan Sudick

READY PLAYER ONE
Matthew E. Butler, Grady Cofer, Roger Guyett, David Shirk

Also nominated:

FANTASTIC BEASTS: THE CRIMES OF GRINDELWALD
Tim Burke, Andy Kind, Christian Manz, David Watkins

FIRST MAN
Ian Hunter, Paul Lambert, Tristan Myles, J.D. Schwalm

We’d like to congratulate all the nominees on this honor and thanks to the British Academy of Film and Television Arts.

Ten films remain in the running in the Visual Effects category for the 91st Academy Awards, and we’re thrilled to have contributed to six of them.
All members of the Visual Effects Branch will be invited to view 10-minute excerpts from each of the shortlisted films on Saturday, January 5, 2019. Following the screenings, members will vote to nominate five films for final Oscar consideration.
ILM contributed to:
Ant-Man and the Wasp – ILM VFX Supervisor, Russell Earl
Avengers: Infinity War – ILM VFX Supervisor, Russell Earl
Black Panther – ILM VFX Supervisor, Craig Hammack
Jurassic World: Fallen Kingdom – Overall VFX Supervisor, David Vickery
Ready Player One – Overall VFX Supervisor, Roger Guyett
Solo: A Star Wars Story – Overall VFX Supervisor, Rob Bredow
Other Semi-finalists:
Christopher Robin
First Man
Mary Poppins Returns
Welcome to Marwen
Congratulations to all the semi-finalists!

This article originally appeared in The Hollywood Reporter

Nine technical achievements, represented by 27 individual award recipients including ILM’s John Knoll, will be honored at the Academy of Motion Picture Arts and Sciences’ Scientific and Technical Awards Presentation on Feb. 9 at the Beverly Wilshire in Beverly Hills.

Also during the evening, cinematographer Curtis Clark (The Draughtsman’s Contract), who chairs the American Society of Cinematographers’ Motion Imaging Technology Council, will receive the John A. Bonner Medal for service.

Scientific and Engineering Awards (Academy Plaques) will be presented to David Simons, Daniel Wilk, James Acquavella, Michael Natkin and David Cotter for Adobe After Effects; Oscar-winning VFX supervisor John Knoll, Thomas Knoll and Mark Hamburg for Adobe Photoshop; and Pixar’s Ed Catmull, Tony DeRose and Jos Stam for their subdivision surfaces science.

The recipients of the Technical Achievement Awards (Academy Certificates) are Eric Dachs, Erik Bielefeldt, Craig Wood and Paul McReynolds for the PIX System for distributing media; Per-Anders Edwards for the MoGraph toolset in motion graphics software Cinema 4D; and Paul Miller and Marco Paolini for their work on the Silhouette rotoscope and paint system.

Technical achievement awards also will be presented to Paul Debevec, Tim Hawkins and Wan-Chun Ma for the Polarized Spherical Gradient Illumination facial appearance capture method; Xueming Yu for the related Light Stage X capture system; and Charles Loop for his subdivision surfaces research.

Thabo Beeler, Derek Bradley, Bernd Bickel and Markus Gross are also being honored for the conception, design and engineering of the Medusa Performance Capture System. Medusa captures exceptionally dense animated meshes without markers or makeup, pushing the boundaries of visual fidelity and productivity for character facial performances in motion pictures.

“Each year, the Academy forms a diverse committee made up of nearly 60 experts on the technology of filmmaking tasked with examining the tools that artists use to create films,” Doug Roble, chair of the Scientific and Technical Awards Committee, said Wednesday in a statement. “This year, the committee is recognizing nine technologies from around the world. These extraordinary contributions to the science of filmmaking have elevated our art form to incredible new heights.”

Industrial Light & Magic is expanding its offering of best-in-class visual effects and animation services to include the streaming and episodic television market with a new division: ILM TV. The division will be based out of ILM’s new 47,000-square-foot London studio and supported by the company’s global locations in San Francisco, Vancouver, and Singapore.

The ILM TV team will be led by Visual Effects Supervisors Hayden Jones and Jonathan Privett alongside Executive Producers Louise Hussey and Stefan Drury. Previously, the team set up and oversaw DNEG’s television division, winning a BAFTA for Special, Visual and Graphic Effects for their work on BLACK MIRROR.

ILM TV’s first projects will be Lucasfilm’s eagerly anticipated live-action series set in the Star Wars universe, THE MANDALORIAN, being developed by Jon Favreau, and the Superman prequel series KRYPTON, now in its second season, based on DC characters, from Warner Horizon Scripted Television for SYFY.

“It’s not often you get to create a new division at Industrial Light & Magic,” explained Rob Bredow, Executive Creative Director and Head of ILM. “We are seeing a real convergence in our creative approach used on films and in our immersive entertainment division ILMxLAB, and now we’re proud to be able to offer these ILM innovations in a way that’s suitable for streaming and television work to creatives around the world.”

ILM’s legacy in television dates back to the studio’s revolutionary and Emmy award-winning work for THE YOUNG INDIANA JONES CHRONICLES, which brought feature film quality effects to an episodic series for the first time. “We are extremely excited to be re-igniting ILM’s involvement in this market and to showcase the team’s expertise, unrivaled technology, and production management globally,” adds VFX Supervisor Hayden Jones. “The television and streaming segments have grown exponentially in recent years and we are seeing substantial demand for high calibre visual effects that can be delivered on schedule and within budget, all of which lie at the core of our team’s expertise and proven track record.”

ILM TV will offer producers and showrunners access to Industrial Light & Magic’s legendary VFX talent, infrastructure and technology combined with a fresh approach to visual effects, designed to suit the condensed production schedules and rapid turnaround times that episodic series and online streaming programs demand.

Join SVP, Executive Creative Director and Head of ILM, Rob Bredow for his keynote address from this year’s SIGGRAPH Conference in Vancouver. Bredow shares his unique understanding of how media and technological innovation can join forces to tell great stories and create groundbreaking experiences.

Watch:

TELL US ABOUT YOUR ROLE AT ILM, AND HOW LONG YOU’VE BEEN IN THE FILM INDUSTRY.
As CG Technology Supervisor at ILM, I work with the CG artist leads across our global studios, with internal engineering groups, and with outside partners to define and drive short- and long-term technology strategy for ILM. I have 14 years of experience in the film industry as a vfx professional.

WHAT IS YOUR BACKGROUND? WHAT WAS YOUR MAIN COURSE OF STUDY IN SCHOOL?
I majored in Philosophy and Comparative Literature at Brown University, then went on to be a senior management consultant at a business consulting firm for several years before switching careers into the vfx field. My technical and CG skills are mostly self-taught, with a combination of “as-needed” courses taken at NYU and UC Berkeley and, of course, “on the job” learning once I got my first job in vfx at Rhythm & Hues, working as a character rigger.

WHAT INSPIRED YOU TO GO INTO VISUAL EFFECTS?
I had always had a love of film and art and technology. An introductory 3D computer graphics class I took at NYU really made me realize how much I enjoyed CG work – both for the creative and technical aspects.

WHAT WAS THE MOST CHALLENGING POINT IN YOUR CAREER AND HOW DID YOU RISE ABOVE IT AND PERSEVERE?
The biggest challenge for me was switching careers from business consulting to vfx. I had a liberal arts degree that had very little to do with computer science or computer graphics, and all of my professional experience to date had been in an unrelated field. I went about learning as much as I could, as broadly as I could, about the vfx field, and most importantly, I formulated my own “learning path”: I managed to cherry-pick a few classes that I felt would expedite my learning, and filled in the rest with learning on my own time. There are so many publicly available resources for learning both programming and computer graphics that it is really possible to teach yourself and get from point A to B quickly without enrolling in a lengthy, costly program or having some official certificate.

DID YOU HAVE SPECIFIC MENTORS OR ROLE MODELS THAT HELPED PUSH YOU FORWARD?
My mentors on the artist and technical side have all been men, but they have all been exceptional in giving me advice, development opportunities and encouragement. My most powerful female role model is my mother, who had a very successful professional career as a designer and is the artist I most admire. Art is an essential part of her being and way of living, but she has also always pursued her passion with incomparable moral integrity. Just as valuable to me is the role my father played in my upbringing. He also worked as a designer, but was very involved as a parent. I have many fond memories of him being involved at my school and taking me to and from my Japanese Saturday school, and teaching me things and being creative together. I think for opportunities for women in society to really change, girls need to see strong, successful women thriving in their chosen professions, but they also need to see that if they choose to raise children with a partner, that partner can be supportive and complementary of their needs. I think that paradigm shift can be liberating for many men as well, whose societal roles and expectations may in some cases constrain them from having richer relationships with their children.

WHAT’S YOUR FAVORITE MOTIVATIONAL MANTRA?
I do not have a particular mantra, but my children are all the reminder I need to give my best professionally and personally so that I can be the parent they deserve, and do my part to leave the world a better place for their generation.

HOW DO YOU THINK THE FILM INDUSTRY CAN BETTER ENCOURAGE GIRLS AND WOMEN OF ALL AGES TO GET INVOLVED IN FILMMAKING?
It needs to start very early – my own children even at 3 or 4 years old have noticed and asked why certain professions or activities are “all boys” – whether it’s something they noticed in a book or observing the real world around them. Whether they ask about it explicitly or not, those models are being reinforced (and therefore more likely to be replicated) from a very early age. I think providing opportunities to girls in school with filmmaking projects, classes, camps etc. is essential to getting more girls interested in a field that otherwise appears very homogenous and prohibitive. And I think telling the positive stories of female filmmakers and other women in the industry is also critical, to show girls and young women that it is possible to succeed and enjoy a career in this field.

WHAT ADVICE WOULD YOU GIVE TO WOMEN CONSIDERING FILM, AND SPECIFICALLY VISUAL EFFECTS, AS A CAREER CHOICE?
It is a continually evolving field so it is important to be flexible, curious, and enjoy being a constant learner. Follow and pursue the kind of work that you truly feel joy and excitement doing.

WHAT ADVICE WOULD YOU GIVE TO SOMEONE WHO WANTS TO TAKE HER CAREER TO THE NEXT LEVEL?
Always listen inwardly to what it is you want to be doing and how you want to be growing and let that define what “the next level” is for your career at your own pace; what you want may not exist directly in that next box up on the org chart – it may be somewhere else. Or it may be something that is not a box at all, that you end up drawing up on your own! Career success would be being able to develop, hone and expand the ways in which you as an individual can uniquely create value, and feeling fulfilled in doing so. Don’t be afraid to find and even create the opportunities for yourself that meet those needs.

Industrial Light & Magic (ILM), a Lucasfilm Ltd. company, announced today that respected industry veteran Rob Bredow has been appointed SVP, executive creative director & head of ILM. In addition, it was announced that Gretchen Libby has been promoted to vice president, marketing & production. Bredow will be in charge of all four of ILM’s global studios and report to Lucasfilm General Manager Lynwen Brennan, and Libby will report to Bredow.

Bredow joined Industrial Light & Magic as a visual effects supervisor in 2014 and shortly thereafter was named vice president of new media and head of Lucasfilm’s Advanced Development Group. Bredow was instrumental in launching a new division, ILMxLAB, in 2015, combining the talents of Lucasfilm, ILM, and Skywalker Sound to develop, create, and release story-based immersive entertainment. In 2016, Bredow was promoted to CTO of Lucasfilm, overseeing technical operations and partnerships as well as the company’s technology roadmap. Currently, Bredow is serving as the visual effects supervisor and co-producer on Solo: A Star Wars Story directed by Ron Howard, which releases on May 25, 2018.

“I’ve been working very closely with Rob over the past two years on Solo,” says Kathleen Kennedy.  “I have witnessed his leadership skills and creative abilities first-hand and I’ve been extremely impressed. Filmmaking is often about problem solving and Rob comes to every challenge with a strong creative point of view and the ability to find the best solution every time. This and his business acumen make him an ideal candidate to lead ILM, which has always stood at the crossroads of technology and artistry.”

“I am thrilled that Rob is going to be leading ILM into the future. He is the perfect fit for the role, combining creativity, innovation, and business savvy,” says Lynwen Brennan, Lucasfilm general manager. “He is also a wonderful leader who builds great, trusting relationships within the company and with the filmmakers and studios we work with.”

“I’m honored to take on this role for Industrial Light & Magic,” says Bredow. “From my involvement with the launch of ILMxLAB to supervising the visual effects team on the soon-to-release Solo: A Star Wars Story, I can honestly say the people at this company are some of the most passionate, creative, and dedicated people I’ve had the privilege of collaborating with. I’m thrilled to help guide ILM’s legacy of innovation and excellence on a global scale.”

Prior to joining ILM, Bredow was the CTO and visual effects supervisor at Sony Pictures Imageworks. He has worked on films such as Independence Day, Godzilla, Stuart Little, Cast Away, Surf’s Up, Cloudy with a Chance of Meatballs, and many others.

Bredow is a member of the Academy of Motion Picture Arts and Sciences (Visual Effects Branch) and the AMPAS Scientific and Technical Council and, in 2010, was nominated for a Visual Effects Society Award for Outstanding Effects Animation in an Animated Feature Motion Picture.

Gretchen Libby started at ILM in 1997 as a production manager. A year later, she was promoted to associate visual effects producer for The Perfect Storm, and then to visual effects producer on Star Wars: Attack of the Clones two years later. In her previous role, Libby focused on the company’s global expansion, which included opening studios in Singapore, Vancouver, and London, and was the key marketing point of contact for ILM’s clients. Libby’s focus will remain on client marketing, overseeing all global production and strategic relationships. Prior to ILM, Libby worked in visual effects film production at Pacific Data Images in Palo Alto, Calif., and in visual effects commercial production in New York.

“Gretchen is a key member of our executive team and has been instrumental in numerous strategic initiatives in recent years, from marketing the studio to our global expansion,” noted Brennan. “She brings a depth of production understanding that is prized within the company and continues to be invaluable to our clients.”

“In my 21 years at the company I have seen firsthand the tremendous impact ILM has had, and continues to have, on the industry,” says Libby. “I’m excited to help ILM continue to evolve as we take on new challenges and provide new and exciting ways for storytellers to share their visions.”
Libby is a member of the Producers Guild of America and formerly served on the board of directors of the Visual Effects Society of which she remains a member. She is also a member of Women in Film and has served as a producer on 29 feature films, eight of which received Academy Award® nominations for visual effects.

ILM Chief Creative Officer John Knoll took the stage at Apple’s Worldwide Developers Conference to share a cutting-edge VR demonstration created by ILMxLAB and Epic Games utilizing Unreal Engine running on the newly announced iMac. The demo showcased how real-time VR tools such as Unreal Engine, running on powerful hardware, are being used to enhance visual development in filmmaking.

Products such as the new iMac and iOS 11 will no doubt enable ILMxLAB to share incredible experiences with hundreds of millions of people around the world, and that’s really exciting.

Cinefex Magazine issue 152 has made its debut, and the latest and greatest incarnation of Kong graces the cover. The issue features the publication’s trademark in-depth coverage of two films for which ILM was the primary visual effects company: Kong: Skull Island and The Great Wall.


Cinefex continues to be the periodical of record for the visual effects industry and remains an invaluable resource to filmmakers around the world. Kong: Skull Island was covered by Senior Staff Writer, Graham Edwards, while the feature story on The Great Wall was written by Editor in Chief, Jody Duncan.

Open Bionics collaborated with ILMxLAB and the ILM Art Department to create the next generation of bionic hands for young amputees. The ILM team provided design concepts inspired by Marvel, Disney, and Lucasfilm’s Star Wars. Numerous design concepts were created, and from them three were selected to move into production: the Iron Man hand from the Marvel Universe, the Lightsaber hand inspired by Star Wars, and the Snowflake hand inspired by Queen Elsa from Disney’s Frozen.
ILMxLAB Creative Director John Gaeta explained, “ILMxLAB is thinking about the remarkable potential for conceptual robotics design, and what better way to embrace that than to help bring some emotion and imagination to the type of robotics that can change people’s lives. It’s an area where science fiction is converging on everyday fact. We want to be part of that as the lines blur. According to Open Bionics, there are an estimated 2 million hand amputees worldwide. Most have no prosthesis, and due to the high cost associated with traditional robotic hands, very few have access to those either. Open Bionics is changing that, and by focusing on creating low-cost, open source 3D-printed robotic hands, they are starting a revolution that I hope ripples through the industry and makes a real difference to those in need.”
Kids around the world are not just getting medical devices; they’re getting bionic hands inspired by their favorite characters. The Walt Disney Company is generously donating the time of its creative teams and providing royalty-free licenses.