
Members of ILM’s visual effects team discuss their cutting-edge approach to crafting a scene from the Emmy-nominated Skeleton Crew through their real-time rendering pipeline.

By Jay Stobie

Industrial Light & Magic’s visual effects capabilities have been synonymous with innovation since the company’s inception, as ILM creatives regularly transform theoretical techniques into groundbreaking developments that become everyday solutions. One such forward-thinking application is ILM’s use of real-time rendering to present work to visual effects supervisors and client filmmakers in an immersive fashion, allowing artists to immediately incorporate the feedback they receive.

This process was utilized in “Zero Friends Again,” the sixth episode of Star Wars: Skeleton Crew (2024-25), for a sequence depicting Fern (Ryan Kiera Armstrong) and Neel (Robert Timothy Smith) as they ascend a ladder above the snowy plains of the planet Lanupa. A roundtable of ILM team members, including real-time principal creative Landis Fields, environment supervisor Andy Proctor, technical artists Will Muto and Kate Gotfredson, and lead compositor Todd Vaziri, joined ILM.com to chat about trying out a new real-time workflow to craft the depth-defying ladder-climbing sequence in Skeleton Crew.

(Credit: Lucasfilm & ILM).

An Interactive Overview

“In traditional visual effects, artists show their work in dailies as a 2D-rendered movie and get feedback,” says Andy Proctor. “They work on those notes, re-render it, and present it again, usually the next day.” However, real time gave Proctor and his colleagues the chance to incorporate some of their earlier virtual production workflows to achieve an interactive review process. “I was in the virtual art department on Skeleton Crew, and we would do dailies where the key creatives would be in VR headsets while looking at these environments. To build the actual sequence, we had the entire thing set up in real time for the creatives to view on a normal screen. All the plates were loaded, and the blue screen was keyed out in real time. It’s essentially the same as when the visual effects supervisor is looking at a regular 2D review, except the whole scene is live.”
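ILM’s actual keyer runs inside its real-time engine and is proprietary, but the operation Proctor describes (pulling a matte from a blue screen, then layering the extracted plate over a live environment) can be illustrated with a toy sketch. Everything here, including the function name and the distance-based matte, is a hypothetical simplification for illustration only:

```python
import numpy as np

def key_and_composite(plate, background, key_rgb=(0.0, 0.3, 1.0), tolerance=0.35):
    """Toy blue-screen key: derive a matte from each pixel's distance to the
    key color, then composite the foreground over the CG background.
    plate and background are float arrays of shape (H, W, 3) in [0, 1]."""
    # Pixels near the key color get alpha 0 (screen); distant pixels get alpha 1.
    dist = np.linalg.norm(plate - np.asarray(key_rgb), axis=-1)
    alpha = np.clip(dist / tolerance, 0.0, 1.0)[..., None]
    # Standard "over" operation: fg * a + bg * (1 - a).
    return plate * alpha + background * (1.0 - alpha)

# Tiny example: a 1x2 plate with one pure-screen pixel and one red pixel.
plate = np.array([[[0.0, 0.3, 1.0], [1.0, 0.0, 0.0]]])
bg = np.zeros_like(plate)
out = key_and_composite(plate, bg)
```

A production keyer does far more (spill suppression, edge softening, despill in a wide-gamut color space), but the matte-then-over structure is the core of what a real-time review would evaluate.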

The planet Lanupa environments for those Skeleton Crew reviews were drawn from material originally created for ILM StageCraft’s virtual production pipeline, as Todd Vaziri highlights, “There are extended sequences where the actors were filmed for this environment on the ground level that were built by our StageCraft crew. There were already rounds of art direction, design, and construction, and that had to be approved by the filmmakers before first unit filming began with the actors. They would get in-camera finals using all of our StageCraft LED technology and real-time rendering technology. All of that work, especially on the creative side, had been done.”

Roguish Roots

Looking back to the origins of these real-time elements, Proctor points to ILM’s proof-of-concept contributions to the creation of K-2SO in Rogue One: A Star Wars Story (2016), also overseen by Skeleton Crew production visual effects supervisor John Knoll, as “a technical milestone for real-time visual effects,” a discipline that had largely been reserved for games and interactive projects at the time. “You skip ahead to Skeleton Crew, and now you’ve got much more of a crossover, because we’re using real time to design the environments and work out how they’re going to be shot.”

The Mandalorian’s (2019-23) season three finale followed Rogue One as the next benchmark on the path to the real-time process harnessed for the Skeleton Crew cliff climb. “[Skeleton Crew] represented the natural progression of working with [executive producer] Jon Favreau on The Mandalorian, because we were always pushing the boundaries,” Landis Fields notes, as ILM’s use of StageCraft’s LED-based volume prompted them to lean into virtual production techniques across a variety of disciplines. “On the volume, you have a real-time environment, and it’s working for in-camera finals. So there’s already a step towards what you’re doing in real-time being what you’ll see in the final picture,” Proctor chimes in.

In The Mandalorian “Chapter 24: The Return,” ILM chose the astromech droid R5-D4’s descent into the cavern housing Moff Gideon’s (Giancarlo Esposito) secret Mandalorian lair to exercise the most recent real-time advances for the scene’s final pixel shots. “We had done that years ago on some of the K-2SO shots for Rogue One, but real time had changed a lot since then,” Fields adds. “On The Mandalorian, we were able to test integrating real-time visual effects reviews with [visual effects supervisor] Grady Cofer and [animation supervisor] Hal Hickel.” Instead of simply giving notes for changes that would be made at a later date, the supervisors could quickly see the impact of their requests for lighting changes and other alterations.

(Credit: Lucasfilm & ILM).

A New Scope

ILM’s ability to successfully demonstrate the viability of that real-time process was met with an immense wave of support, as Fields credits Jon Favreau, John Knoll, head of ILM Janet Lewin, and Lucasfilm’s Rob Bredow for being strong proponents of continuing on the cutting-edge course. Nevertheless, Fields emphasizes that this approach was intended to be one of many tools on which they could draw, as the choice of which technique to pull from ILM’s ever-growing arsenal of production pipelines would always come down to “the right tool for the job.” While the majority of the StageCraft volume LED in-camera work for Skeleton Crew was done using ILM’s proprietary Helios renderer and engine, this particular sequence was an opportunity to also see where the use of real time could be pushed and leveraged in novel ways.

Perceiving The Mandalorian’s season three finale as a major real-time stepping stone, Proctor recalls that ILM elected to expand the technique’s use to an even greater extent, as he posits, “Now, we’re going to take a sequence and cut it in among live action that was shot in the volume and other traditional visual effects that are rendered offline. It has to match the other sequences and be as visually complex as everything else.” With this real-time production workflow firmly in place when ILM’s work on Skeleton Crew commenced, Proctor points to the ladder-climbing shot from “Zero Friends Again,” saying, “We knew it was an important moment, because it establishes that Neel is scared. Kate Gotfredson was able to set the shot up in real time so we could do a dynamic height or vertigo wedge.”

This arrangement enabled them to consult with John Knoll and ILM visual effects supervisor Eddie Pasquarello in real time, experimenting with a variety of elements, from pushing the background forward and away to tweaking the lighting. Speaking to the capacity to review several shots in a row in a single cut with per-shot interactivity, Vaziri adds, “We had a mini-editorial cut with works in progress. Being able to show an entire sequence to the visual effects supervisors and saying, ‘Yeah, this is how it’s going to look, but we can interactively move things around and instantly see in the context of the cut,’ that’s a game changer right there.”

Praise for the Process

The benefits of utilizing a real-time production pipeline are as diverse as the galaxy far, far away that ILM has built on screen. “Real time is very flexible. We were able to develop custom real-time compositing tools very quickly using blueprints, which allowed us to preview the live-action footage directly on top of our environments. With these tools, we could experiment with framing, set dressing, and lighting with immediate feedback,” Kate Gotfredson outlines.

Will Muto offers his appreciation for the interactive workflow, observing, “You get more bites at the apple. The creatives are able to iterate more and home in on what they want. That’s where the power is here.” Muto applauds ILM’s real-time process for its smoothness, continuing, “There were no huge surprises. We added tooling around our color pipeline to apply our shot grades within the real-time minicut, so we were certain that artists working in real time were viewing plates in the exact same context that the compositors would be viewing downstream. The showrunners [Jon Watts and Christopher Ford], John Knoll, and Eddie Pasquarello all got what they wanted in extremely short turnarounds.”
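Muto doesn’t describe the tooling itself, but the industry-standard way to carry a per-shot grade between real-time and downstream compositing contexts is the ASC Color Decision List (CDL), a simple slope/offset/power-plus-saturation transform. As a hedged sketch of what “applying a shot grade” means mathematically (not ILM’s actual pipeline code):

```python
import numpy as np

def apply_cdl(rgb, slope=(1.0, 1.0, 1.0), offset=(0.0, 0.0, 0.0),
              power=(1.0, 1.0, 1.0), saturation=1.0):
    """Apply an ASC CDL-style grade: out = (in * slope + offset) ** power,
    followed by a saturation adjustment pivoting around Rec. 709 luma."""
    rgb = np.clip(np.asarray(rgb, dtype=float) * slope + offset, 0.0, None) ** power
    # Rec. 709 luma weights; saturation scales the chroma about this value.
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    return luma[..., None] + saturation * (rgb - luma[..., None])
```

Because the same four numbers per shot drive the transform in any tool that implements it, artists reviewing in real time and compositors working offline can be certain they are seeing plates in the same color context.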

Fields echoes the praise for the collaborative efficiency, relaying, “In the traditional pipeline, having a review is not just jumping on a call. There’s time that an artist has to dedicate to preparing material to review.” With real-time sessions, Fields divulges that his team can simply “throw together a meeting, and everyone joins the call. There was no prep other than that I had to be at my desk.”

(Credit: Lucasfilm & ILM).

ILM’s ‘Personnel’ Touch

Proctor articulates an unexpected advantage that tends to emerge from those video calls, where his team can share a screen and jump right into a real-time session. “You get more moments of serendipity. When someone is showing their work, you can see what they’re doing live and interact with it yourself. By doing that, we’d get these ‘happy accidents,’ where someone was playing around with the water shader and hit a parameter that made everyone say, ‘That looks amazing!’ because suddenly the water felt incredibly translucent. Those collective learning moments happen all the time, and it’s very difficult to get that any other way than in real time.”

The human element also factors into another attribute unique to ILM, as the unprecedented level of professional experience concentrated within the company’s ranks allows its personnel to maximize the real-time workflow’s potential. In terms of oversight, ILM’s senior staff have the expertise to recognize how their colleagues’ talents could be leveraged for optimum efficiency. “Where do you want the masters of these crafts to spend their time? Andy and I were very keen on paying close attention to who was doing what,” Fields remarks. “It’s not about everybody doing everything. That’s another part of working within the pipeline and being smart about the division of labor.”

Applying environments created for the volume in the real-time review process gave some ILM artists the chance to work across multiple stages of development, as well. Once Proctor had finished with the set design alongside his colleagues in the virtual art department, content creation supervisor Shannon Thomas oversaw the creation of the final environment used in the StageCraft volume. Digital artist Nate Propp contributed to both the volume build as well as the real-time work in post-production, which Proctor then returned to help oversee.

“Not only was Andy familiar with the worldbuilding exercise that he had done,” notes Fields, “but he was familiar with it from the ground-level perspective.” When the real-time crew began on the ladder shots that looked down on Lanupa’s surface, they could rely on Proctor’s insights from his role in crafting that environment. “Andy knew what the environment was supposed to look like,” Vaziri agrees, indicating that his work as the lead compositor involved the important task of exposure balancing for the foreground, which meant “a lot of rotoscoping, compositing for the extractions, a tiny bit of effects renders for some blowing mist that went through the environment, a lot of stock stuff that the compositors put in. From our perspective, we had to deal with very few renders overall, which I absolutely loved.”

The Legacy Ahead

“Back in the day, when they did K2, that was about the ‘if.’ Skeleton Crew wasn’t about if we could do it. We knew we could do it,” Fields summarizes. “To be able to see that sequence in the cut, scrub, and get valuable notes that are efficient with the time from the visual effects supervisors was huge. The review was essentially a three-dimensional, real-time composite set that we could move around.” Perhaps the greatest testament to the real-time process is the response its use garnered from individuals throughout the company, as Fields shares, “Everybody outside of our phase, the other departments downstream, started to see the value here.” This particular real-time workflow has joined ILM’s illustrious array of visual effects pipelines, becoming yet another evolutionary tool to be called upon when ILM is deciding which approach is best suited for the visual effects shot it is tackling that day.

(Credit: Lucasfilm & ILM).

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.


ILM’s Enrico Damm, Paul Kavanagh, Stephen King, and Matt Middleton share details about ILM’s role in James Gunn’s first theatrical release from the DC Universe.

By Jay Stobie


Written and directed by James Gunn, DC Studios’ Superman (2025) has leapt to the forefront of the cinematic superhero landscape, awing audiences with a message that inspires optimism and hope with a side of introspection. Industrial Light & Magic played a crucial part in the visual effects that brought this uplifting story to life, collaborating with Gunn and production visual effects supervisor Stephane Ceretti on their quest to supply a fresh perspective on the legendary character.

ILM visual effects supervisor Enrico Damm (Rogue One: A Star Wars Story [2016], Ahsoka [2023-present]), ILM animation supervisors Paul Kavanagh (Star Trek [2009], Deadpool & Wolverine [2024]) and Stephen King (The Avengers [2012], The Batman [2022]), and CG supervisor Matt Middleton (Mission: Impossible – Dead Reckoning Part One [2023], Alien: Romulus [2024]) joined ILM.com to chat about ILM’s work on Superman, from taking the lead on characters such as Superman, the Hammer of Boravia, and Ultraman to tackling a large section of the climactic final battle involving a Metropolis baseball field, a deadly interdimensional rift, and fan-favorite dog, Krypto.

Welcome to Metropolis

“I joined Superman early on during pre-production, when [visual effects producer] Susan Pickett and Stef Ceretti approached ILM to talk about how we could build a Metropolis that we could art direct in a real-time fashion,” Enrico Damm explains to ILM.com. “This allowed the production designer and director to creatively iterate until we achieved a layout that would build the foundation for the final, post-production asset. We had real-time sessions with Stephane, [production designer] Beth Mickle, and, occasionally, even James Gunn, where he would be able to direct changes in real time. We established a look that was shared with the previs companies and the other vendors on the show, which permitted us to have everybody start with the same asset.”

Imbuing Metropolis with the living, breathing atmosphere of an actual city without replicating a real-world location was extremely important to Gunn. However, since roughly 70% of the design was inspired by New York, Stephane Ceretti and Susan Pickett authorized ILM to undertake an excursion to gather material. “I spent a couple days in a helicopter over New York City, capturing reference photography,” Damm details. “At the same time, my colleague Dacklin Young was capturing backdrops.”

CG supervisor Matt Middleton would go on to rely on Damm’s city references, explaining, “Our environment team at ILM’s Sydney studio did a large amount of the Metropolis city build. We knew this asset would need to be shared between vendors, and it was strongly based on the New York City references. We built Metropolis in sections, including various hero sections for our work. We had hundreds of unique buildings, which gave Metropolis a great organic feel. That was driven by what Enrico had done before the shoot had started. Many times, we might only build 10 buildings and do variations of them, but on Superman, we utilized an enormous number of distinct buildings to avoid a procedural feel.”


On-Set Observations

Damm’s duties intensified even further once filming got underway. “I was on set for pretty much every ILM-related shot,” he says. “I made sure that ILM would get what was necessary for our work, and I assembled things that could help us, which weren’t part of principal photography. On the side, I gathered motion tests and scans of [David Corenswet’s] cape and suit, essentially grabbing David and the on-set visual effects crew for an hour to film him in the suit and see how the folds move,” Damm elaborates. “During the shoots of ILM sequences, I met with David almost daily and, with client-side visual effects supervisor Stef Ceretti, we developed a specially designed scanning system to gather data for ILM FaceSwap training.”

Since David Corenswet portrays both Superman and Ultraman in the film, this technology was useful for the moments when the two characters face off with each other. “The system helped us capture David’s face as faithfully as possible, so we could train a system to replicate him and FaceSwap in those moments. Almost every day, we held sessions with David to go through various lines to ensure that we captured every nuance of his performance, so we would be able to replicate it digitally on his stunt doubles.”


Suiting Up with Superman

ILM’s crucial involvement continued off the set and into post-production. “The client put their faith in ILM and had us build Superman, the Hammer of Boravia, and Ultraman, three characters who we see frequently,” Damm notes. “We even had the pleasure to have David Corenswet visit ILM’s San Francisco studio. Being a huge Star Wars fan, David had quite a blast. We scanned him using our proprietary MEDUSA scanning system to recreate him on-screen, which included a full digital replica with muscle, bone, and cloth systems. We’re dealing with an ultrabeing flying at ultraspeed, so we did a great deal of cloth simulation on his cape, suit, and hair to portray an appropriate sense of speed.”

ILM’s animation supervisor, Paul Kavanagh, is based at the San Francisco studio, the hub that oversaw the entirety of ILM’s work on Superman. Speaking to the prevalence of ILM’s digital Superman replica, Kavanagh says, “A lot of the time when you see Superman flying along in his suit and cape with his hair fluttering, the only thing that wasn’t CG was his face. But everything we did was based off of the live-action shoot, and we were very faithful to it. We weren’t making up a whole new shot; we were simply enhancing what was filmed.”

Meanwhile, operating out of ILM’s Sydney studio, ILM animation supervisor Stephen King was brought onto the project at the beginning of post-production. “As an animation supervisor, my job is to collaborate with Enrico and Paul to make sure that we’re creating the vision that’s coming from James Gunn and Stef Ceretti,” King remarks. “I helped establish the movement of Superman. The animation department was responsible for enhancing David’s performances by taking away the sense of him being on the rig they had filmed him on. We made certain that his body performance didn’t feel like he was on a rig – that he was actually flying. When I think of Superman, I think of his incredible strength and his ability to fly, so we needed that to appear as real as possible.”

King praises the cooperation between ILM’s various departments, stating, “Our team at ILM’s Sydney studio was in charge of Superman, creating the digital double that would go hand-in-hand and blend seamlessly with David Corenswet’s performance. For shots where we had to do a fully digital version, we wanted to ground it in reality. Our simulation department took care of his cape in every shot that we worked on, making it move and feel authentic. In many of the flying shots, we had to add digital hair because hair is difficult to recreate on stage. It’s either completely flat and doesn’t move, or a fan is placed in front of the actor and affects their performance by causing them to squint. James Gunn entrusted ILM with the title character. Guaranteeing that Superman shone in our work was of the utmost importance for us.”


Boravian Brutality

Damm also homes in on David Corenswet’s hair, noting that a bald cap was utilized to film the battle against the Hammer of Boravia. “James Gunn wanted to approach that scene with visual effects to allow us to portray an appropriate amount of speed within the hair and sell how fast these beings are flying. In the Hammer of Boravia sequence, it’s all digital hair. It was a unique challenge because there’s no room for errors. If there’s something off, it would immediately break the illusion,” Damm asserts.

“In terms of the character itself, the Hammer of Boravia was essentially a hard-surface object,” Damm adds. “Since he’s wearing a suit, he was a bit easier than Superman in the sense that we weren’t dealing with flesh. The main challenge emerged when it came to texturing and shading the character, as there’s a significant amount of creative and technical know-how called for to craft the shading response that a metallic object has.”

Damm hopes that audiences are unable to tell which shots necessitated the Hammer of Boravia becoming a digital character, noting, “There was a full-on practical suit in many shots, where the on-set crew filmed him on wires. Certain action beats and acrobatic movements required he either be partially or completely replaced. Even in a handful of close shots, where you might assume the practical version remained, we had to go with the digital version because the story changed after principal photography had finished.”


The Nature of Narratives

As breathtaking as the visual effects of Superman are, Damm and King both emphasize that ILM’s contributions were all done in service to James Gunn’s compelling story. “There’s a sequence where Superman and Lois Lane [Rachel Brosnahan] are deep in conversation, but you have the Justice Gang fighting a giant jellyfish-type creature in the background. We played on the size of the creature so it would be subtle and not moving fast enough to be distracting,” King professes. “Then, when Superman tells Lois that he loves her, the creature spews out all these different colors, and it’s almost like fireworks that enhance the sense of their love and their connection to each other. It’s visual effects aiding in the storytelling, and that’s a credit to James and Stef knowing what they wanted.”

Similarly, Damm highlights the moment the interdimensional rift arrives at Metropolis and begins to split the city, pronouncing, “We were breaking buildings, and there were so many layers of destruction built on top of each other. However, all of that needed to hit precise story beats, meaning the effects weren’t just taking one building and letting it fall into another building. There’s a specific speed and cadence to it that was art directed by Stef and James. Our effects artists received very precise animations of how everything would collapse from the animation department, which were then used to drive simulations.”


A Kryptonian Canine

ILM handled a major portion of Superman’s final battle, as King describes, “We basically worked on everything from when they land in the baseball stadium until they exit the rift at the end of the fight between Superman and Ultraman.” The beloved dog Krypto is a key component in these sequences, and the director utilized his own dog, Ozu, as a template for the heroic canine. “In essence, James’s dog is Krypto. By that, I mean his dog is also very rambunctious and doesn’t necessarily follow the rules all the time. We had various shots where we animated Krypto, so we built muscle and fur systems to make his hair flow appropriately and match James’s dog,” Damm proclaims, mentioning that Framestore built Krypto’s underlying skeleton.

“The reference footage that James sent over was so fantastic, and [Ozu] was such a character,” agrees Kavanagh, who then turns to the shots themselves. “We received Framestore’s shots of the dog animation well before we started on ours, and they gave us a wonderful target to follow. In animation, we’re constantly paying attention to the little things. For example, when the dog’s foot plants, we’re looking at how deep the foot presses against the ground, the squish of the toe pads, the slight spread of the toes, and the angle of the nails. The same goes for how the dog pants and the way its tongue rolls over its incisors. These are all elements that make the character come alive.”

ILM’s contributions to Krypto were concentrated on the dog’s appearances in the climactic battle, and the team recognized how vital these sequences would be. “James puts a lot of thought and love into his digital characters and their performances,” says King, who jokes that he spent time staring at his own dog as part of his research. “Our Krypto sequences were based more on physicality, like when Krypto knocks over Ultraman and starts destroying all the drones, so we got some nice high-energy panting in there that feels very lifelike. As an animator and animation supervisor, it’s the subtle stuff you bring to the character that can make it more realistic, and that’s what we love to do.”


An Epic Engagement

The Engineer (María Gabriela De Faría) and Ultraman stand as other key factors in ILM’s battle scenes. “I loved our time on the Engineer, because she’s got the nanites that empower her to shapeshift and create nanite obstacles that she fights Superman with,” King relays. “It was important to stay true to what they did on set when Superman fought the Engineer, yet give it extra energy.” Comparing the Engineer to a more complex version of the liquid metal T-1000 from Terminator 2: Judgment Day (1991), Damm states that “the Engineer breaks herself down into nanites, so there are really millions of little individual objects that are coordinating to perform a function. Since we see her punching Superman from far away and also get a closeup of individual nanites on Superman’s eye, we constantly adjusted the size of the nanites.”

Damm was on set when Gunn filmed the characters’ engagement in the baseball stadium, recalling, “People were being pulled on wires and landing on mats for protection. While that provides a solid base, you can’t film at the speed that’s required or crash your star actors into the ground! Visual effects had to be added, especially for ground destruction. We all know what it looks like when you rip grass out of the ground. There are many layers to it, and we needed to represent how it separates in a believable way.”

Matt Middleton points to another facet of the “ground” battle, opining, “There was a lot of work in maintaining the continuity of the dug-up trenches. We had to accurately place specific trenches that the characters had previously skidded through into the background of the shots.”

“We thought a lot about ground interaction and how far to stick the characters into the baseball field to demonstrate the force and energy,” Kavanagh concurs. “And when that impact happens, it kicks up dirt, debris, and grass.” Pivoting to what he calls the “up and down” sequence in which Superman and the Engineer ascend into the atmosphere, Kavanagh says, “We had cloud layers for the characters to go through to get a sense of their speed. We also didn’t want to make it too easy for them to move within the heavy wind resistance. We always want to ground our sequences in reality.”

As a clone of Superman, Ultraman’s appearance is tied to the look of the titular hero in a multitude of ways. “Unlike in previous Superman films, Superman’s suit was a little looser,” Damm outlines. “James Gunn has explained that Superman’s mom made his suit, so it’s not something fashioned from super technology or sent from Krypton. Ultraman’s suit was also fairly loose, so we had to go the extra mile. Since we built it based on digital scans taken with David standing in a scanning studio, there were naturally folds present in the cloth. We rebuilt his suit in a way that allowed us to put effects simulations on it, enabling Ultraman’s suit to move properly in the wind and when he was being punched.”


Rumble in the Rift

The interdimensional rift that slices through the city represents another important feature in the third act’s big fight sequence. “It’s unique and almost a living organism,” Damm remembers. “Rick Hankins, who joined Superman early on as the effects supervisor, took on large chunks of that R&D project. We applied various elements into it, such as how metal melts and the crystalline growth of bismuth. We presented hundreds of versions and eventually found a look that was approved.” Matt Middleton adds, “The geometric detail that went into the bismuth was immense, and our goal was to achieve a look that people could believe in, which doesn’t look like a CG fantasyland. Also, the previs that was done by the client was exceptionally helpful because James Gunn knew how he wanted the broad geography of the sequence to come together.”

“In the third act, we spent quite a bit of time inside the rift,” notes Damm. “Being so close to it and having it around us the entire time proved to be very challenging. If the surface qualities of the rift don’t feel like believable metal, the entire sequence falls apart. To represent real bismuth in a meaningful way, we needed ours to have a believable, metallic nature to it, as well as an underlying sheen that goes through rainbow colors. Additionally, when the other dimension opens, and we see a black hole, there’s a ton of heavy effects simulation that goes into having assets breaking and being pulled into the black hole before disappearing.” Kavanagh cites ILM’s insertion of debris elements moving toward the black hole as the foreground cues that keep audiences oriented to which direction is “up” throughout that sequence.


An International Effort

ILM is a global studio, and its work on Superman – which Damm estimates to be in the vicinity of 560 visual effects shots – occurred around the world. ILM’s Sydney studio took on more than half the shots, while ILM’s studios in San Francisco, Vancouver, and Mumbai combined to handle the rest. “In Sydney, we worked hand-in-hand with San Francisco in our respective time zones. I would plan our sequences with [visual effects supervisor] Dave Dalley and Matt Middleton, then we’d get invaluable input from Enrico and Paul,” King explains.

“ILM’s San Francisco studio worked concurrently with the client’s time zone as they were based in Los Angeles,” Kavanagh adds. “The time difference can be tough because Sydney is a day ahead. It’s like you’re time-traveling [laughs]. However, it often worked to our benefit. We could give the Sydney studio feedback on a Friday, and by the time we came in on Monday morning, they already had a day to work on the notes and provided new takes for us to show to the client.”

On a grander scale, King interprets ILM’s international presence as a phenomenal sign for the company’s future, commenting, “I came to the Sydney studio when it opened in 2020, and we started with relatively small jobs. It’s exciting for us to have grown so much that our location can take on the end battle sequence of a big summer blockbuster like Superman. ILM opened up new studios in Sydney and Mumbai within the last six years, plus we have the more established studios in San Francisco, London, and Vancouver. ILM is growing, and the work is turning out to be magnificent.”


A “Super” Success

Considering the project as a whole, King is immensely satisfied with his team’s performance on Superman. “As animators, we put so much thought and effort into everything we do. It’s not just sitting at a computer. We use computers as tools, but we often go and shoot references for the shots that we’re working on, so we can study them. We want to inspire young people to have that same love for movies that we grew up with. ILM is the dream job for myself and many people in our industry. At ILM, no one is too small to give their opinion or voice an idea. We create works that stand the test of time, that we can look back on and be proud of.”

Matt Middleton proudly sees Superman as one of the most complicated projects he’s worked on, summarizing, “There were a number of different challenges, ranging from FaceSwap to stadium destruction, as well as the amount of detail that went into hand-cutting out all the little tears in Ultraman’s suit to match the on-set costume. The hundreds of buildings that went into the city, and the hero buildings that have to be built with internal beams that you can see as they get pulled apart. The complexity goes with the territory in superhero movies, but it’s an incredibly intense amount of intricate work spread across all of our departments.”

“It’s a massive team effort, from the client side all the way down through to the artists and production, but the process of sitting in meetings to work out how we’re going to get the project done is fun,” Kavanagh discloses. “As a supervisor, I’m conscious of hitting deadlines, incorporating changes, and achieving the highest quality look for the client. I also want to make it enjoyable and interesting for the people who are working on the show. Together, we know we’re going to come up with some outstanding ideas and have a terrific time doing it.”

For Damm, one of his favorite moments from Superman encapsulates his appreciation for ILM’s role in the project. “There are plenty of occasions where Superman helps people in this movie,” Damm begins, “but when he saves the woman on the bridge from a falling building, it’s not just that he puts himself in danger to save one person. By showing how important it is for him to save a single individual at any cost, it demonstrates how human he truly is. Afterwards, when it all collapses, he heroically rises out of the ashes. Plus, that shot itself feels as if it’s right out of a comic, with the dust billowing to either side. It was very beautiful to see this shot come together and even more so to see the great reactions from the DC fanbase.”


Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

Go behind the scenes of ILM and LIMINAL Space’s ‘Real-Time Rocket’ activation at Disney Accelerator Day 2025.

By Patrick Doyle

Virtual production supervisor Christopher Jones (left) and creative director Michael Koperwas with Real-Time Rocket in the background.

At Disney Accelerator Day 2025, guests witnessed an extraordinary moment where the line between movie magic and real life all but disappeared. In a collaboration between Industrial Light & Magic and LIMINAL Space, attendees met a live, interactive version of Marvel Studios’ Rocket Raccoon, rendered and animated in real time and presented in fully dimensional 3D.

A Rocket Moment Like No Other

The activation, aptly named “Real-Time Rocket,” invited guests to step up to a large, glowing display, the kind of screen that instantly makes you curious about what’s about to happen. After donning a pair of polarized glasses, attendees found themselves face-to-face with Rocket Raccoon, who wasn’t just playing back a pre-recorded clip; he was there, moving, talking, and reacting live.

Rocket leaned forward, smirked, and started chatting with the crowd. He cracked jokes. He called out a few attendees by their outfits. He laughed at his own punchlines. The energy in the room was electric as laughter, awe, and a collective sense of “how is this even possible?” filled the air.

“What made Real-Time Rocket so special was seeing a beloved character come to life in a way that felt truly spontaneous,” said Christopher Jones, virtual production supervisor of Real-Time Rocket. “You could see the audience forget they were looking at a screen and, for a moment, they were simply having a conversation with Rocket. That’s the magic we aim for at ILM, where performance and technology meet to make something that feels real.”


Making the Impossible Possible

The “Real-Time Rocket” experience was powered by a creative combination of ILM’s real-time animation technology and LIMINAL Space’s Spirit Screen, an innovative display that uses polarized light to create a 3D effect without the need for headsets or holographic projection. Through this setup, Rocket appeared to step off the screen and into the same physical space as the audience, creating a shared moment of astonishment and delight.

“This activation was a perfect example of what happens when creative teams collaborate across disciplines,” said Alyssa Finley, Real-Time Rocket’s executive producer. “Working with LIMINAL Space allowed us to merge our real-time animation pipeline with an innovative display technology that gives audiences a live, shared, dimensional experience with a digital character that they’ll never forget.”


A Room Full of Wonder

As guests filed through the activation, the reactions were priceless. Attendees waved, Rocket waved back, and many leaned closer to see if there was a trick to it. Guests laughed when Rocket playfully teased them about their love of Baby Groot or asked if anyone had seen Star-Lord around. It was equal parts hilarious and jaw-dropping.

Throughout the day it was clear there was something unmistakably Disney happening. When Rocket waved goodbye at the end of each session, the sense of wonder lingered. For a few moments, the technology faded away, leaving only the feeling that audiences had truly met Rocket Raccoon in person.

“Standing in the room and watching people laugh and talk with Rocket was incredible,” said Michael Koperwas, creative director of Real-Time Rocket. “You could feel the energy shift the moment he appeared. Those moments of pure audience connection are what drive us at ILM to keep pushing the boundaries of what’s possible.”

The ILM crew – plus Real-Time Rocket – assembled on the stage.

For more on Real-Time Rocket, check out ILM’s “Behind the Magic” video from the D23 activation.

Patrick Doyle is a senior publicity manager at Industrial Light & Magic.

ILM creative director David Nakabayashi, along with artists from ILM’s global studios, including Aaron McBride, Cody Gramstad, Bimpe Alliu, and Chelsea Castro, reflect on the essential role concept art and storyboarding play in the filmmaking process.

By Jay Stobie

Concept art for The Avengers (2012) by Aaron McBride (Credit: Marvel & ILM).

“ILM Evolutions” is an ILM.com exclusive series exploring a range of visual effects disciplines and highlights from Industrial Light & Magic’s 50 years of innovative storytelling.

From envisioning a look for cursed pirates to plotting out space battles, Industrial Light & Magic has an unparalleled reputation for working wonders in collaboration with filmmakers to bring the stories they envision to life. Built on a 50-year legacy of talent and tenacity, ILM’s Art Department has grown into a global hub for generating and executing the ideas that immerse audiences in the worlds they see on screen. In this installment of ILM Evolutions, we’re heading back to the drawing board to focus on conceptual art and storyboarding, as this indispensable imagery fuels the creative process by visualizing a filmmaker’s ideas in the earliest stages of production.

ILM Art Department creative director David “Nak” Nakabayashi sat down with ILM.com to share his insights on the history of concept art and storyboarding, as well as his own first-hand knowledge of the craft. With an esteemed resume featuring iconic films such as The Hunt for Red October (1990), Jurassic Park (1993), Men in Black (1997), Star Wars: The Phantom Menace (1999), A.I. Artificial Intelligence (2001), Avatar (2009), Star Wars: The Force Awakens (2015), and many more, Nakabayashi now oversees art directors, illustrators, and artists across ILM’s global studios. Additionally, artists Aaron McBride (San Francisco), Bimpe Alliu (London), Cody Gramstad (Sydney), and Chelsea Castro (Vancouver) joined the conversation to highlight their careers and the latest developments in their field.

Concept art for Star Wars: The Phantom Menace (1999) by David Nakabayashi (Credit: Lucasfilm & ILM).

Ideas and Intentions

Concept art and storyboards each serve unique purposes in the production pipeline. “Concept art depicts a scene, setting, place, location, robot, spaceship, weapon – they initially come from the script along with a brief description,” outlines Nakabayashi, who emphasizes the extent to which the art helps the crew visualize the project they will be creating. “These are the key beats in the film. Concept art establishes what we’re going to be doing. We’re showing everyone that this is the movie we’re going to be making when absolutely nobody has any idea what it will look like.”

Nakabayashi cites concept artist Ralph McQuarrie’s contributions to Star Wars: A New Hope (1977) as the perfect example of such art having an inspirational impact on a film. “They based everything on his art. It fed everybody’s imagination.” Today, concept artists regularly assist filmmakers as they design and seek green lights for their films. “Sometimes, ILM will do development or spec work, where we take concept art and show the studio what the movie will be with the same intention as Ralph did. We carried that on.”

While concept art focuses on design, storyboards define the action that occurs on-screen. “Storyboards are all about the cinematic motion, the energy of a visual effects shot. That’s why ILM’s Joe Johnston was such a great foundation for this department. He would draw storyboards with arrows that would be compressed in perspective, and you really understood the depth of what he was trying to say,” Nakabayashi notes. Over time, the advent of digital animatics altered the use of storyboards. “We hardly storyboard anymore these days because animatics act as the filler, but it’s the same principle.”

From a broader perspective, Nakabayashi is quick to point to the artistic value of concept art and storyboards. “A pencil drawing, for me, is as powerful as a Ralph McQuarrie gouache painting. I get consumed by the techniques sometimes, and how a person can draw this perfect angle of a little spaceship cruising through the columns of some weird planet. To draw that sequence helps the director make decisions. It’s about visualizing and timing the film before they actually shoot anything, though it’s all changed quite a bit with the whole animatics tool set.”

Storyboards for Men in Black (1997) by David Nakabayashi (Credit: Sony & ILM).

The Importance of Art

“When you look at the scope of what ILM has done,” Nakabayashi says, “obviously Star Wars was a flashpoint for concept art and storyboards because that was the first way of getting creativity into the movie and bringing visual life into the script.”

Of course, the benefits extend far beyond what is seen on-screen. “Art is important for many other reasons,” Nakabayashi explains. “For ILM, it is also about the budgeting process. Historically, the model shop would look at storyboards and concept art and have an idea of what was coming. Production is very budget-driven. ILM would storyboard their sequences, not just for the artistic impression of it, but for logistics and production. That was how a director would communicate with the visual effects supervisor. ‘We’re going to shoot this practical and this blue screen. We can save a lot of money if we do this with miniatures.’”

Nakabayashi boils the work down to its essence, relaying, “It’s about the artists believing that the future is possible and the creation of the cliché ‘Show me something I’ve never seen before,’ which is sort of a byline we usually get from our clients. We can do that because we have the right people – people who take inspiration from the artists who came before them. That’s ILM’s culture of concept art and storyboarding. I’m not a great storyboard artist, but I can communicate and do the work. To me, that is the most important part – communicating the ideas.”

Concept art for A.I. Artificial Intelligence (2001) by David Nakabayashi (Credit: Warner Bros. & ILM).

Communicating Concepts

As an art director at ILM’s Sydney studio, Cody Gramstad (Sonic the Hedgehog 3 [2024]; Lilo & Stitch [2025]) affirms the significance of communication, stressing, “When it comes to being a concept artist, you’re not necessarily there to create artwork. You’re there to clarify and communicate ideas. My favorite part of the job is actually the conversation where I sit down with a bunch of creative people, brainstorm potential solutions to problems, and get everyone amped up as we figure out our creative direction. Painting and visuals are a part of that, but being able to talk, pitch ideas, and get people excited is one of the most important skill sets.”

Gramstad, whose parents were professional sculptors, takes the notion a step further, suggesting that prospective concept artists can bolster their abilities by balancing the dedication necessary to hone their craft with time off for real-life adventures. “Step away from your computer every now and then, have some experiences, meet people, and socialize,” declares Gramstad, placing value on the correlation between communicating ideas and relating to those around you. “It’s a lot easier to work with someone who has gotten out in the world and brings those stories into their artwork.”

Concept art for Lilo & Stitch (2025) by Cody Gramstad (Credit: Disney & ILM).

The ILM Influence

Turning to Industrial Light & Magic’s unique place in the history of concept art and storyboarding, Nakabayashi states, “ILM is special because it all sort of started here. It’s special because of the people who believed and put their foot down – Colin Cantwell, Ralph McQuarrie, and Joe Johnston. There were others on the outside, like Syd Mead and Ron Cobb, all these illustrators who were doing sci-fi stuff, but ILM was the first one to take the visual effects art department and make it something that everybody wanted to be.”

San Francisco-based senior art director Aaron McBride notes ILM’s post-Star Wars permanence as a standout achievement for the company. “Before ILM, visual effects departments would start up for the duration of a film and be temporary. When the film was over, everyone would get scattered. It was almost nomadic,” McBride mentions. “There was a demand for the work that ILM was doing, and ILM was able to advance technologies because it was kept intact.”

By the time of Star Wars: Return of the Jedi (1983), Nakabayashi explains, directors had begun approaching ILM for innovative films like Poltergeist (1982), The Goonies (1985), Cocoon (1985), Terminator 2: Judgment Day (1991), and many others. From crafting concept art that gave those films their “first breath of life” to “feeding production with ideas,” Nakabayashi views the ILM Art Department’s past with distinction. “Historically, we’ve had some of the best concept artists ever – Doug Chiang, Harley Jessup, John Bell, Terryl Whitlatch, Christian Alzmann, and James Clyne. A lot of the artwork that they created determined whether or not a movie was made. That kind of talent, to me, is the greatness of the department.”

As the visual effects art director on The Phantom Menace, Nakabayashi saw connections between his work and that of his predecessors. “We had all this concept art, and part of my job was to bring it into the real world. That’s what the ILM Art Department has always been at the forefront of, going back to the days of Joe Johnston, because he wasn’t even a storyboard artist when he started. He also got into the model shop and built models. He loved making miniatures and setting up the stage. It was kind of the birth of the visual effects art director. We followed along that path. It’s not just doing the drawing or coming up with an idea, it’s implementing it, as well.”


Executing the Ideas

Nakabayashi recalls his experience collaborating with director Barry Sonnenfeld on Men in Black II (2002). “I was tasked to take a trash can and turn it into a killer robot. I liked the idea that it opens up like a flower, and it comes with multiple gun turrets that are not necessarily normally situated in a standard military platform. Maybe it’s more like an orchid. With a few changes, the design went to computer graphics, and I helped develop it in dailies with the modelers, painters, and animators.”

Turning to his time on A.I. Artificial Intelligence, Nakabayashi posits, “Those worlds – Coney Island, the Rouge City, an underwater theme park – were all absorbed through storyboards that Chris Baker did with Stanley Kubrick for a couple years. We started with that as our inspiration, and then we started doing colored artwork – paintings, drawings, some storyboards for shot ideas – and pitched those to [visual effects supervisor] Dennis Muren and numerous other people. It became this whole world of miniatures, and it was also on the brink of the digital component coming into the workflow. There’s a real marriage of practical effects, which I will still say is the most fun to work on, with the digital component.”

Envisioning new worlds still requires references that ground them in reality. For The Phantom Menace, Nakabayashi saw a dry South Dakota riverbed as a perfect reference for the bottom of Naboo’s oceans, proposing a fresh take on how to approach the Gungan city to Dennis Muren. “I go, ‘What about a booming shot? You track over and dip down to see the top of the city as opposed to always looking up. We’re going underwater, right?’” Such insight and inspiration impressed director George Lucas. Nakabayashi touches on the Gungan shield that comes down on the battlefield, continuing, “I had this idea – it was a parasol and an umbrella, kind of like a sprinkler. George loved it.”

Concept art for Men in Black II (2002) by David Nakabayashi (Credit: Sony & ILM).

Turning the Tide

As is often the case with the work ILM tackles, changes manifested for the art department over the years. Nakabayashi indicates Adobe Photoshop – the editing software co-created by John Knoll, ILM’s current executive creative director and senior visual effects supervisor – as technology that revolutionized his field. ILM even dabbled with Photoshop during its earliest days. “With Death Becomes Her (1992), Doug Chiang took plates and drew the effect of Madeline Ashton [Meryl Streep] having a broken neck. He took pictures of people and we altered them into these effects-type things.”

Even with Photoshop available, concept artists continued drawing with traditional tools like pencils, markers, and paper until ILM received the call for Pirates of the Caribbean: The Curse of the Black Pearl (2003). “The director, Gore Verbinski, was like, ‘These are great drawings, but I want to see what it looks like in my film. Don’t give me a pencil sketch,’” Nakabayashi says. The filmmakers wanted a desiccated – but not bloody or gory – aesthetic for the cursed pirates, so Aaron McBride test-photographed beef, turkey, and salmon jerky.

“The turkey jerky felt the best because it scattered a slightly lighter color and was the closest to the right muscle striation texture,” muses McBride, who credits his shyness at speaking up in dailies for the process, laughing, “I pushed to do the concept art as photo-realistically as possible mostly because I wanted to be able to point to the art and not have to say anything.” As Nakabayashi explains, “Aaron took a plastic skull, a bunch of costumes, and turkey jerky, scanned it, and put all these textures on the face. This gave Gore a direction for his movie, and it was a huge moment. Everybody was trying to copy Aaron after that. We still drew and did other traditional methods, but all of a sudden, everything had to be photoreal.”

Concept art for Pirates of the Caribbean: The Curse of the Black Pearl (2003) by Aaron McBride (Credit: Disney & ILM).

Another Dimension

“Getting photoreal is a lot easier now with three-dimensional tools,” Nakabayashi adds. “A quick sketch might happen, but a lot of our artists are excellent at building and designing 3D packages. It’s a great transition point from concept art to visual effects work, because of the digital assets.” 

As an intern, one of McBride’s first jobs actually involved developing photos taken for The Mummy (1999) and scanning them into a computer to be painted. He later experimented with 3D on Star Wars: Revenge of the Sith (2005), and then leaned into it while working on the suit-up machine in Iron Man (2008). “I didn’t have the skill set to do mechanical drawings, so I blocked it out in 3D to figure out how some of the panels and other mechanical things moved,” chronicles McBride. For The Avengers (2012), McBride designed a snake-like creature that dropped soldiers into the Battle of New York. Inspired by the Greco-Roman aesthetic crafted by the Marvel art department, McBride envisioned the troop transport “as a Roman galleon, almost like a biomechanical being, which had fins that were like oars.”

“We have a couple artists in the department who are sort of hybrid artists,” remarks Nakabayashi. “They do 3D, 3D animation, compositing, and things like motion graphics. Sometimes, we want to bring a flat, still drawing to life, and you’ll do a quick projection. Making something move is a huge component for success in your pitch meetings. The animatics these days are so good and so accurate that you can’t deny the distinction. They’re more productive than storyboards.”

Concept art for Indiana Jones and the Kingdom of the Crystal Skull (2008) by David Nakabayashi (Credit: Lucasfilm & ILM).

A Generational Shift

Having contributed to a variety of ILM projects as an art assistant, Chelsea Castro is now making her mark on the next wave of ILM shows as a junior concept artist at the Vancouver studio. Castro, who finds inspiration for her art in everything from books to video game soundtracks, strikes a balance between traditional methods and cutting-edge techniques. “Coming from a 2D background, I tend to sketch as much as I can, and then move on to photobashing [combining photographic and CG elements together into a new piece of art] and texturing,” Castro shares. “Afterwards, I fully build out as much as I can in the 3D software that I’m using. Then I go back into 2D to tweak everything and finalize them.”

Castro sees a fundamental evolution when it comes to storyboarding, explaining, “I feel it has gotten a lot more immersive. We still do the classic line art, but now that you can build whole 3D worlds, I’ve seen storyboards done completely in 3D. Sometimes the artists take their scenes and show them to the client with different cameras set up, like it’s their own film set. The game has changed, but the spirit of it is very much the same,” Castro concludes. “Brainstorming and getting ideas out is great with the new technologies. The refinement is where you fall back on your foundations, techniques, and the skills that you’ve built up.”

Cody Gramstad adds, “3D gets you closer to real-world accuracy. Inherently, as we’re hand-painting things, we have a tendency to make artistic cheats. It’s not necessarily a bad thing in illustration, as we can push that to enhance emotion. But, especially in a live-action context, reality is what makes the world believable – 3D is very useful for that because it takes calculations from the real world. For example, how lighting actually bounces off of different surfaces.”

Bimpe Alliu from ILM’s London studio observes that increased accessibility to 3D software among young people is as vital as the technology itself. “I’ve mentored teenage students who are learning 3D, picking up software like Blender, and learning to model and sculpt,” Alliu details. “Regardless of the gradual transition from hand-drawn paper storyboards to digital storyboards, as well as individual artists’ preferences for 2D or 3D drawing, a combination of those skills is always used to do the work to the best of your abilities,” Alliu asserts. “More people are using different techniques in order to bring together their storyboards. It’s harmonious.”

Concept art for a company holiday card by Chelsea Castro (Credit: ILM).

A Global Approach

As the ILM Art Department’s creative director, Nakabayashi embraces the modern tools bringing our world closer together as he oversees and collaborates with artists at ILM’s international studios. “I’m very much hands off, and I let the artists do their job,” opines Nakabayashi, who jokes, “When something goes wrong, I get called upon.”

For Cody Gramstad, being an art director in ILM’s Sydney studio means handling multiple shows simultaneously. “I meet with visual effects supes and give guidance for the shot sequences and how they’re progressing, and at the same time, provide feedback to the Sydney art department team to guarantee they are targeting the supervisors’ and directors’ goals.” Gramstad points out that the process is often a worldwide effort, regularly involving colleagues at ILM’s other global studios. “We support each other and make sure that we’re getting the work done at the level we need to. Nak and [director of art and development] Jennifer Coronado make certain that standards are equal across the different studios.”

However, informal conversations are just as productive. “There are a lot of art posts and chats. Keeping people inspired becomes really important, and it’s great to have artists around you that can contribute to that. Sometimes, we do design competitions, too,” Gramstad proclaims. “The art directors also sit with the artists every couple months. We break down where we can improve and how to adapt our approach as we move forward on future tasks. There are so many different shows across the world, so they’re all learning different lessons. It takes direct communication to make sure those lessons get spread to all of our studios.”

Concept art for Lilo & Stitch (2025) by Cody Gramstad (Credit: Disney & ILM).

Timelines and Tasks

With video calls connecting teams around the clock, ILM’s concept artists are able to communicate with clients and take on projects across multiple continents while working from their respective spaces at ILM’s global studios. This ability allows artists to be flexible in terms of their involvement on any given series or film. “Sometimes, we can be on a show for a day,” says Bimpe Alliu, who estimates that the longest time she spent on a project was her two-year tenure on The Marvels (2023).

Similarly, the timeline is naturally impacted by the stage at which the artists are brought on. A fan of anime who started out by drawing her friends as Dragon Ball Z characters in her youth, Alliu elaborates on the depth of her tasks, advising, “It can be pre- or post-production. We can be working on plates or creating assets for ourselves. With a recent character design, I was given the previs model and a scan of the actor, so I took those, mishmashed them together, and then detailed the clothes on top of that.”

Watching Iron Man inspired Alliu to pursue her current career, so working on WandaVision (2021) was a full-circle moment for the self-described “massive nerd.” “For the sequence where The Vision is disintegrating, I was designing what the disintegration effects would look like. Not just the space, but also The Vision himself,” Alliu recounts. On ABBA Voyage (2022), Alliu was brought on so early that she “was designing what the room where ABBA themselves would be recording and filming all their motion capture stuff would look like. I even designed baby dragons for a TV show called Lovely Little Farm (2022). They made them into little maquettes, so that was the first time that anything I designed got made physically.”

Concept art for Ant-Man and the Wasp: Quantumania (2023) by Bimpe Alliu (Credit: Marvel & ILM).

A Legacy Earned from Lessons Learned

Despite all the changes that have transpired with concept art and storyboarding over the last half-century, ILM’s history and prestige set it apart as it moves into the present and looks to the future. “ILM has a support structure and legacy that a lot of other studios don’t have. ILM can nudge newer people in the right direction as they learn the lessons that their predecessors have discovered in the past,” Gramstad reveals. “There’s also the sheer amount of variety at a place like ILM. Since so many film studios come to ILM as a source of visual effects experience, we get a huge range of projects. So, more so than any other studio in the world, I think that ILM allows people to be super versatile. One morning, we’ll be working on animated silliness with Sonic the Hedgehog, and two hours later, we’re doing a grounded oil rig on an ocean that has to be absolutely photorealistic.”

ILM’s academic aura benefits its up-and-coming and veteran personnel equally, as Bimpe Alliu resolves, “You don’t have to be the finished article. You’re going to constantly grow, and ILM is always looking for potential. It helps when you’re around people that you can learn from.” Chelsea Castro beams, “At ILM, you feel so included, and everyone shares their time with you. It’s amazing to have access to all these people around the world.” Aaron McBride, who has been with ILM for 27 years, praises ILM’s multi-generational nature for making him a more well-rounded artist, concluding, “New techniques can inform older ones, and older techniques can inform new ones. I’m inspired by what younger artists are doing, and I think it’s important not to dismiss any aesthetic because it’s new to you.”

Concept art for The Sandman (2022-25) by Bimpe Alliu (Credit: Netflix & ILM).


The multi-sensory experience is on view in London through February 22, 2026.

Industrial Light & Magic has partnered with Somerset House in the United Kingdom for the major new exhibition, Wayne McGregor: Infinite Bodies. The series of multi-sensory choreographic installations, performances, and experiments was directed by acclaimed choreographer Sir Wayne McGregor and invites visitors to experience the intersections of dance, art, and technology.

“Wayne McGregor: Infinite Bodies at Somerset House is an ambitious exhibition that explores intersections of the body, movement, and technology,” explained Dr. Cliff Lauson, director of exhibitions at Somerset House. “We’re delighted that the exhibition has provided the opportunity for Wayne to collaborate once again with the ILM team on the new commission OMNI (2025). This large-scale, screen-based film is the first work visitors see in the exhibition and is a spectacular introduction to beautifully-rendered dance presented at life-size. We’re deeply grateful for the creative talent and technical expertise that ILM has brought to this landmark artistic project.”

A final render created by the ILM team.


“Wayne came to us with a challenge,” explained ILM visual effects supervisor Matt Rank. “We had previously worked with Wayne on ABBA Voyage, and he wanted an opening piece for his new exhibition. Initially, that was all the brief we had, so we set about researching Wayne’s previous work and the history of Somerset House, and exploring how we could take digitised human movement and bring it to life as part of his exhibition.”

Rank further explains that the concept of “looking at bodies internally” became the focus of their presentation. “It’s something that Wayne’s traditional work is unable to visualize, so we thought it would be fascinating to start blending the different forms that our internal skeleton, nervous system, and capillaries took, and how they could appear when formed from less naturalistic materials and lit from different angles.”

The equally dazzling and thought-provoking results are now open for visitors to see from October 30, 2025 to February 22, 2026. Head over to SomersetHouse.org.uk to learn more, and watch ILM.com for more behind-the-scenes insights about Infinite Bodies.

A final render created by the ILM team.

The ILM creative director and Jurassic’s production visual effects supervisor talks dinosaurs and collaborating with Gareth Edwards.

By Mark Newbold

(Credit: ILM & Universal).

Jurassic World Rebirth (2025) has grabbed global audiences by the hand and pulled them back into the savage world of the Jurassic film series, three years after Jurassic World: Dominion (2022) completed the second Jurassic trilogy. Rebirth has also changed the direction of the franchise, focusing on the genetic heritage of the incredible dinosaur creations.

Taking place on the fictional island of Île Saint-Hubert in the Atlantic Ocean, Rebirth shows the terrifying cost of unchecked genetic manipulation as we meet familiar creatures, including the armored Ankylosaurus, the chicken-sized Compsognathus, the crested, acid-spitting Dilophosaurus, the aquatic Mosasaurus, the F-16-sized Quetzalcoatlus, and, of course, Tyrannosaurus rex (albeit a much beefier one than the classic Rexy).

Along with these classic creatures are new franchise stars Scarlett Johansson as Zora Bennett, Mahershala Ali as Duncan Kincaid, Jonathan Bailey as Dr. Henry Loomis, and Rupert Friend as Martin Krebs. Together, they encounter distinctly unfamiliar dinosaurs, including the enormous Distortus rex, the towering Titanosaurus, and the horrifying Mutadon. It’s a one-way trip for anyone visiting the island – but in the capable hands of director Gareth Edwards (Rogue One: A Star Wars Story [2016] and The Creator [2023]) and visual effects supervisor David Vickery (Jurassic World: Fallen Kingdom [2018], Jurassic World: Dominion, and Mission: Impossible – Rogue Nation [2015]), it’s exactly what was needed for a pulse-pounding adventure in the grand Jurassic style.

Director Gareth Edwards, a frequent ILM collaborator (Credit: Universal).

ILM.com had the chance to sit down with David Vickery to discuss the visual effects of Jurassic World Rebirth. We started by looking at how the visual effects field is viewed today compared to a decade ago, when Jurassic World (2015) broke box office records and made the world stare in awe at dinosaurs all over again.

“The visual effects industry has grown in many ways,” says Vickery. “There’s a lot more trust placed in us than there used to be, even 10 years ago. Back then, visual effects were seen as something of a necessary evil – but now we find ourselves much more readily accepted as a department on set. Nowadays, other departments – whether it’s hair, makeup, costumes, special effects, stunts – rely on visual effects to guide how to film something because they know how vital it is that the visual effects work properly at the end of the process. They want to make sure what they’re doing is conducive to how we’re going to work, and it always used to be the other way around, so that’s been a refreshing change.”

On the surface, the art of visual effects may appear to be made up of equal parts skill, ingenuity, knowledge, and creativity, but the field also requires a healthy dose of collaboration, as Vickery explains.

“Over the years, I’ve found myself not trying to figure out how to do visual effects but rather figure out how not to do visual effects, and, as best I can, enable people on set to get what they need. You rely on the expertise of the crew. The camera operators and special effects technicians might have been in the industry for 30 years, and at ILM, we’ve got a bunch of talented artists that are generalists by nature, so that level of trust in visual effects has definitely grown.

“There’s a narrative in the press about how everything is done in-camera,” Vickery adds. “Well, yeah, everything is shot in camera because you can’t ‘shoot’ visual effects. What you’re trying to capture is as many practical things on set as you can because you can’t go back and get it in post-production.”

(Credit: ILM & Universal).

A veteran of three Jurassic adventures (three and a half if you include the 2019 short, Battle at Big Rock, directed by Colin Trevorrow), Vickery has worked with three directors (Trevorrow, J.A. Bayona on Fallen Kingdom, and Edwards), and that means differing styles and methods in bringing the dinosaurs to life.

“I find it interesting, the experiences I have with crews, directors, and producers who want to make their films in different ways,” he explains. “Colin relied heavily on animatronics for his films [Jurassic World and Jurassic World: Dominion]. Gareth is much more comfortable with visual effects and wanted a consistency in his aesthetic by relying on effects for all the creatures and dinosaurs. On top of that, there’s a layer of what’s fashionable in movies at the moment. For a while, it was very ‘in’ to be shooting on green screen, or it was fashionable to use animatronics, and that’s what the public wanted to see. Now there’s a desire to see things filmed on location, and there’s an acceptance of visual effects, so filmmakers respond to that in the way they make their films. It’s interesting to see the evolution in how things are done.”

The style and flair shown by Gareth Edwards in his previous films – and his obvious affection for giants, as evidenced by Godzilla (2014) and Monsters (2010) – led producer Frank Marshall and executive producer Steven Spielberg to offer him Rebirth’s directorial seat in early 2024. And with that came a rare skill set for a major franchise film: a vast working knowledge and understanding of visual effects.

“Gareth’s a very distinctive filmmaker,” explains Vickery. “He comes from a visual effects background, so he truly understands how things work. He’s the type of filmmaker that creatively evolves his thought process as a project develops, so he’s totally happy to change his approach and defer some decision-making to later down the line. Visual effects is a great opportunity for him to do that, but he also likes reacting to natural things that happen on the day.”

(Credit: ILM & Universal).

Along with his knowledge of effects, Edwards is also known for his guerilla filmmaking style, something Vickery would learn more about from an Oscar-winning special effects supervisor.

“We worked with Neil Corbould [special effects supervisor for 2000’s Gladiator and Edwards’s The Creator, among others] on Rebirth,” Vickery says, “and I spoke to Neil beforehand because I was trying to find out what Gareth is like, and he said Gareth shoots really long takes. I’d heard of 20-minute or 30-minute takes on Rogue One, but how do you plan for that? How do you rig special effects knowing that Gareth’s going to roll for 20 minutes? Neil said they put a load of stuff out there – loads of mortars, loads of pots, loads of bangs, loads of fires and squibs, and they fired them off. They gave him something to work with, and Gareth reacted to that. Gareth wants to be in the real world, to react to what’s in front of him, and then capture the best version of that.”

That drive to find the best moments, to allow the actors to add their own essence to their characters, and to rarely say “cut,” extends to the visual effects realm as well, where Vickery found Edwards was every bit as open to allowing ILM to find those moments.

“He’s very open with his creative briefs and gives ILM a lot of creative flexibility in how to work,” says Vickery. “He doesn’t look at something and say, ‘What’s wrong with this?’ At ILM, we look at something and try to understand how we can make it better, and I think that’s why we stand out in the visual effects field. We’re trying to figure out how things can be made better, and Gareth does the same. He looks at something and in his head it’s a 7 out of 10, but what do we have to do to make it a 10 out of 10? What do we have to do to make it an 11 out of 10? He’s always going to wonder what would happen if we pushed it a little bit more. Does it break, or is it better? He doesn’t want to leave any creative opportunities on the table.”

Edwards’s naturalistic style formed the bedrock of the film, giving Vickery and the ILM team an opportunity to do things differently, opting for realistic substance over easier, stylistic options.

“Gareth said early on that he never wanted to get into a situation where the dinosaur walks into shot, strikes a pose, and roars. That feels staged,” Vickery says. “When you photograph animals in nature, they do whatever they want, so there were a few golden rules that he gave us. We should never animate a dinosaur unless we had reference of a real animal to use. It didn’t have to be doing exactly what the dinosaur was doing in that moment because you can’t find real dinosaur animation reference, but it could be something like a large animal looking scared or startled. Gareth said if we do that, he wouldn’t question whether the animation or the intent of the performance was correct, so he was very good at not micromanaging.”

(Credit: ILM & Universal).

Another advantage of Edwards’s understanding of the effects tool kit was that it gave Vickery and his team a framework to build the film around before the work began.

“He would construct an edit for us, but because he understands visual effects, he also understands the possibilities of what the shots can be,” notes Vickery. “We often found that the first time you watched them, it was difficult to understand how he wanted the performances of the creatures to play out, but you started to work on it and put it together, and suddenly we were like, ‘Oh yeah, that really works, the timing here’s really good.’ That’s how his visual effects background plays to his strengths because he can see the finish line much more clearly in his head than most other creatives.”

With all the advantages of a director understanding one of the key elements of the production, the process of building the film forged on.

“The first process we go through is laying the shots out and blocking in very basic key frame animation,” says Vickery. “That process takes a long time because it’s all about getting the composition of the image correct. In post-production, 60% or 70% of our time was spent on layout and animation, and then the rest – composites and lighting – was relatively quick. Gareth’s a great cinematographer in his own right, so he’s able to see when the composition of an image works well, and then ILM takes it from there.”

The presence of a T. rex is a Jurassic tradition, dating right back to the 1993 original and carried through every iteration since. While the Tyrannosaur isn’t always the “star” of the film – as in Joe Johnston’s Jurassic Park III [2001], which introduced Rebirth returnee the Spinosaurus, or Jurassic World’s Indominus rex, Fallen Kingdom’s Indoraptor, and Dominion’s Giganotosaurus – the queen of the lizards remains ever-present. With Rexy, the original T. rex of the first six films, absent from Rebirth, the starring role went to a new, even more terrifying Tyrannosaur. The new star appears in a sequence inspired by a scene in Michael Crichton’s original 1990 Jurassic Park novel, in which Alan Grant and John Hammond’s grandchildren, Tim and Lex Murphy, raft their way back to the main complex on Isla Nublar. It presented more than a few challenges.

(Credit: ILM & Universal).

“The T. rex boat inflation scene was an idea Gareth had really early on,” Vickery says. “We looked at it thinking, ‘How do you inflate a raft, have it pop up, and then like a magic trick, make the T. rex vanish?’ On the day, we had a special effects raft that inflated, but it didn’t fit into the bag, so it was laid out and they popped it up, and it inflated and fell into the water, but it didn’t stand up on its end. It took much longer to inflate, so that was always designed to be a reference for us, and then our effects artists worked on that and created this wonderful piece of dynamic simulation that had to be choreographed as a piece of animation to feel naturalistic, slowly dropping into the water. We spent a long time on the simulation of the raft. As a supervisor, that was a thing of absolute awe-inspiring wonder for me. The artist that worked on that simulation did such an amazing job because it’s an incredibly complex piece of dynamics and timing. We ended up with a subtle piece of animation where, just as the raft is inflating, you start to see the T. rex moving and start to get up.”

Bringing this adaptation of a classic Jurassic scene from the pages of the novel to the big screen required some ingenious thinking, especially given the presence of a sleeping T. rex and a noisy inflating raft.

“We played around with the T. rex a number of times because we had to walk a very thin line,” notes Vickery. “If you thought the T. rex was awake, you’d wonder why it didn’t see the girl and eat her, so it had to look like it was asleep, but not so fast asleep that it wouldn’t have the ability to wake up and move off.” Timing was everything. This newer version of the T. rex wasn’t simply a reskin of the existing T. rex asset; it required a completely new “build” taking into account the larger frame and bulk of this Tyrannosaur (to say nothing of its ability to swim). Vickery has nothing but praise for the team that worked on the project.

“The creature department has a brilliant understanding of anatomy; they could be biologists. We had a modeler on Jurassic World: Dominion who trained as a palaeontologist at university, but it’s more than just figuring out how its muscles should jiggle and how its skin should wrinkle. That gets you to the equivalent of a shop mannequin version of a dinosaur. The challenge is to imbue character into the creature, so it feels unique amongst its own species.” 

“Gareth would say, ‘I want to see 100 T. rexes perform and choose the best one,’” Vickery continues. “He wanted the Robert De Niro version, not the shop mannequin, so how do we imbue that kind of character into it? Part of it was to go back to the animation reference, so you really understood the creature’s intent, and part of it was making sure it’s performing in a naturalistic way.

“Gareth explained how, if you block a sequence with an actor and the brief is you come in the door and you sit down at the desk and you pick up the pen, then the person who’s the stand-in for the day will walk in the door and sit at the desk and pick up the pen,” Vickery adds. “But when the actor comes in, they’ll walk in the door and they’ll give a mean look and they give it the De Niro treatment and you get a real performance. We always look for that level of performance, and that goes all the way back down to the anatomy of the creature. Do you know where its muscles are firing to give tension in the neck or in the legs? Gareth was interested in things that a creature would have that weren’t preserved in the fossil record, so that gave us creative licence to add extra fat layers or muscles, or a wattle under the neck, or flaps of skin in different places that would help give it character, which the shop mannequin version wouldn’t have had.”

(Credit: ILM & Universal).

In Jurassic World Rebirth, there’s an extra layer to the story of the dinosaurs; alongside the “classics,” there are genetic mutations, creatures created while striving to find the perfect mix of DNA – both biologically ancient and contemporary – to create the attractions demanded in the parks. As Vickery explains, these creatures were never intended to be seen by the public.

“It’s not like Jurassic World, where they were trying to create attractions for the park. These are failed experiments to create truer genetic dinosaurs. Before they figured out the exact strands needed to get a Velociraptor, they didn’t get the combination right, so that’s how we got the Mutadon. You’re supposed to feel a bit sorry for these creatures. It’s like Sloth in The Goonies [1985], initially terrifying, but then you feel really sad for him by the end of the film, and he becomes a hero.” 

Vickery adds with a smile, “I’m not sure you feel that sad for the D. rex, but you do understand that it’s biologically limited. It’s got this huge encumbrance on its head. It’s heavy and weighty, and that means it can’t run really fast.”

In addition to the raft and the T. rex, there are plenty of other visual effects in the river sequence. “When visual effects are successful, people don’t notice them,” says Vickery. “The raft, the grass, the tree, and the land mass that the T. rex was on were entirely digital; it wasn’t shot on location. The thing is, no one’s going to look at it and go, ‘That was a visual effect,’ so it doesn’t get the credit it deserves, and that’s something I think about a lot. When you’re so close to a film and you’re working on every single component of it, you inherently know what’s a visual effect and what’s not. You hope to get to a place where people don’t realize what a visual effect is, but you’ll never fool them with a T. rex, right? It’s hard to know where to place your emphasis when you’re discussing or promoting work, and it’s hard to know where to draw people’s attention because I don’t know what you or the person sitting next to you understand to be visual effects or not.”

Another unseen visual effect is water, of which there is plenty in Jurassic World Rebirth. From the Mosasaurus attacking Duncan Kincaid’s boat to the T. rex attack on the Delgado family (Manuel Garcia-Rulfo as Reuben, Luna Blaise as Teresa, Audrina Miranda as Isabella, and David Iacono as Teresa’s boyfriend, Xavier Dobbs), water is a constant presence in the film. And the marriage between the real-world filming locations and the pixels of ILM required some heavy-duty work to succeed, as Vickery explains.

(Credit: ILM & Universal).

“We spent six weeks in Malta shooting the boat sequence, and the cast were on the water for two days of those six weeks. I took a drone out with the second unit and did aerial photography for another three or four days, so probably 85% of the 300 shots in that sequence were shot on dry land. There are very few shots which don’t have some element of digital water in them, even the shots that were filmed at sea. Perhaps it’s a little more obvious when there’s a huge dinosaur thrashing around in it. But the fact that only 10% to 15% of those shots have real water in them is another thing that audiences may take for granted when they’re watching the film.

“Our effects department, led by CG supervisor Miguel Perez Senent, started development work on the water simulations when we were still in pre-production,” Vickery continues, “so we had a good six months’ run-up to it because there are 50 pages of script that take place on the ocean. So we always expected that to be the biggest technical challenge we had on the entire film.”

That massive undertaking required new solutions to work. “We built new water solvers in Houdini [3D visual effects software] to help with the white water, the spray, and the secondary and tertiary splashes as the creatures break out of the water, but it was a massive data management issue because the simulations were throwing huge amounts of data around. One of the sims had over 5 billion points of white water spray and splash, so Miguel developed some really clever techniques and tools to help us identify and break those simulations up into regions to make the caches and the sims more manageable.”
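Vickery doesn’t detail the tools themselves, but the general idea he describes — splitting an enormous particle simulation into spatial regions so each cache stays a manageable size — can be sketched in a few lines. This is a hypothetical illustration, not ILM’s actual pipeline: the `partition_points` helper and the tile scheme are assumptions for demonstration.

```python
import random
from collections import defaultdict

# Hypothetical sketch (not ILM's actual tooling): bucket a large cloud of
# whitewater spray points into fixed-size spatial tiles, so each region's
# simulation cache can be written, loaded, and iterated on independently.
def partition_points(points, tile_size):
    """Group 3D points into tiles keyed by integer grid coordinates."""
    regions = defaultdict(list)
    for x, y, z in points:
        key = (int(x // tile_size), int(y // tile_size), int(z // tile_size))
        regions[key].append((x, y, z))
    return dict(regions)

# Example: 10,000 random spray points over a 100m x 100m x 10m ocean patch,
# split into 25m tiles (4 x 4 x 1 = 16 occupied regions).
random.seed(0)
points = [(random.uniform(0, 100), random.uniform(0, 100), random.uniform(0, 10))
          for _ in range(10_000)]
regions = partition_points(points, tile_size=25.0)
print(len(regions))                           # occupied tiles
print(sum(len(v) for v in regions.values()))  # 10000 -- each point lands in exactly one tile
```

At production scale the same idea lets billions of points be cached as many smaller files rather than one monolithic simulation, which is the data-management win Vickery describes.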

The technical aspects of the process are groundbreaking, exactly what audiences and followers of ILM have come to expect over half a century of innovation, but the glue that seals the effects to the physical action requires an artist’s touch.

“Beyond the technical side of it, there’s the visual artistry of being able to blend and match the look of water in Thailand, where it’s slightly greenish-tinted water where you can see through to the rocks and the coral beneath the surface, or the slightly deeper, bluer waters of Malta,” explains Vickery. “And then being able to make sure we’re matching all the different lighting conditions that we had throughout the time we were shooting in Thailand and Malta, on the tank, and on the stage, and all the while trying to live up to Gareth’s standards of cinematography and lighting.

“We had John Mathieson on the show, arguably one of the finest cinematographers alive today,” Vickery adds, “so we’re trying to match our work to the best in the world, whether it’s cinematography or special effects. The artists at ILM stand toe-to-toe with all of those departments.”

(Credit: ILM & Universal).

To evoke the look and feel of the 1993 original, Gareth Edwards chose to film Jurassic World Rebirth on 35mm film using Panaflex Millennium XL2 cameras and vintage C and E Series anamorphic lenses from Panavision, closely following the equipment used by Steven Spielberg over 30 years ago. In days past, such a decision could have caused issues, but decades into the digital age, Edwards’s choice was purely aesthetic.

“I’ve gone back and forth between digital and film with the projects that I’ve worked on at ILM and elsewhere,” says Vickery. “Jurassic World: Fallen Kingdom was fully digital, Avatar: The Way of Water [2022] was digital, Mission: Impossible – Rogue Nation [2015] was shot on film, and this Jurassic was shot on film, so I don’t really have any skin in the game as to which one I prefer because it’s the difference between painting on a wall or painting on a canvas. There’s a texture that’s unique to film, which I really enjoy. There’s the grain, the emulsion, the chromatic aberration, the distortion, the shallow depth of field. It helps you bed the visual effects into something that feels real. Gareth wanted the aesthetic of the original Jurassic Park, not the narrative or the characters, just the aesthetic. He wanted it to be as if Universal had gone into their film archives and found something they shot 30 years ago. It felt quite nostalgic at times.”

With a career spanning well over two decades and a role as creative director of ILM’s Mumbai studio, one could be forgiven for thinking David Vickery knows all there is to know in his field. But in an arena built on innovation and creativity, he continues to learn from others and add that knowledge to his own, including from Rebirth’s director.

“Gareth would say, ‘Don’t be afraid to try new things,’” the visual effects artist concludes. “When we started designing these creatures, his remit was to do little pencil sketches, so if you give him 13 ideas for a dinosaur, he’d be upset if seven of them weren’t so entirely stupid that we couldn’t use them, because we hadn’t pushed the envelope far enough. He didn’t want 13 really safe ideas because we would look back and think, ‘What if we’d pushed it a bit harder?’ It’s much easier to dial back something crazy and make it truly excellent than it is to force something average to be ‘good enough.’ The flip side of that is, oftentimes on a film you’ll come up with an idea and you push it harder and harder, and you try and try, but it doesn’t work, so it’s also about knowing when you should tear it up and start again. I feel like I learned a lot from Gareth.”

Vickery pauses for a moment. “He did an amazing job on Rogue One, so if he ever does another Star Wars, I’m in.”

(Credit: ILM & Universal).

Jurassic World Rebirth is available to stream on Peacock beginning October 30, 2025.

Mark Newbold has contributed to Star Wars Insider magazine since 2006, is a 4-time Star Wars Celebration stage host, avid podcaster, and the Editor-in-Chief of FanthaTracks.com. Online since 1996. You can find this Hoopy frood online @Prefect_Timing.

ILM.com is showcasing artwork specially chosen by members of the ILM Art Department. In this installment of a continuing series, three artists from the San Francisco and Vancouver studios share insights about their work on the 2025 Netflix production, The Electric State.

Supervising Art Director Fred Palacio

The primary goal for this piece was to expand the universe of the book by introducing a new character type of “humanoid drones.” A secondary objective was to give the drone a distinct identity — customizable much like how we personalize our phones. 

We started with some existing designs but needed to push further to keep the spirit of author Simon Stalenhag’s work. Take Wolfe, for example: it features numerous subtle details and paint choices that lean into a biker aesthetic, ranging from purely decorative elements to practical tools. Its unique color scheme further pushes this visual language. In contrast, the Security drone is designed to feel more industrial and less customizable. One of the most challenging aspects was the head design, which needed to resonate with the book’s visual style.

Designs should always speak to the client and audience alike, while staying true to a consistent visual language. The design must strike a balance between distinctiveness — so the character can be recognizable in any lighting situation — and coherence with the established universe. Struggle is part of the challenge, and we love it. Our job is to embrace it and transform it into an achievement, where a concept piece becomes a living character on screen.

Senior Art Director Alex Jaeger

These were the outcast Scav bots. They were also an attempt to bring back the creepier, more somber tone of Simon Stalenhag’s artwork. The goal for these was to design a series of bots that were “collectors.” They each had a theme, and that theme is what drove the design and personality of each one.

The process for creating these was to first block out some overall shapes for the proportions, so that they would read in the shadows, since these would mostly be seen at night. Second was to block them out in 3D and begin the themes of each, adding more complexity with each pass. Then came the balance of tone. I created cleaner, brighter versions, then dirty and desaturated versions, then darker, moody versions. The end product was a mix, with the colors only coming through brighter around the heads to draw the eye, and the rest faded off. There were also several passes for the amount of cables and adjustments of scale. This final version used the midway level of cables and the adjusted scale.

When it came time to do the dirty textures, I found some great inspiration in abandoned theme parks, seeing how some painted graphics faded while others almost got better with age and gathered great patinas, even over brighter colors.

Senior Concept Artist Kouji Tajima

They are two robots from the mall, named Tacobot and Pianobot. They were probably robots that worked in the mall’s restaurants or music stores. The biggest challenge was figuring out how to get the style as close as possible to Simon Stalenhag’s. To do that, I studied the kinds of parts and cables he uses in his paintings.

When I was designing Tacobot, I studied the textures of a lot of advertising figures and signs from real taco and hamburger shops. Because of that, I gave the taco ingredients on its body — like the vegetables — a painted look instead of a realistic texture.

Normally, I don’t do any sketches. I go straight into 3D software and develop a few different ideas. In addition to this version, there was also one with a face and a mustache in the middle of the taco!

See the complete gallery of concept art from The Electric State here on ILM.com.

Learn more about the ILM Art Department.

Watch The Electric State on Netflix.

Drew Struzan’s art became part of filmmaking mythology, from Star Wars posters to ILM’s classic emblem.

Drew Struzan distilled movie magic into a single iconic image, often the one audiences saw first. The celebrated creator of hundreds of one-sheet movie posters that blended classic portraiture with cinematic montage passed away October 13 at age 78.

For decades, the renowned artist and Industrial Light & Magic moved in the same creative orbit — ILM conjuring the visual effects that brought cinematic worlds to life, and Struzan capturing their spirit in paint, in turn inspiring generations of ILM artists.

Struzan was instrumental in helping define ILM’s early identity. Working from a design by ILM matte painter Michael Pangrazio, Struzan hand-painted the company’s first logo, depicting a tuxedoed magician conjuring a spark of light, framed by a large gear bearing the letters “ILM.” Struzan’s painting was a perfect visual metaphor for what ILM represented: the fusion of artistry and technology, imagination and precision.

The font style used in ILM’s current logo, revealed in 2023, was closely inspired by the typography in Struzan’s painting.

Struzan’s logo became shorthand for creative excellence. It appeared on letterhead, production slates, and crew gear, signaling that the work within carried ILM’s signature blend of craft and wonder.


His association with Lucasfilm began in 1978, when he collaborated with Charles White III on a special re-release poster for Star Wars: A New Hope. The “circus-style” artwork — with its layered texture and weathered look — became a favorite among fans and collectors, launching Struzan into a long association with Lucasfilm and, by extension, ILM.

Over the next three decades, Struzan’s brush defined the visual identity of films that spanned the Star Wars and Indiana Jones series, Back to the Future, E.T. the Extra-Terrestrial, and many more. His posters for Star Wars entries The Phantom Menace, Attack of the Clones, and Revenge of the Sith threaded prophecy and tragedy through glowing color and emotional expression. Each piece was created by hand: sketches on gessoed boards, acrylics and airbrush for texture, and Prismacolor pencils for final detail.

In the same way his movie posters gave films an emotional face, Struzan’s ILM logo gave the company one — a timeless emblem for the artists who turned imagination into illusion.

Read more at Lucasfilm.com.