
Former Industrial Light & Magic artists join ILM.com to reflect on bringing the pre-digital cinema classic to life.

By Clayton Sandell

ILM modelmakers at work on the Inferno. L to R: Chuck Wiley, Barbara Gallucci, Bill George, Randy Ottenberg (Credit: ILM).

During the summer of 1985, The Goonies hit movie screens and became an instant audience favorite. The timeless adventure tale follows a group of kids on a quest to discover One-Eyed Willy’s hidden pirate treasure, avoid a trio of ruthless family crooks, and save their homes (and way of life) in the “Goon Docks” of Astoria, Oregon.

While it’s not considered a massive visual effects film, part of the enduring charm of The Goonies comes from the roughly 20 shots created by Industrial Light & Magic. Forty years later, four ILM veterans share their memories of working on the celebrated classic.

ILM’s Michael McAlister was hired as the film’s visual effects supervisor, his first time in the role after working as an effects cameraman on projects including E.T. the Extra-Terrestrial (1982), Star Wars: Return of the Jedi (1983), and Indiana Jones and the Temple of Doom (1984).

Dave Carson brought extensive ILM experience to the role of visual effects art director on The Goonies, with credits including Star Wars: The Empire Strikes Back (1980), Dragonslayer (1981), and Star Trek III: The Search for Spock (1984).

The work of The Goonies matte painter and fine artist Caroleen “Jett” Green has appeared in dozens of films, including Willow (1988), Ghostbusters II (1989), and Star Wars: The Phantom Menace (1999).

Before a fruitful run as a visual effects supervisor, Bill George helped build a number of iconic models for films including Star Trek: The Motion Picture (1979), Blade Runner (1982), and Explorers (1985).

The Goonies was directed by Richard Donner (Superman [1978], Ladyhawke [1985]) from a story by Steven Spielberg and a screenplay by Chris Columbus. Frank Marshall and the now president of Lucasfilm, Kathleen Kennedy, were among the producers.

The production team conducts a location scout on the Oregon coast (Credit: ILM).

MICHAEL McALISTER, VISUAL EFFECTS SUPERVISOR: Number one, Dick Donner was such a good man. His personality was so big, and he spoke with a booming voice, and he was just confident and gentle and kind. I was really impressed with him. It was a real joy to be around him. I also had good crews at ILM, and the experience of being on location in Astoria, Oregon, which is absolutely stunningly beautiful, was delightful.

DAVE CARSON, VISUAL EFFECTS ART DIRECTOR: It had so many effects shots in the first draft. I remember being in a meeting in Burbank early in the production. I don’t think Dick Donner was even there. And we were talking about the effects. And I said, “Well, I think eventually there’ll probably be like 80 shots.” The blood drained from everybody’s faces. I could see that was not where they were headed. It still was a great project, but the number of shots kept dwindling. The first draft had skeletons that came to life. It was full of effects and fantastic stuff.

I started just by drawing scenes from the script. Nobody asked me to, but you can’t read that script without wanting to draw some of the scenes in it. J. Michael Riva was the production designer, and he was cranking out beautiful stuff. [Art director] Rick Carter made beautiful blueprints. They were establishing the look of this film, and it was great. From that point on, my actual work for the production was pretty much taking established background plates and indicating where the effects would go. There wasn’t too much pie-in-the-sky stuff. I did a bunch of storyboarding of the sequence where the kids run into the cove, and they see some skeletons and they get on the ship.

Concept art by Dave Carson depicts the unfinished sequence when the Goonies are attacked by a giant octopus (Credit: ILM).

The ILM Model Shop built a highly detailed scale version of One-Eyed Willy’s sailing ship, the Inferno. Under the supervision of Barbara Gallucci, Bill George led a model-making team that included Randy Ottenberg and Chuck Wiley. ILM had plenty of previous experience with model spaceships, but building a wooden pirate galleon was something the crew had to learn from scratch.

BILL GEORGE, CHIEF MODELMAKER: I was really happy to be put on the project leading the construction of the miniature pirate ship. We wanted to do a good job and do something impressive that would get people talking. We put more into the model than we needed to. The production provided blueprints, which were amazing. We read books on building miniature ships and had the opportunity to do research and learn. We went to San Francisco Bay to study the Balclutha, which is a vintage wooden sailing ship. We studied all the details, the belaying pins, the rigging, the wood texture and wear. We wanted our model to look as authentic as possible.

We started with stanchions, very much the way you would build a boat. Those were covered in thin sheets of balsa wood. One of the big technical challenges on this was the rigging and the sails. Randy’s main focus was the sails. And, of course, there were no computer graphics that were advanced enough to do CG sails at that point. So the decision was made to make them out of a very, very fine silk, which would blow in the wind, and the silk was also great because it was transparent and pure white. Once again, we did some research. We found that we could use coffee and tea to stain the sails so they had a little bit of a warmer, aged color without stiffening the fabric.

At the time Goonies came along, ILM had established itself as the visual effects house of choice for very successful films. Then there were all these films that Spielberg was producing, including The Goonies and Explorers and Back to the Future [1985], and all of them kind of funneled through ILM. It was a really exciting time because there was a whole diversity of interesting projects coming in.

Chief modelmaker Bill George at work on the Inferno (Credit: ILM).

MICHAEL McALISTER: It was unbelievably beautiful. But by the time the model was in the process of getting made, they decided to just go ahead and build the entire set on the soundstage. Which then meant that we didn’t need as many shots using the model.

BILL GEORGE: I was a little disappointed because we didn’t get to showcase it as much in the film. It was very backlit, and it was very far away, and I knew that the model could hold up. So it was a little bit of a disappointment. But I’m super proud of the model we built.

On deck, there’s even a little R2-D2 Easter egg. It was actually a casting from Star Wars. In the model shop, we had molds of the castings that go with the plug at the top of the X-wing starfighter. That’s what that was.

In 2023, the Inferno model was donated to the Academy Museum of Motion Pictures by Richard Donner’s widow, producer Lauren Shuler Donner.

The hidden R2-D2 figure from Star Wars tucked away on the deck of the Inferno (Credit: ILM).
Modelmaker Randy Ottenberg at work on the Inferno’s masts (Credit: ILM).

Production designer J. Michael Riva had the Inferno and a water-filled cavern built as a full-size, practical set on Stage 16 at the Burbank Studios (now Warner Bros.) in Southern California.

MICHAEL McALISTER: I’ll never forget it. It was the most impressive thing I’ve ever seen in my entire movie career, hands down. The first time I walked on the stage, here’s this full-size pirate ship. And every little glorious detail was just striking.

The director of photography, Nick McLean, was going back and forth to another stage at the same time as he was trying to light this pirate ship, and it wasn’t working out very well. He just didn’t have all that much time to be on Stage 16.

So he just turned to me and said, “Michael, light it for me,” and walked away. I was like, “Oh my God, I don’t know how to light a set!” I was freaking out because I didn’t want to come up short. I didn’t want to disappoint him, didn’t want to embarrass myself. And I remember thinking, “How would you light it if it was a miniature, and just scale it up?” So that’s what I did.

You just got thrown into something, and you had to figure it out. So Nick came back, and he looked at my lighting, and he was pretty happy. Only changed one thing. I learned something about confidence, and I learned something about lighting. It doesn’t really matter how big a thing you’re going to light. It’s all the same idea.

The ILM team made visits to the Goonies sets in Burbank to capture reference photography. Here the Inferno and surrounding cave are under construction (Credit: ILM).

DAVE CARSON: It was an amazing thing to see. One morning on the set, there were probably a dozen of us all standing around drinking coffee, and Steven Spielberg walks in and he’s looking around. We’d met a few times, but he didn’t know me all that well. He says, “So what do you think?” I said, “There’s some great shots here,” and he says, “Oh yeah? Where?” I’m thinking, “Is he kidding me?” I was just trying to be conversational. But I decided I’d just follow through. So I walk over to the island with like twelve people following Steven, and I got down, just trying to find some interesting angles. I don’t know what he made of it all.

Visual effects supervisor Michael McAlister wades in the water tank on the Inferno set (Credit: ILM).

For wide shots of the Inferno, ILM artists Frank Ordaz and Caroleen “Jett” Green created matte paintings to help complete the illusion of a tall sailing ship rising beyond the limited height of the Burbank soundstage. Chris Evans served as matte painting supervisor.

CAROLEEN “JETT” GREEN, MATTE ARTIST: They had that big ship that they shot in a way that, at the last minute, they needed to extend the masts and add sails. We had to work quickly to make it all work perfectly.

The challenge was, we didn’t have much time, and the sails of a ship needed to have fluidity, an airy quality. Our matte painting extensions were static, so lucky for us the shots of the sails were only on for a couple of seconds.

I knew how to paint something realistically. What you also learn with matte painting is how to change lighting. You need to know what goes on with light, whether it’s indoors or outdoors, how it affects everything. If there’s a blue haze that’s moving in the shot, I might add some carefully mixed blue paint to match. It all got combined together.

I was an apprentice matte painter, learning the techniques and skills in order to become a great matte painter. I was working in a room with highly creative people, all excellent at what they do. I really wanted to keep up with these guys. And I told myself, well, “I’m just going to put in 150%.”

Another ILM contribution includes what might be considered an early example of a so-called “invisible effect.” Searching for their next clue, Mikey (Sean Astin) lines up a doubloon with cutouts to match rocks and a lighthouse in the distance. What appears to be a practical shot is actually a mix of multiple blue screen elements, background plates, and matte paintings. A complex rack focus helped complete the illusion.

DAVE CARSON: I remember the challenge at the time on the doubloon shot was they wanted the doubloon in focus and crisp up close. That means anything in the distance is going to be soft. So they had to pull off the rack focus in post-production.

MICHAEL McALISTER: One of the reasons that the shot was never attempted on set is because the rocks in the ocean didn’t exist. And they certainly didn’t exist to line up with the doubloon. So, based on that criteria, it automatically became a visual effect. And dealing with the rack focus was very challenging during that time because it was all optical printer composites, and you didn’t get good mattes out of blurry edges in the optical process. Today, it’s not an issue with all the CG capabilities and the compositing software, but it was a challenge at the time to get that right.

A storyboard by Dave Carson (Credit: ILM).

The organ chamber sequence – in which an incorrectly played musical note causes part of the floor to fall away and reveal a treacherous cavern below – was achieved using five different matte paintings and a 16-by-20-foot miniature set featuring stalactites, pools of water, and fog. The original set was scaled down in size during pre-production, posing a challenge for creating the critical illusion.


MICHAEL McALISTER: The concept was supposed to be something that instantly communicated absolute death if you fell down there. That was one of the hardest things I’ve actually ever done in my career, creatively. And to this day, I’m not really happy with what that image communicates because it didn’t look like instant death to me. Richard [Donner] and [Steven] Spielberg didn’t ever complain to me about it, but I wasn’t really happy with that. It was supposed to be all misty and foggy, which made the lighting so diffuse that it was just really hard.

The ILM camera crew prepares to shoot the miniature from the ground up. A mirror was used for reference while standing (Credit: ILM).

Four decades later, The Goonies continues to be treasured by fans young and old. In 2017, the Library of Congress added the title to the National Film Registry, which honors movies with cultural, historical, or aesthetic significance.

DAVE CARSON: It’s so funny. Of all the films I’ve worked on, when people find out I worked on The Goonies, a lot of times that’s the one that they’re impressed by. “Oh, you worked on The Goonies? I love that movie!” Yeah, it’s still a very popular film.

BILL GEORGE: The story reminded me of when I was a kid with my buddies, and we were looking for adventure on the street, throwing dirt clods, that kind of stuff. It really captured the essence of that in a really magical way. And I think for kids that age, they’re like, “Hey, let’s make this happen. Let’s find the treasure.” Goonies have a special place in our hearts.

CAROLEEN ‘JETT’ GREEN: We were all seriously into what we were doing: matte painting.

I considered many of the artists geniuses. Just a brilliant group of creatives. We would start painting at around 10 o’clock in the morning and go into the zone of silence for hours. Then we’d come up for air at the same time, lunchtime or later. At times, I would even stay until sunrise. 

MICHAEL McALISTER: It is meaningful to me that there are a few films that I’ve worked on that are classics and will always be remembered. During The Goonies, I had a hunch about it because every kid dreams about finding a pirate ship and a pot of gold. I can’t take any credit for the fact that these movies have such legacies, but it’s nice to have been involved with a movie that made such a dent and endures.

When I first walked the halls of ILM, I realized I was walking among the best in the world at what they do. It was just such a privilege to be in that company, in the company of those artists, that level of creativity and expertise for so many years.

A doodle by an ILM crew member on the Inferno model during its construction (Credit: ILM).

Clayton Sandell is a Star Wars author and enthusiast, Celebration stage host, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell), Bluesky (@claytonsandell.com), or X (@Clayton_Sandell).

The ILM visual effects supervisor speaks on ILM’s contributions to the blockbuster film that brought Marvel’s First Family into the Marvel Cinematic Universe.

By Jay Stobie

(Credit: ILM & Marvel).

Marvel Studios’ The Fantastic Four: First Steps (2025) transports audiences to the Marvel Cinematic Universe’s Earth-828, where Reed Richards (Pedro Pascal), Sue Storm (Vanessa Kirby), Johnny Storm (Joseph Quinn), and Ben Grimm (Ebon Moss-Bachrach) must prevent Galactus (Ralph Ineson) and his herald Shalla-Bal (Julia Garner) from destroying their entire planet. Directed by Matt Shakman, whose acclaimed credits include helming episodes of the long-running comedy series It’s Always Sunny in Philadelphia (2005-Present) and the mystical Disney+ hit WandaVision (2021), The Fantastic Four leans into a retro-futuristic aesthetic that blends 1960s-inspired designs with out-of-this-world technologies.

With this innovative endeavor in mind, the filmmakers called upon Industrial Light & Magic and its accompanying half-century of visual effects expertise to help execute Shakman’s vision, with a particular focus on The Thing, Galactus, the climactic third act battle in New York City, and more. Daniele Bigi (Ready Player One [2018], Star Wars: The Rise of Skywalker [2019], Eternals [2021]), who served as the ILM visual effects supervisor on The Fantastic Four, sat down with ILM.com to discuss the company’s numerous contributions to the project, from devising a fresh approach for portraying The Thing’s rocky features to constructing Earth-828’s distinctive New York City skyline.

An ILM Overview

As the ILM visual effects supervisor on The Fantastic Four, Bigi spearheaded ILM’s involvement on the project from the company’s London studio, working closely with invaluable colleagues like ILM animation supervisor Kiel Figgins and ILM senior visual effects producer Claudia Lecaros. “In this case, ILM didn’t split the work between multiple ILM facilities, so my team ended up keeping all the asset and shot work in London. We were assigned the major task of handling the third act of the movie, which centered on the final battle between the Fantastic Four and Galactus,” Bigi tells ILM.com. “Although it’s divided into multiple sequences, the third act is a continuous narrative from Galactus’s arrival on Earth through the end of the film. It was a fascinating and important piece of work to deal with.”

ILM’s assignment included devising an innovative look for Ben Grimm’s iconic alter ego, The Thing. “We did all of the initial development with [production visual effects supervisor] Scott Stokdyk and [visual effects producer] Lisa Marra from Marvel, in collaboration with [head of visual development] Ryan Meinerding. Ryan provided us with the concept for The Thing, which is what we based our work on,” Bigi relays. As the leading vendor for The Thing, ILM developed the entire character and then distributed the asset to the film’s other visual effects vendors for their own sequences.

(Credit: ILM & Marvel).

“After the initial development of The Thing, we were assigned another prominent character to build. Since ILM had several shots in which Mister Fantastic stretched his body and used his ability in an extreme way during the final battle, ILM ended up leading the look development of Reed Richards, too,” Bigi explains. In January 2025, ILM’s success with these character creations prompted Matt Shakman to task Bigi’s team with crafting the Fantastic Four’s immense nemesis, Galactus.

“Another big component to ILM’s work was the development of New York City, which was an imaginary version of it based on Marvel concept art,” Bigi continues. “Roughly 90% of the New York City shots were done in computer graphics by ILM. It’s a 1960s futuristic New York, and while certain aspects appear exactly like our New York, there are many buildings and stylistic elements that reflect both 1960s and futuristic designs. A large section of the city, including Times Square, was ingested from Sony Pictures Imageworks, with which ILM collaborated closely to combine different city blocks into a unified layout with a matching style, color palette, and overall look.” Most of the city set-up was handled by environment supervisor Stacie Hawdon and CG supervisor Tobias Keip at ILM’s London studio. In total, Bigi estimates that ILM contributed between 350 and 380 shots to The Fantastic Four.

Thinking the Thing Through

“At ILM, we aimed to deliver on Matt Shakman’s vision by dramatically changing what had been done with The Thing in the past. We sought to create the most believable, realistic performance that would respect Jack Kirby’s original design, from the size of the rocks to the very specific rock formation of The Thing’s brow,” Bigi shares. Animating facial expressions for a character whose face is composed of rock proved to be a considerable challenge. “We explored different options, but I always wanted to keep the rocks as rigid as possible. If we started to squash and stretch them, The Thing would resemble what was done in the past with plastic material and foam prosthetics.”

(Credit: ILM & Marvel).

Leaning into The Thing’s bouldery frame, Bigi’s team created small, undefined gaps between the rocks. “Depending on the expression, we could move the rocks in these minuscule spaces. Additionally, we allowed the rocks to gently stretch in areas that were invisible to the camera, giving us larger gaps that let us keep the rest of the rocks completely rigid.” ILM employed another sophisticated technique for The Thing’s face and body, running an effects simulation on the rocks rather than dealing with geometric skinning. Bigi praises FX and creature technical director Maybrit Bulla, who used Houdini to create a custom setup to control the collision between the rocks. “We used our blend shape technology to move the underlying surface, but there are rocks on top of it that are actually colliding. They push each other and land in a natural position. In some shots, we had to guide the simulation in an artistic manner to avoid having rocks go into unwanted territory and seem weird or strange. The process is something new that we developed for this movie.”

Ebon Moss-Bachrach as The Thing (Credit: ILM & Marvel).

When it came to actor Ebon Moss-Bachrach’s performance capture for The Thing, ILM referenced the work-in-progress geometry data from Digital Domain (another effects vendor on the film). “The data was useful for the initial stages and the blocking animation, but when we started to go into the minutiae with Scott Stokdyk and Matt Shakman, we ultimately worked on our own system and reanimated the character for our final animation,” Bigi details, crediting CG supervisor Marco Carboni for developing a workflow to quickly ingest data from Digital Domain and transfer it to ILM’s proprietary facial rig.

Rules for Reed Richards

Alongside Shakman, ILM outlined clear guidelines for Reed Richards’s capabilities as Mister Fantastic. “Matt was keen to avoid creating what we called a ‘noodles’ or ‘spaghetti’ feeling. How we controlled the stretch was unique and based on Matt’s vision,” Bigi recalls. “Instead of developing the character for months and then realizing that it didn’t behave in the right way, I proposed exploring various 3D action poses with extreme body stretch from several angles. Matt was incredibly receptive to the notion of rendering these static frames before having a functional rig or muscle simulation for the animator to use.”

Setting rules for Mister Fantastic became essential to ILM’s process. “What can Reed do? Do we want to stretch the neck, or don’t we? We decided not to, so there’s not a single shot where you see the neck stretching a lot,” Bigi notes. “We established a rule that only Reed’s limbs would stretch, meaning his upper torso and shoulders would remain the same width as the actor’s. Another rule dealt with his bone structure. While stretching, his elbows and knees would be more defined, the idea being that the skin was getting thin and wrapping around the bone. This was all discussed with Matt and Scott and developed in the initial stage where we did our 3D maquette action poses.”

(Credit: ILM & Marvel).

Bigi took inspiration directly from Marvel’s comic books, as well. “Many comic book artists before us, in particular Alex Ross, maintained a very strong V-shape when portraying Reed’s upper body. So, in the ILM shots where Reed is stretching, we kept the lat muscles on his body fairly large, like an athlete or swimmer,” Bigi declares. “We also decided Reed would snap his limbs back to a natural pose relatively quickly. The thought was that it wasn’t easy for Reed to stretch, so he would only do so on important occasions. He doesn’t do it for fun, at least in this movie.”

While Reed’s arms and legs stretch extensively, Bigi points to another key decision ILM made when generating the look and feel of Mister Fantastic. “The stretch of his fingers is minimal, and the gloves you see are usually the normal size as established by the practical costume designer. The concept being that, unlike the fabric close to his body, the actual fabric of the gloves didn’t need to stretch at all.”

Seeing Sue Storm

As was the case with The Thing, ILM pursued a unique path to conveying Sue Storm’s abilities in the final battle. “Rather than relying on particle simulation, all of ILM’s Sue effects were based on optical elements,” Bigi reflects. “The Sue effects were meant to be analog, in a way. There are no effects simulations of any kind. Most of those shots were crafted by our compositing team, so it’s a 2D-based approach using references of how lenses naturally create refraction and color variation. You see that we enhanced and exaggerated the prismatic fringes that occur with specific types of lenses.

(Credit: ILM & Marvel).

“Although this route was simple in a technological sense, it was nevertheless quite effective visually, and blended well with the atmosphere of the movie,” Bigi concludes. “Going with the latest, state-of-the-art technology is not always the answer. In this case, it was the opposite. We wanted it to feel simple and analog, so we stayed with the real optical effects. It’s all about what the director wants and the feeling you wish to convey.”

Grappling with Galactus

Unlike the challenges that ILM tackled with The Thing’s rocky features, the surface of Galactus’s face resembled the actor to a much greater extent. “We were able to use Ralph Ineson’s performance through a normal blend shape technique for Galactus’s face. Matt wanted to infuse Galactus with a god-like aspect, so he had us downplay the realistic human aspect and micromovements of the actor’s face. We reduced the range of motion and kept the face a bit firmer,” Bigi states. “For the body, we received a scan of the beautifully-constructed costume, but at the end of the day, ILM replaced it with CG in all of our shots because of its need to appear metallic.”

(Credit: ILM & Marvel).

Representing Galactus’s true scale also came into play. “We determined a specific height for Galactus, so the camera had to conform to that size. There are several shots with plate photography, but the majority was done digitally, especially due to the interaction between Galactus and the city,” Bigi reports. “Galactus’s body had to be covered with thousands of tiny lights, which couldn’t be done realistically with prosthetics, and he’s so large that the amount of detail necessary to set the scale was tremendous. We scattered literally millions of tiny pipes, greeblies, and geometric objects to increase the sense of scale. At a distance, our Galactus was the same as the costume, yet it was much more elaborate in the extreme close-ups.”

(Credit: ILM & Marvel).

ILM held conversations with Matt Shakman and Scott Stokdyk about the bridge devices that serve as a centerpiece for the climactic conflict with Galactus. “We developed an effect that we called ‘bridge effects,’” Bigi notes. “The bridge is an amazing device that – spoiler alert – Reed conceived to transport Galactus to another location in space. Because of the 1960s style of the movie, we avoided a digital quality for the portal. We found references and simulated optical effects rather than calling upon inspiration from the digital world. It was a real brainstorm with Matt and Scott. All sorts of ideas, such as having Galactus’s body stream with particles inside the bridge effects, came up in our conversations with Matt.”

A “New” New York

In preparation for depicting Earth-828’s New York City, Bigi traveled to New York for a 10-day shoot with The Fantastic Four’s second unit. “It was an amazing experience,” Bigi beams. “Based on the previs, there were certain shots we knew would be CG, but we tried to film as much as possible. Before going to New York, I used a combination of Google Earth and other digital resources to virtually scout Manhattan and propose methods to capture it from specific locations in a thorough fashion. I spent days capturing 360 HDRI panoramic views, mostly along 42nd Street, to construct a library of texture and material references. At the same time, a small team from Clear Angle Studios scanned the entire road using a LiDAR [Light Detection and Ranging] scan.”

The work continued upon Bigi’s return to London. “Initially, we took the images of New York and removed all the buildings constructed after the 1960s. It was essentially a filter that permitted us to show this version of the city to Matt and Scott,” Bigi remembers. “Then, in collaboration with [production designer] Kasra Farahani and Scott, we drew inspiration from futuristic-looking buildings elsewhere in America, such as Chicago. We selected preexisting real-world buildings that had rounded shapes and concrete bases. Another selection was done by concept artists at Marvel who had come up with original designs.

(Credit: ILM & Marvel).

“My team at ILM modeled those buildings, and we set their number and location along the street. We built several layouts and versions, gradually shaping the features of the street. That aesthetic relied on the props, as well,” Bigi asserts. “The cars and billboards resemble those from the 1960s, and we scattered spherical water tanks around the city. The phone booths aren’t based on their 1960s counterparts, as they were designed specifically for the movie. From the skyscrapers down to minute details like the color of the phone booths, everything is either a combination of real 1960s references or the artistically-driven futuristic elements that are now synonymous with the film.”

The time and talent that ILM invested in The Fantastic Four has paid off for both the artists involved in the project and audiences around the globe. Upon seeing the final cut, Bigi gravitated toward one of ILM’s shots when ranking his stand-out moments from the project, declaring, “There are several moments that I love, but for me, Galactus emerging from the water and entering Battery Park from the river is my favorite. The water simulation and the composition combine to create a wonderful shot to begin that sequence.” Applauding the work of compositing supervisor Juan Espigares Enríquez and his team, Bigi concludes, “I think it’s one of The Fantastic Four’s most exciting and spectacular moments.”

(Credit: ILM & Marvel).

Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.

New details have been unveiled from ILM’s 50th anniversary book, written by Ian Failes.


New page spreads from Industrial Light & Magic: 50 Years of Innovation were previewed at today’s Lucasfilm Publishing panel at San Diego Comic-Con. Written by Ian Failes of befores & afters, this book covers ILM’s 50-year story, from its establishment in 1975 to help create Star Wars: A New Hope (1977) to the latest stories and innovations from across the company’s five global studios.

Packed with hundreds of rare behind-the-scenes photographs and archival artwork, 50 Years of Innovation combines ILM’s distinct history of artistic and technical achievement with the inspiring stories of the people who’ve made it all possible. Dozens of both historic and newly-conducted interviews bring rich insight into ILM’s unique process that has shaped the visual effects art form and global filmmaking industry for half a century.


ILM’s story is one of equal parts change and consistency. Through constant evolutions in tools, techniques, and stories, the company’s artists and engineers have maintained their dedication to the highest standards in quality and innovation. 50 Years of Innovation sheds light on the characteristics that have empowered ILM to reach the half-century mark, and that will continue to guide the company into the next 50 years.

Update: Lucasfilm and Abrams Unveil New Spreads

Brand-new page spreads from 50 Years of Innovation have been shared by Lucasfilm and Abrams, providing an even more in-depth preview of the new book by Ian Failes, including sections covering beloved ILM productions like A New Hope, Dragonslayer (1981), E.T. the Extra-Terrestrial (1982), Die Hard 2 (1990), Terminator 2: Judgment Day (1991), Jurassic Park (1993), Mission: Impossible (1996), Pirates of the Caribbean: Dead Man’s Chest (2006), The Mandalorian (2019-23), Ant-Man and the Wasp: Quantumania (2023), and Indiana Jones and the Dial of Destiny (2023).


Industrial Light & Magic: 50 Years of Innovation arrives in early 2026, and is now available for pre-order from Abrams, Amazon, and Barnes & Noble.

To learn more about this new book directly from its author, check out this story on ILM.com.

To learn more about the Lucasfilm Publishing panel at San Diego Comic-Con, visit StarWars.com.

Watch the ILM.com Newsroom for all the latest news about Industrial Light & Magic: 50 Years of Innovation.

The ILM 50th anniversary logo alongside the cover of the new book, Industrial Light & Magic: 50 Years of Innovation by Ian Failes.

ILM visual effects supervisors Mohen Leo and Scott Pritchard, along with members of their talented crew, discuss the process behind building the TIE Avenger as it journeyed from concept to screen.

By Jay Stobie

(Credit: ILM & Lucasfilm).

For many Star Wars enthusiasts, the word “avenger” conjures up images of Captain Needa’s Imperial Star Destroyer Avenger, the TIE Avenger starfighter featured in Lucasfilm Games’s Star Wars: TIE Fighter (1994) video game, or even Marvel’s prestigious superhero collective. The second season of Andor (2022-2025) has now pushed its own TIE Avenger to the forefront of that list, as the epic series chronicled Cassian Andor’s (Diego Luna) theft of the prototype craft from a Sienar Fleet Systems test facility. Outfitted with advanced armaments and a hyperdrive, the TIE Avenger transported Cassian to Yavin 4 before playing a key role in rescuing Bix Caleen (Adria Arjona) and Wilmon Paak (Muhannad Ben Amor) from Imperial forces on Mina-Rau.

Industrial Light & Magic’s Mohen Leo, whose resume boasts projects like Ant-Man (2015), The Martian (2015), and Rogue One: A Star Wars Story (2016), served as Andor’s production visual effects supervisor for both seasons of the series, while ILM visual effects supervisor Scott Pritchard (Star Wars: The Force Awakens [2015], Avengers: Infinity War [2018], Avengers: Endgame [2019]) oversaw the creative output of the work across ILM’s global studios in London, Vancouver, and Mumbai. Leo and Pritchard gathered alongside CG supervisor Laurent Hugueniot, modeler Owen Rachel, texture artist Emma Ellul, look development artist Renato Suetake, animation supervisor Mathieu Vig, and compositing supervisor Claudio Bassi to chart the TIE Avenger’s course from conceptualization to the completed sequences seen in season two.

Constructing the Concept

The TIE Avenger prototype is first unveiled at the beginning of season two, resting in its Sienar hangar bay before being commandeered by Cassian and embarking on a dramatic escape. “The idea for the opening sequence began with [showrunner] Tony Gilroy wanting to start season two off with a big, classic Star Wars action sequence,” Mohen Leo tells ILM.com. “That initially came out of an outline that Tony gave us in 2022. Early on, a big story point became that Cassian doesn’t know how to fly it, so the Avenger had to have completely unfamiliar controls, and the interior had to look different from any TIE fighter or ship that you’ve ever seen before.”

When it came to the Avenger’s look and layout, Leo worked closely with production designer Luke Hull. “Luke explored various prototype airplanes, and then we played around with the idea of what the ship needed to do in terms of the chase sequence. We wanted something that wasn’t just a dogfight. If he immediately jumps in and it’s just a chase, it’d be difficult to do something original with that,” adds Leo. “Luke had already decided to build a full-sized practical TIE Avenger. As far as its physical construction, the wings were inspired a bit by the TIE interceptor.” While Andor’s Avenger shares a name with the craft from the TIE Fighter game, it maintains its own design lineage. “I don’t think the previous ship was a strong influence,” Leo begins. “When we were designing our Avenger, how the ship functioned became something Luke and I reverse-engineered based on what we wanted the ship to do. That dictated the look.

(Credit: ILM & Lucasfilm).

“I put together a pitch deck for the Avenger before we got into previs when the directors weren’t even on yet,” Leo continues. “I wanted to make sure that the ship and the weaponry, in particular, were based on real weapons and felt both dangerous and aggressive.” The team based the TIE’s fold-out Gatling-like cannons on the United States military’s M61 Vulcan rotary cannon. Hull wanted the Sienar base itself to feel like a Skunk Works test facility or NASA’s Jet Propulsion Laboratory.

“It was a back-and-forth between previs and Luke Hull and the art department in terms of the ship’s design,” Leo notes. “We’d say, ‘We need guns that fold out,’ and then Luke would go, ‘Let me see where we can fit those in.’ It was almost as much driven by the necessity of the functionality as it was by the aesthetics.” Once the additional armaments, including external launchers and a powerful cannon below the cockpit, were set, the team pitched the action sequence to Tony Gilroy. “We blocked the whole sequence with a temp model, and Tony generally really liked it.”

Assembling the Avenger

“As the studio-side visual effects supervisor, I was involved in pre-production through to the end,” Leo shares. “I also guided the previs development with The Third Floor’s Jennifer Kitching and collaborated with Luke Hull on how we would make the practical build service what we needed to do in the visual effects sequence.” The creative dialogue between the ILM visual effects team and the production designer was vital in ensuring that the computer graphics (CG) starfighter built by ILM would be identical to the full-sized practical ship that was constructed to be used on the Sienar hangar, Yavin 4, and Mina-Rau sets.

“Because we had a practical version of the Avenger during the shoot, we were able to scan that and provide lots of references,” Leo details. As such, the full-scale Avenger proved beneficial for Owen Rachel, the ILM modeler responsible for building the CG Avenger. “My job was to take [the practical model] and replicate it as a digital version. There wasn’t much design work that we had to do other than replace some structural bits, like the Gatling guns as they come down,” Rachel outlines. “We did have to create the laser cannon that comes out from underneath the cockpit. It’s an intricate design because it’s both delicate and powerful at the same time. It felt a bit like the inside of a watch,” recalls CG supervisor Laurent Hugueniot, who was in charge of the team’s 3D output.

The practical TIE Avenger prop on set at Pinewood Studios (Credit: ILM & Lucasfilm).

Texture artist Emma Ellul tackled the job of texturing the TIE Avenger. “We had great reference images, which was super helpful. I tried to focus on real-life objects, too, such as stealth planes. They’re quite smooth and angular, and I’d see how specularity affects the metal. Not every sheet of metal is made the same, so it has a slightly different bend or warping to it. I incorporated that, especially on the paneling on the outside of the wings,” Ellul relays. “There were a lot of nooks and crannies to look at and a lot of small decals everywhere, which I had to match one-to-one with the practical model. It was an awesome asset to texture. Who doesn’t want to texture a TIE fighter? And then I had to destroy it and chuck laser blasts all over it [laughs].”

Look development on the Avenger was handled by Renato Suetake, who asserts, “As a look dev artist, I get the model and textures so I can put them together and make the shaders. I make sure the shaders and materials react precisely like the reference we have in any situation or lighting condition. Certain shots jumped between the prop filmed on set and the CG version, so the digital Avenger had to be identical. At the same time, because the prop wasn’t made of metal, we still had to make it believable as an actual spaceship that flies.” From the Avenger’s weapons to the hangar explosions to the collapsing ice arch, Leo also credits the effects artists who contributed to the sequence, revealing, “In general, all of those things are just massive, complex simulation work.”

Pairing Computer Graphics and Practical Effects

As Andor’s ILM visual effects supervisor, Scott Pritchard helmed his team at ILM’s London studio, while coordinating the work at ILM’s studios in Vancouver and Mumbai. When it came to the Avenger’s breakout from the Sienar hangar, Pritchard observes that the production sought to use the “best tool for the job,” often pairing ILM’s CG expertise alongside special effects supervisor Luke Murphy’s practical effects. Highlighting a shot where an Imperial range trooper takes aim at the Avenger, Pritchard beams, “There’s a huge staged explosion along the back wall that was done by hanging a line of charges. They explode in sequence, so they explode outwards from the center. That in itself is impressive because it gives us some great visual reference to go on and actual practical elements to incorporate into the final comp.

(Credit: ILM & Lucasfilm).

“There’s a significant amount of work involved in painting out the actual charges and all the little fragments that get blown off properly, as well. It gives you such a great base to work off when you’re putting together a shot like that,” continues Pritchard, who then shifts focus to the Avenger’s weapons blasting through the hangar. “A lot of these explosions are practical, but we’ve enhanced them by adding sparks and additional explosions in CG. Making all this work seamlessly is a great testament to the comp and effects team.”

Compositing supervisor Claudio Bassi concurs, believing that the practical effects supplied ILM with valuable reference for lighting purposes. Bassi also states that, despite the presence of the practical Avenger, the TIE’s wings during Niya’s (Rachelle Diedericks) inspection are actually CG. “The hangar set didn’t have a roof, so we often replaced the wings as it was easier to integrate the CG roof.” Although the hangar set was extremely large, Pritchard highlights the fact that ILM had to extend the hangar to an even greater width, and the entirety of the front section that opens toward the snowy exterior is also CG.

Maintaining a Match

Of course, having both a practical and digital Avenger presented its own challenges when it came to ensuring that the details matched, particularly regarding how much damage the CG version sustained at Sienar in comparison to the practical model that was filmed on the Yavin 4 set. “Working with the art department, we knew that the Avenger had gone through the dogfight at Sienar base and should have scorch marks where lasers had hit it,” Leo remembers. “We counted the number of rockets it fired in the first sequence because there’s no way for him to restock in space. We took four specific missiles off the practical build, which we then reversed in digital effects to choose the four that he fires in the opening sequence.”

Hugueniot emphasizes that the same held true for the havoc wrought upon the Sienar hangar itself, commenting, “Not all shots are worked on one after another in story order. There’s a big job of keeping track of what’s going on in every shot. All the scorch marks on the walls, everything that’s been knocked down from the ceiling, and which lights are working in each shot. That was quite a job [laughs].” Bassi agrees, divulging, “We kept track of the marks where the Avenger scratched the floor and which lights broke, and we had a system to recognize them in shot order.”

(Credit: ILM & Lucasfilm).

That level of realism was reflected in animation supervisor Mathieu Vig’s mission to make the Avenger look “heavy and dangerous, and as if it’s made of metal, not just pixels.” Having the Avenger scrape along the deck helped achieve this. “We’re used to seeing them [TIE fighters] flying very gracefully,” Vig explains. “Usually, we don’t animate them bumping around, so setting the weight is harder than you might think. In a hangar setting, there are so many physical elements to consider, such as the actor in the cockpit doing specific movements that we have to take into account. All of this is a carefully interlocking puzzle.”

Diving Into the Details

While analyzing the projectiles under the practical Avenger’s wings to model them for its digital counterpart, Owen Rachel recognized an intriguing connection. “When we were trying to work out how the missiles fire, we realized the design on the set was similar to those seen in Star Wars: A New Hope [1977], as they go into the exhaust port on the Death Star,” conveys Rachel. As it turns out, the design was a one-to-one match, so Rachel subsequently modeled the Avenger’s projectiles after the proton torpedoes that Luke Skywalker (Mark Hamill) fired at the first Death Star. Effort was also invested in preventing the Avenger’s Gatling-style spray of laser fire from appearing as though it simply hovered in mid-air. “We gave them a bit of an offset and some randomness in both their position in the stream and in their x- and y-axis, so we could get more chaos into that stream,” Pritchard elaborates.

Even the red light that flashes along with the hangar’s klaxon alarm was not nearly as simple as one might assume. ILM had to maintain a perfect rhythm between the flashing lights and klaxon, occasionally analyzing where the visual effects team needed to extend or shift the red light so it all remained in sync. “Compositing is basically the last step in the visual effects pipeline that ensures that all elements are integrated and the CG matches with the plate that has been shot on set,” describes Bassi, who worked alongside Pritchard to successfully pitch the idea that the overall light energy of the hangar would get progressively darker and moodier as the Avenger knocked lights off the ceiling.

A live-action plate (top) was captured on the set with practical explosion elements, which were later integrated with ILM’s work (Credit: ILM & Lucasfilm).

Matching the interior views of the practical and CG Avenger cockpits proved to be another challenge. “ILM’s Vancouver studio did a hologram of Cassian in season one that was absolutely fantastic and so lifelike. We reused that for Cassian’s head,” Hugueniot recounts. Filming Diego Luna in the practical cockpit occurred toward the end of the shoot, as Leo clarifies, “That was the very last thing we shot on the whole project—Diego in a motion base cockpit that we could move around and rattle. I think Diego had a really good time shooting those bits [laughs].”

In Awe of the Avenger

With season two now streaming on Disney+, the visual effects team has been able to view the completed episodes and finally share their work on the TIE Avenger prototype with the world. “I couldn’t even tell which Avenger was CG and which wasn’t,” laughs Emma Ellul, referring to the Sienar sequence. “The blend between the hangar, the ship taking off, and the chaos unfolding is so seamless. It was such an exciting bit to watch.”

The audience’s overwhelmingly positive response to the opening scenes has been equally uplifting for ILM, as the sequence fulfilled what the team set out to do. “When we were talking in previs, part of the idea was that it should feel breathless. Every time Cassian has solved one problem, the next problem comes up. There should never be a moment for him to relax until it’s over,” says Mohen Leo, who praises the sound design provided by Skywalker Sound. “One of the things that always happens after we’re done but makes such an impact is the sound. The sound that Skywalker put to the engines, weapons, and all of that makes such a huge difference.”

The TIE Avenger’s action-packed escape consists of a relatively small amount of screen time—but as Leo and his team have outlined, ILM imbued each facet of the Avenger and its accompanying environments with an extraordinary amount of time, energy, and expertise. So, the next time you rewatch Andor, don’t be afraid to press pause amidst the thrilling moments and soak up the work that allowed the Avenger to ascend to the stars.

Discover more about the visual effects of Andor on ILM.com:

“Like Eating an Elephant One Bite at a Time”: TJ Falls and Mohen Leo on the Visual Effects of ‘Andor’ Season 2

“Let the Experts Be the Experts”: TJ Falls and Mohen Leo on the Visual Effects of ‘Andor’ Season 2

Read more about the making of Andor on StarWars.com.


In part one of a two-part story, the production’s visual effects producer and visual effects supervisor discuss the effort to create over 4,000 effects shots for the Emmy-nominated Lucasfilm series.

By Mark Newbold

“It was a good opportunity to expand our horizons,” says TJ Falls, vice president of visual effects at Lucasfilm, about the team’s work to create a grounded aesthetic for both seasons of Andor (2022-25). After Rogue One: A Star Wars Story (2016) established the tone for the adventures of Cassian Andor (Diego Luna), the Andor production opted to utilize a number of existing locations for filming in the United Kingdom and around the world. It was a tactic previous Star Wars productions also chose (for example, 1999’s Star Wars: The Phantom Menace traveled to Italy and the Caserta Palace for the interior of the Theed Palace on Naboo), but integrating these locations to such a degree was something new for Industrial Light & Magic, a choice Falls appreciates.

“It allowed us to go out in the world and find a real base reference,” explains Falls, who was also the overall visual effects producer for Andor. “That was something the team worked hard to capture. We’re actually there in the city or in the mountains, so it was wonderful to be able to tie real-world locations into our digital work.”

The debut season of Andor leaned heavily into this physical integration. But, with a very real-world, global pandemic happening around the production, season one had its international travel wings clipped, as Falls explains.

“We couldn’t travel, but we still managed to gather reference material, including some for the ship-breaking yards on Ferrix. For season two, we were fortunate enough to finally be able to travel, so we flew to Lake Como and the Italian Alps to capture plates for Ghorman, among other locations.”

The Mothma estate on Chandrila utilized aerial plates shot in Spain (Credit: ILM & Lucasfilm).

Joining Falls, production visual effects supervisor Mohen Leo picks up the conversation.

“Being able to travel to Spain for a variety of locations on season two allowed [production designer and executive producer] Luke Hull to rely much more heavily on the look of existing locations that were compatible, particularly the Senate building. Once we did the first location scout at the City of Arts and Sciences in Valencia, we were looking around, thinking, ‘Wow, it looks like Coruscant already.’ That made a huge difference, having that basis, both for interior and exterior spaces, so we could then use visual effects to build on and make it feel like Star Wars.”

The practicalities of having a ready-built set in the form of an existing building clearly had their benefits. Still, the broader task of adding visual effects presented its own challenges, as Leo explains.

“One thing I took away from the project is to push as much as possible for real locations,” he says. “Using an existing building during a shoot allows people to make informed decisions that stick, because if you have something that already looks 50%, 60%, or 70% the way you want it to, everyone has the confidence to say ‘Okay, this is the frame that we want, and we understand that we’re going to put this building in the background.’ Also, you have the composition of the lighting and the weight of the architecture, which makes it much easier, rather than having a blank canvas in post-production and then debating what it should look like.

“For example,” Leo continues, “there were the mountains around Ghorman. A couple of people from the production team and I went to Italy and did a two-day helicopter shoot. We felt strongly that even those locations where we would never actually shoot with a full crew or with actors should be based very specifically on real landscapes. That allowed us to put the Star Wars architecture in there and have that foundation.”

With the tremendous amount of work required to bring these locations to life, the balance between real locations and visual effects is a delicate one, based on story requirements, budget, and time.

“When we go location scouting, I always ask the director of photography [for season two, Damián García, Christophe Nuyens, and Mark Patten], ‘What are we keeping from the location?’” says Leo. “Because there has to be value in us being there. We were on location in Spain, and a Coruscant scene was discussed, which involved two people standing by a railing, looking out across the fictional cityscape. If we’re going to replace the whole city, then we don’t need to shoot that in Spain. If you want that view, we can shoot that back in London on a green screen set because it’s easier, and we’ll have more control over the lighting. That, for me, is the main thing: having a clear idea when you go on location of what we keep from the location, and why we are there.”

The original location plate (top) shot at the City of Arts and Sciences in Valencia, Spain opposite the final shot (bottom) with the Coruscant skyline (Credit: ILM & Lucasfilm).

The use of natural light throughout the series is even more impressive when considering the balance between physical structures and digital extensions. Bathing the action in brightness or shadow, regardless of where and how it was shot, Leo explains, is how this integration was managed.

“We work very closely with the DP on that,” says Leo. “There are scenes where people walk directly from a stage set in London onto something that’s on location in Valencia. In the context of the story, it feels like one continuous location, even though they were shot months apart in two different countries. Obviously, we take lots of photographic reference. We have the plates of the one side at hand when we’re doing the other, and we’re constantly checking to make sure things fit together. On this project, we had a plan for each of those things before we went on location and shot it. We’re not trying to force things together in post; they’re meant to go together.”

“That’s exactly it,” adds Falls. “It’s the collaboration with the DP and lighting team, but also with previs, with techvis, and knowing that we’re going from studio space to location space. We had the opportunity to plan that out very specifically, each step of the way. And what helped us succeed is that we had a plan, and we were able to push it through to the best of each department’s abilities to deliver on it.”

Having a plan is essential to any well-run production, and on a visual effects-heavy series like Andor, it’s even more vital. Managing the process requires unique skills and systems to marshal all the information and elements into one place, as Falls explains.

“You’ve got to manage all these people and figure out who’s doing what, breaking it down to what the responsibilities of each person are. You start with something that’s massive, and we start to split things up between our teams and vendors. Ghorman is primarily a Hybride sequence; we’ve got Scanline VFX dealing with Mina-Rau, and we work with [ILM visual effects supervisor] Scott Pritchard to ask how we’re going to slice up this pie.

“It’s like eating an elephant one bite at a time,” Falls adds with a smile. “That translates from the production side into post and dealing with our vendors, and it’s all about clear communication, having people that you can build a shorthand with and have trust with, and then let them do what they do and not overmanage it.”

Actor Joplin Sibtain (Brasso) atop the speeder prop rigged to a camera vehicle (top) with a final frame from Mina-Rau (bottom) (Credit: ILM & Lucasfilm).

Truly a mammoth task, but that’s just the start of it. “Then, each individual team brings their expertise to build it right back up the mountain,” Falls continues, “so that Mohen has the opportunity to have that creative outlook over everything, I make sure it’s moving at the pace that it’s supposed to and that we’re hitting our schedule and staying on budget while making sure that [creator and showrunner] Tony Gilroy is getting what he wants for his vision of the show.”

There are many unsung heroes on any production, and amongst those are the production managers (including Frédérique Dupuis and Alyssa Cabaltera from ILM and Anina Walas from Lucasfilm, among others), who, on the visual effects team, juggle countless shots and give structure to the process for both the production and the partner studios. In its completed form, Andor might appear to be a graceful swan, but under the water’s surface, its legs are furiously kicking to propel it forward, as Mohen Leo elaborates.

“The visual effects production team has to keep track of over 4,000 shots, and each one of those shots has dozens and dozens of assets, be it art and reference or photography and scans, and they have to funnel all of that to where it needs to land and then send any questions back to me in a manageable way. I answer the creative questions. The logistical and organizational work is done by a team of incredibly diligent people without whom none of this would be possible.”

Along with this beehive of activity tracking all the elements, a database system, unique to each production, needs to be put in place.

“We find on each show that you have similar tool sets and similar ways of databasing things,” Falls says, “but you have to build it around the specific challenge of the show and the personalities involved. It’s about what Mohen likes and the types of data that we’re getting in.

“You have people like [on-set visual effects supervisor] Marcus Dryden, who was on set managing that side of things. His role was specific to season two, and it worked really well, that marriage of supervision responsibilities between me and our Lucasfilm production team and our production manager, and the coordinators building the database. That worked well for Mohen to get the notes in and out and track the scans and the data, but presenting it in forms that fit the specific way we were working with our vendors on this show. It wasn’t groundbreaking, but it was specific to what we needed.”

Palmo City’s central plaza on Ghorman utilized the massive backlot at Pinewood Studios (top), and was later completed with visual effects (bottom) (Credit: ILM & Lucasfilm).

The database is set up, a system is in place, production managers have a process, and the elements are tracked as they come in. “It’s absolutely critical because it gives me the luxury to say, ‘Hey, where’s that scan from that location that we shot in that scene six months ago in Valencia?’” explains Falls. “And within 10 seconds, somebody will go, ‘Here it is.’ That shouldn’t be taken for granted because I’ve been on many shows where that can turn into an archaeological dig that can take days, or sometimes you don’t find it at all.”

With this bespoke Andor structure in place for season one, Leo could then take that and refine it even further for season two, a huge advantage, especially considering episodic television wasn’t a familiar environment for him.

“Season one was a big learning experience,” explains Leo. “I’d never done episodic television before; I’d only done movies, so dealing with that much content in such a compressed time was challenging. Also, the interaction with editorial is slightly different on episodic television. With every project, there’s an element of adjustment, but there’s also an element of learning.”

“We had the luxury of a number of production staff carrying over from season one to season two,” says Falls. “So we learned in real time and adjusted things to fit. You could port it, but it wouldn’t necessarily work as succinctly as it does when it’s crafted around the group, and for season two in particular, I felt that we ended up crafting a really great system. The team was unbelievably adept in making sure that every person got exactly what they needed as quickly as humanly possible.”

The script is the tramline for everything that ends up on-screen, but in the realm of visual effects and working with the rest of the crew, there needs to be a clear understanding of what’s required and how to do it, something that comes from the top, as Leo explains.

“When we’re planning a shoot, we sit down with the director, the cinematographer, and the assistant director and ask, ‘What are you trying to achieve, what do we need to contribute in terms of the visual effects, and how do we make sure we get what we need during the shoot?’ Then we take meticulous notes.” 

However, it doesn’t always go as smoothly as planned. “We’re staring at the monitor as they’re shooting, but then somebody drops the microphone into frame, so that’s something we have to paint out,” Leo continues. “Maybe we have to do a set extension that we didn’t expect. Then there’s a step in post-production where, along with editorial, we’re looking at the early versions of the cuts, and that’s where we do something called the statement of work, where we look at each individual shot and go, ‘Okay, here’s all of the things we need to do for this particular shot across the various disciplines in order to complete it.’”

An aerial view of the Ghorman set on the backlot (top) and final frame (bottom) (Credit: ILM & Lucasfilm).

Like all aspects of a production, visual effects come at a cost, with so many highly skilled experts putting their time and craft into a project. The team is responsible for both managing costs and ensuring that additional required effects can be covered within the allotted budget.

“There’s a constant ebb and flow of evaluation, so we work closely with editorial, seeing the working cuts,” Falls notes. “We go in with [editor] John Gilroy and they show us little pieces, and that allows the opportunity for some give and take as we evaluate things and look at shots and go, ‘Well, this is more than we had planned, or maybe there’s another sequence where they’re using less than what we had planned,’ and so there’s a little bit of horse-trading that happens.

“What we strive for,” Falls continues, “is to not say we can’t do something because it wasn’t planned. If there are 10 additional seconds needed in the show, how can we do it? Can we find a way that still delivers everything that’s needed, but also in line with the number of resources we allotted? Then, we’re back on budget, or I have to figure out how to take care of it, but we always start with what is the creative desire for the scene. How is it furthering the story? We don’t want anything that’s egregious or over the top just for the sake of being something flashy, so we have to make sure that everybody is in agreement that ‘Okay, it’s more than expected but it serves the story, it does what Tony needs, and now it’s our job to figure out how can we make it work.’ I think we did a pretty good job of that.”

Join us as we continue our conversation with TJ Falls and Mohen Leo to delve into the logistics of making Andor, the teamwork required to bring Cassian’s world to the screen, and their favorite moments from the second season.

Mark Newbold has contributed to Star Wars Insider magazine since 2006, is a 4-time Star Wars Celebration stage host, avid podcaster, and the Editor-in-Chief of FanthaTracks.com. Online since 1996. You can find this Hoopy frood online @Prefect_Timing.

Continuing a new series celebrating ILM’s 50-year legacy, featuring new interviews with ILM animation supervisors Rob Coleman, Mathieu Vig, and Stephen King.

By Jamie Benning

Ultraman and Nemi (Credit: Tsuburaya Productions & Netflix).

“ILM Evolutions” is an ILM.com exclusive series exploring a range of visual effects disciplines and examples from Industrial Light & Magic’s 50 years of innovative storytelling. Read part one of this story here.

After Rango (2011), ILM continued to focus on photoreal visual effects work, but the idea of returning to feature animation remained alive in the background. The ambition had been there for some time.

“Jim Morris [former ILM president] was always pushing for ILM to do more feature animation,” explains Rob Coleman, creative director and animation supervisor at ILM’s Sydney studio. “I remember going on senior staff retreats for years, and every year he brought it up that that was a goal for him.”

At one stage during the early 2000s, an animated Frankenstein film was in development, though it never reached production. Despite that momentum, feature animation remained secondary to ILM’s core live-action visual effects business.

When Disney acquired Lucasfilm in 2012, ILM found itself part of a larger family including not just Lucasfilm Animation, but also two giants of feature animation – Walt Disney Animation Studios and Pixar, the latter an outgrowth of a former Lucasfilm division. With such formidable in-house animation studios under the same corporate umbrella, the idea of ILM producing its own fully animated features inevitably became more complex. For the time being, ILM leaned into its core strength: pioneering visual effects work that has long been integral to live-action storytelling.

But then…

“People weren’t shooting movies,” Coleman recalls. “The pandemic opened a door.” That led to renewed interest in feature animation from partner film studios. Soon, both Ultraman: Rising (2024) and Transformers One (2024) were underway.

A Return to Feature Animation with ‘Ultraman: Rising’

For decades, ILM had been at the forefront of visual-effects-based animation, but Ultraman: Rising marked a shift – embracing stylization while maintaining strong, character-driven storytelling.

Animation supervisor Mathieu Vig notes the challenge of moving from photorealistic creatures to a more expressive, feature animation style. “That was a very interesting challenge,” he tells ILM.com. “First of all, because many were eager to go back to feature animation. But a lot of people had never worked in feature animation, me included. So that was definitely a bit of a scary enterprise after all of these photoreal creatures and characters.”

Many of the animators came from big, effects-heavy projects and initially expected Ultraman to follow suit. “I think we were all expecting the movie to be about that. And we were ready for it. Then we realized it was not about that at all,” says Vig.

Meeting directors Shannon Tindle and John Aoshima helped align the team with the film’s more emotional and grounded tone. “They put me at ease very quickly,” notes Vig. “Because I realized how caring and how clear they were about what they wanted from me as an animation supervisor. They wanted to meet everybody. To talk to the team. They were both so clear and detailed. That way, we could focus on – does the animation feel true? Does it feel rehearsed or active?”

The directors emphasized performance-based animation first and foremost, even referencing unexpected inspirations like Kramer vs. Kramer (1979) to highlight the film’s emotional depth. “Despite the kaiju-sized spectacle, Ultraman: Rising wasn’t just about action,” Vig explains. “It was a story about family, identity, and connection. We wanted and needed to have believable characters, quite subtle acting. We wanted an interesting mix of something that looks stylized but at the same time has so much heart and groundedness. The animation reviews were always about character development. There was great trust on both sides.”

Ultraman does battle with Gigantron (Credit: Tsuburaya Productions & Netflix).

One of the defining aspects of the animation ethos is attention to imperfection – the small hesitations, twitches, and unplanned gestures that make performances feel real. “We always wanted to sneak in as much as possible. A little dirt, little accidents, a little hesitation when you grab something, scratching yourself when you’re confused,” Vig says. “Sometimes it was just a little bit too clean, a little bit too perfect. And we said, ‘Here you can add some very fine little moments where you can break the perfect choreography.’” Even quiet, dialogue-driven moments are given space to breathe.

“There’s one shot in particular that I really love,” he continues, “which is when Ken and Ami are talking in the restaurant and eating the curry. One-minute shots of Ken, explaining his life to Ami, and Ami listening. And again, nothing happens, but I remember seeing the first blocking of this shot. I was kind of mesmerized by how beautifully ‘nothing happening’ was done. Obviously, it’s not ‘nothing.’ There was a story behind it, but to make that moment engrossing and entertaining was quite something.”

This drive for grounded performance often meant starting from realism, then dialing it back into a stylized world. It became a creative muscle that benefited both the film and the artists.

“We always started with realistic acting and then tried to bring it back down to a feature animation, Ultraman style,” adds Vig. “If the whole team were a classically trained feature animation team, we would have probably worked in the opposite way. I think it’s a very good exercise, and it totally benefits us for future work in visual effects realism because we all went through this process of filtering the shot back to its essence, rather than saying, ‘I’m just going to fill it up with animation.’ We’ve been spoiled. I hope we can be spoiled again. Whether it’s robots, giant kaijus, whatever else, if you have these living, breathing characters, we can do them at ILM. And we’d all love to do more.”

Ultraman: Rising wasn’t just a return to feature animation for ILM – it was a chance to apply decades of performance-focused visual effects expertise to a new kind of storytelling, and to remind themselves, and audiences, what’s possible when stylization and sincerity meet on screen.

Building an Animated Cybertron: ‘Transformers One’

For Rob Coleman, Transformers One marked both a creative opportunity and a personal return. Having previously worked as animation director on Happy Feet Two (2011) and as head of animation at Animal Logic for The LEGO Movie (2014), he was drawn back to ILM by a renewed promise: that the studio would once again pursue full-length animated storytelling alongside its groundbreaking visual effects work. “ILM was going to be doing animated features as well as visual effects,” he explains to ILM.com. “That’s what enticed me back.”

Unlike the live-action Transformers films, which blended human characters with visual effects, Transformers One is set entirely on Cybertron. The film focuses on the emotional backstory of two iconic characters, in a world without any human frame of reference.

“Director Josh Cooley made it clear from the beginning – this wasn’t part of the Michael Bay universe,” Coleman said. “It was an origin story about two friends, basically brothers, who, because of life decisions, end up on very different paths.”

A group of Autobots (Credit: Paramount).

This character-driven approach brought performance to the forefront of the animation process. ILM animation supervisor Stephen King emphasizes the importance of expressing emotional depth without relying solely on dialogue. “It was essential to Josh that the subtlety and the nonverbal acting was just as important as what they were talking about in the dialogue,” King tells ILM.com. “In order for an audience to connect to an animated character, you have to bring them to life and make the audience believe that they’re thinking.”

That philosophy extended to every aspect of the film’s design and animation style. For Coleman, making the robots believable also meant starting with their inner life, not just their external mechanics. “It was key that the audience think they were looking at sentient robots,” he notes. “We always thought about the life spark inside – the character’s soul.”

To support this, ILM developed new tools and techniques. Their facial animation system was rebuilt from the ground up, allowing animators more expressive control while maintaining the precision required for robotic characters. “We really tried to get the facial performance to be as emotional and realistic as possible,” King says, “but then going, well, how can we make it robotic? We added these little robotic movements into the eyes and treated them like camera apertures and shutters.”

“By rebuilding the facial system, it gave animators a lot more freedom to move things around,” he adds. “Transformers One was all keyframe animated. For character performances, that’s where I want to be.”

Cooley’s background at Pixar helped shape the film’s animation language, particularly in its reliance on visual storytelling and expressive silence. “Very quickly we talked about non-verbal performances, the importance of eye animation, and his desire to play the whole third act, at least in test screenings, with no sound, completely in pantomime,” Coleman recalls. “I was like, yes, yes, and yes. Okay, you and I are going to get along just fine.”

The choice to exclude human characters offered an unexpected advantage: Without the need to establish scale or interaction with live-action actors, the animators were free to define their own physical rules for the world of Cybertron.

Optimus Prime (Credit: Paramount).

“Not having humans in our movie actually was a great plus for us,” King says. “The Transformers being 24 feet tall doesn’t mean anything to the characters, because that’s just how tall they are. That’s the world that they live in.”

To make the robotic characters feel nuanced and alive, the animation team relied heavily on physical reference. The animators themselves brought an additional layer of ownership to each shot.

“One of the great things about the movie is that all the reference was done by the animators themselves,” King explains. “Every shot, animators would act themselves or they’d get someone else to act out for them – and they would be able to put those performances into the character.”

Even the mechanics of transformation – an iconic feature of the franchise – were reimagined through the lens of character logic and day-to-day function. “It was important to the director that this is like breathing for them – this is part of their day-to-day life,” says King. “So, we don’t need a five-second transformation every time. It’s what’s efficient for them, like getting on with their day.”

The result is a film that struck a chord with both critics and fans. Reviewers praised Transformers One for its emotional depth, strong character focus, and thoughtful storytelling, a refreshing change of pace for the franchise. Audiences responded just as warmly, celebrating its mix of high-octane action, humor, and heart. It is a reminder that even in a universe of sentient robots and shifting metal, the most powerful transformations happen within.

The Future of ILM Animation

With Transformers One and Ultraman: Rising showcasing ILM’s renewed investment in feature animation, the studio is now well-positioned to explore new creative territory. “There’s great interest,” Coleman says. “We’re just waiting for the right projects to land and get green-lit, but there’s certainly an appetite.”

“What this year [2024] has done with Ultraman and Transformers has really put ILM at the forefront of people’s minds,” King adds. “They’re calling cards to creators to say, ‘We can do whatever you want.’”

For Vig, the excitement lies in ILM’s ability to blend visual effects expertise with expressive storytelling. “Whether these guys are robots, giant kaiju, or something else, at the heart, if you have well-rounded, breathing characters, we can do them. And we’d all love to do more of it.”

From stop-motion animated creatures to fully animated features, Industrial Light & Magic’s journey has been one of constant reinvention and evolution. With its expanding tool kit and growing focus on animated storytelling, the studio’s influence is set to shape the next era of animation and visual effects.

ILM’s legacy in animation is secure, built on decades of innovation, artistry, and risk-taking. But the next chapter in animated storytelling is already underway, evolving frame by frame.

Learn more about the creation of Ultraman: Rising and Transformers One on Lighter Darker: The ILM Podcast.

Read more about Ultraman: Rising on ILM.com.

Check out Transformers One concept art from the ILM Art Department on ILM.com.

Read more stories from our 50th anniversary series, “ILM Evolutions”:

ILM Evolutions: Animation, From Rotoscoping to ‘Rango’

ILM Evolutions: Pushing the Boundaries of Interactive Experiences

Jamie Benning is a filmmaker, author, and podcaster with a lifelong passion for sci-fi and fantasy cinema. He hosts The Filmumentaries Podcast, featuring twice-monthly interviews with behind-the-scenes artists. Visit Filmumentaries.com or find him on X (@jamieswb) and @filmumentaries on Threads, Instagram, and Facebook.

The beloved Disney character can now interact with fans in a whole new way.

By Patrick Doyle

When Lilo & Stitch (2025) returned to the spotlight with its recent live-action release, fans were treated to more than just a nostalgic trip to Hawaii: they got to interact with Stitch himself in real life.

In a groundbreaking collaboration between Industrial Light & Magic, the Walt Disney Studios, and Skywalker Sound, audiences around the world experienced a completely new way to interact with Stitch. Fans can now go behind the scenes of this unforgettable moment with a newly released making-of video that showcases the magic behind the real-time Stitch activation.

Making Magic in Real Time

“This wasn’t just about showcasing technology,” said Alyssa Finley, executive producer of the real-time Stitch experience at ILM. “It was about deepening the connection between fans and a character they love. Seeing people dance with Stitch and ask him questions live was pure joy.”

From the moment real-time Stitch hit Disney Studios’ TikTok and Instagram accounts, it was clear something special was happening. In the days leading up to the premiere, Stitch engaged with fans in real time, offering shoutouts, surprise cameos, and plenty of chaotic dance-offs that made waves across social media.

But the fun didn’t stop there. Stitch also made an appearance at the film’s press junket, chatting (yes, chatting!) with reporters from around the globe and generating viral clips that quickly spread online.

A Blue Carpet Experience to Remember

At the Lilo & Stitch premiere in Los Angeles, fans and celebrities alike had the chance to interact with Stitch live. Whether it was asking him questions, dancing together, or witnessing his signature mischief, the experience felt spontaneous, playful and, most importantly, real.

All of this was powered by ILM’s cutting-edge performance capture and real-time animation pipeline, seamlessly integrated with support from Disney Studios and the brilliant audio minds at Skywalker Sound.

“Stitch has always held a special place in the hearts of fans around the world,” said Jason Eskin, vice president of marketing at Disney Studios. “Seeing fans light up when Stitch talked to them in real life was a reminder of why we create these moments. It was truly Disney magic, made possible through ILM’s incredible innovation.”

Bring Stitch Home

Lilo & Stitch is now available on digital and will be released on Blu-ray on Aug. 26. Whether you’re revisiting the story or experiencing it for the first time, there’s never been a better moment to fall in love with Stitch all over again.

Patrick Doyle is a senior publicity manager at Industrial Light & Magic.

New series exploring ILM’s 50-year legacy kicks off with new interviews featuring original Star Wars animator Chris Cassidy and current ILM animation supervisors Rob Coleman and Hal Hickel.

By Jamie Benning

From left: animator Peter Kuran, production coordinator Rose Duignan, director George Lucas, and animation and rotoscope supervisor Adam Beckett during production of Star Wars: A New Hope (1977) (Credit: ILM & Lucasfilm).

“ILM Evolutions” is an ILM.com exclusive series exploring a range of visual effects disciplines and highlights from Industrial Light & Magic’s first 50 years of innovative storytelling.

Animation has been woven into the DNA of Industrial Light & Magic’s story since its earliest days. From utilizing legacy techniques in Star Wars: A New Hope (1977) to the groundbreaking blend of live-action and animation in Who Framed Roger Rabbit (1988), ILM has continually redefined the possibilities of visual storytelling.

In this two-part article, we explore ILM’s journey from early work with rotoscoping, stop-motion, and go-motion to the development of sophisticated digital character animation in Jurassic Park (1993), the Star Wars prequel trilogy, and beyond. Part one focuses on the key innovations that culminated in Rango (2011), ILM’s first fully animated feature film. Part two examines how the studio expanded on these foundations in Transformers One and Ultraman: Rising (both 2024), solidifying its role as a leader not only in visual effects but also in feature animation.

Early Innovations and Handcrafted Beginnings

In 1975, as Star Wars, later retitled Star Wars: A New Hope, entered production, Industrial Light & Magic was a fledgling outfit assembled to help realize George Lucas’s ambitious vision. Animation quickly proved essential to the storytelling – Lucas’s needs were varied, including spaceship models firing laser bolts, glowing lightsaber blades, a holographic chess game, and stylized targeting displays.

To create the signature blaster bolts, California Institute of the Arts graduate Adam Beckett was hired in July 1975 to lead a small team in creating the animation and rotoscoping – including a young Peter Kuran. “I was initially shooting wedges and different colors for the laser beams and stuff like that. I was learning to use the equipment. We all were,” Kuran told The Filmumentaries Podcast.

“I actually did the first perspective beams,” said Kuran. “What was being tested was just kind of like back and forth – no perspective on it. I had suggested that we try that, and I actually got a very chilly response. So I decided to stay late one night and do a test and took it to the lab myself. It ran as a daily the next day, and [visual effects supervisor] John Dykstra liked it, so I wound up being the chief of that, at least for the time being.”

The iconic lightsaber effects were outsourced to Van Der Veer Photo Effects for the first film but later brought in-house at ILM. The process began by generating mattes from the live-action prop blades. Early experiments with retroreflective material and spinning poles proved too complex and were eventually streamlined. The mattes were rephotographed and colored frame by frame, with hues used to help audiences distinguish between each character’s weapon – blue for Obi-Wan Kenobi, red for Darth Vader – setting the look for the Star Wars saga for decades to come.

Lightsabers were created with hand-drawn animation in the original Star Wars trilogy, as seen here with Obi-Wan Kenobi (right, Alec Guinness) and Darth Vader (Bob Anderson/James Earl Jones) in Star Wars: A New Hope (Credit: ILM & Lucasfilm).

“At first, ILM didn’t have the resources to do all the opticals themselves,” animator Chris Casady tells ILM.com. “They sent shots out to Van Der Veer, Cinema Research, and Modern Film Effects. Those places were the old guard – they’d done work on Logan’s Run (1976), Soylent Green (1973), that kind of thing.

“But the goal was always to bring everything in-house,” Casady adds. “And once ILM got the optical department up and running in Van Nuys, the quality jumped. We had more control, and it just looked better.”

Beckett, as described by Casady, “was without a doubt a genius. Adam was extremely brilliant. He wanted to be able to put some of his psychedelic style into Star Wars. He thought it was almost an obligation to one-up 2001: A Space Odyssey [1968]. But Lucas wanted something more realistic.”

Casady noted Beckett’s work on the Death Star superlaser charging sequence, explaining that “Adam did a tremendous amount of work putting together that Death Star laser tunnel shot – all those rings and green things flashing down the middle. It’s built up of multiple passes, multiple exposures, multiple pieces of artwork.” The platform on which the live-action actors were standing was completely hand-drawn by Peter Kuran.

Casady added that “Adam’s signature work is the electrocution of R2-D2,” an entirely hand-drawn effect requiring precision to make the electricity feel convincing on screen.

“I really was brought in at a grunt level to make garbage mattes on the animation stand at night to free up the VistaVision cameras in the daytime,” Casady explained. “Every time they filmed the spaceship on stage … everything outside the blue is considered garbage; it’s got to be masked out. So, my job was to make this matte and block out the garbage.

“On film, my mattes fell below the threshold of black, so it became black,” Casady continues. “Famously, when the film was first released on VHS … my mattes were visible in the negative. … The audience saw my garbage mattes as irregular shapes that jumped every six or eight frames. So that’s the only time people got to see my work on the film!”

The animation team also solved another subtle but crucial challenge: making the miniature spaceship models feel more plausible in their scenes.

“There was a shot of a TIE Fighter flying past the camera, and they were concerned it looked too flat,” said Casady. “So they asked if we could paint in some reflections – highlights that would suggest the ship was catching light from the environment. It wasn’t baked into the model photography, so we had to add those glints manually, frame by frame, right onto the animation cels. Just little touches of light to make it feel like the ship belonged in that space.”

Animation and rotoscope supervisor Peter Kuran works with an animation camera during production of Star Wars: The Empire Strikes Back (1980) (Credit: Terry Chostner & ILM).

Kuran told The Filmumentaries Podcast, “I just thought that that was something that was needed.”

By the time Star Wars: Return of the Jedi (1983) came around, ILM was called on to create yet another iconic animated visual effect: Emperor Palpatine’s Force lightning. Composed of hand-drawn electrical arcs, the effect required animator Terry Windell to conjure a sense of living, dangerous energy – a visual shorthand for the raw power of the dark side. During his career, Windell brought his animation skills to Poltergeist (1982) and Ghostbusters (1984), among many others.

Though Peter Kuran had since left ILM, his company, Visual Concept Engineering, took on the painstaking task of rotoscoping each frame of the lightsaber combat between Luke and Vader. In total, 102 lightsaber shots were completed for the final film in the trilogy.

While rotoscoping and hand-drawn animation effects remained essential throughout the early 1980s, ILM was already looking ahead, seeking ways to evolve another time-honored technique: stop-motion animation.

As with the lightsabers and blaster bolts, the Emperor’s “Force lightning” in Star Wars: Return of the Jedi (1983) was also created with hand-drawn animation (Credit: ILM & Lucasfilm).

The Rise of Go-Motion

Before work began on Return of the Jedi, “Go-Motion” – a breakthrough in dimensional animation pioneered by ILM’s Dennis Muren, Phil Tippett, Stuart Ziff, and Tom St. Amand – offered a major refinement to traditional stop-motion by introducing motion blur, an effect crucial to achieving realistic movement. Unlike standard stop-motion, where models remain static during each frame’s exposure, go-motion employs stepper motors driven by a motion-control system. These motors subtly shift the puppet during the open-shutter phase, simulating the kind of motion blur found in live-action 24fps cinematography.

“The significance is that we got it working,” Ziff told Cinefex, downplaying the complexity of a system that required months of development before the first usable shot could be captured.

First explored during production on Star Wars: The Empire Strikes Back (1980) and fully realized on Dragonslayer (1981), the process eliminated the telltale staccato of conventional stop-motion. 

Ziff’s engineering expertise led to the development of a modular rig dubbed the “Dragon Mover,” which connected to the model’s limbs via rods and enabled precise, repeatable motion sequences. Tippett, St. Amand, and Ken Ralston meticulously animated both walking and flying versions of the puppet, blending mechanical precision with handcrafted nuance.

“We started off with some of the more complicated shots,” Tippett told Cinefex, recalling the weeks spent programming movement cycles before finally achieving a fluid, natural gait. This meant that the process became easier over time, a testament to the artists’ dual roles as problem solvers. The result was a new level of fluidity and realism, particularly evident in the scenes featuring the film’s dragon, Vermithrax Pejorative.

The Vermithrax Pejorative in Dragonslayer (1981) (Credit: ILM & Paramount).

Blending Animation with Live-Action: A New Frontier

ILM’s reputation for innovation took a significant leap forward with Who Framed Roger Rabbit. Directed by Robert Zemeckis, the film demanded the seamless integration of hand-drawn, cel-animated characters with live-action performances and practical on-set effects. ILM’s task was to anchor the animated characters convincingly in the real world.

Visual effects supervisor Ken Ralston oversaw the technical and creative challenges of making cartoon characters interact believably with real environments. “The animation had to exist in a real world, with real lighting, perspective, and interaction. That had never been done before at this level,” Ralston told Cinefex.

“It was great for me because I am a huge fan of those early cartoons by the great Warner Brothers directors, Tex Avery and Chuck Jones. And when that showed up with the intent that Bob [Zemeckis] wanted for it, man, that was a match made in heaven. And it was brutal, but it was great at the same time. It keeps you going. And when you see results on something that’s finally coming together, it’s a blast,” Ralston explained to The Filmumentaries Podcast.

Marking a turning point in hybrid filmmaking, they also decided to discard the traditional locked-off camera in favor of dynamic movement. To support this, ILM developed new methods to track live-action camera motion and translate it into data that animators could use to maintain consistent character positioning and perspective. “The opening camera crane shot proved to be historic. … No one had ever done a crane drop with a live-action camera and planted an animated character firmly on the ground,” Zemeckis recalled to Cinefex.

An animation cel from Who Framed Roger Rabbit (1988), created by the team supervised by Richard Williams. ILM was then responsible for compositing the animated characters with the live-action footage (Credit: ILM & Disney).

ILM and the special effects team constructed practical rigs to simulate interactions between live-action props and invisible cartoon characters. In one sequence, when Roger Rabbit turns a water faucet, a hidden mechanism releases a perfectly timed spray – a practical effect used to sell the interaction.

To match the shifting light within live-action environments, ILM tracked moving shadows and highlights, ensuring the animated characters were illuminated just like the actors. “If a light in the scene was swinging, … then the Toon characters would have to be lit in exactly the same way,” said Ralston. Animators relied on detailed lighting references to maintain visual consistency frame by frame.

Performance presented its own challenges. Bob Hoskins, cast as Eddie Valiant, was required to act opposite characters that weren’t physically present. “What I had to do was spend hours developing a technique to actually see, hallucinate, virtually to conjure these characters up,” he told Cinefex. To assist, Charles Fleischer, the voice of Roger Rabbit, wore a full Roger costume off-camera and delivered his lines live. “Although he was on the other side of the camera, I was able to talk to him as if he were right next to me. We could even ad-lib together,” Hoskins said.

After principal photography wrapped, ILM tackled the complex process of optical compositing while Richard Williams’s animation team in London produced the character animation. ILM integrated those elements into the live-action footage. “Every frame had to go through multiple passes to create tone mattes, shadow mattes, and interactive lighting effects. It wasn’t just a matter of drawing the character,” explained optical supervisor Edward Jones. “Every single frame had to be drawn, rechecked, and composited with multiple elements to make sure the animation fit seamlessly into the live-action,” Zemeckis recalled.

The result was a groundbreaking fusion of animation and visual effects that redefined the possibilities in cinematic storytelling. It was a winning combination of traditional techniques and innovation that was widely praised. The film won Best Visual Effects and a Special Achievement Award at the 1989 Academy Awards. Many saw the film as the zenith of the photochemical era, even to the extent that it was perceived as too complex to repeat.

In fact, it wasn’t until a decade later that ILM revisited this hybrid format with The Adventures of Rocky and Bullwinkle (2000), applying many of the same techniques with enhanced digital compositing tools to a new generation of animated characters.

Actor Bob Hoskins (Eddie Valiant) is suspended before a blue screen on ILM’s main stage. In this sequence, his character interacts with animated co-stars Bugs Bunny and Mickey Mouse (Credit: ILM).

When Dinosaurs Ruled the Visual Effects World

While go-motion had proven a valuable innovation throughout the 1980s, it was the advent of computer graphics (CG) character animation that truly revolutionized ILM’s approach in the 1990s. In the final year of the 1980s, ILM laid the groundwork on James Cameron’s The Abyss (1989), animating the fully CG pseudopod – a water-based, tentacle-like entity. For Cameron’s next film, Terminator 2: Judgment Day (1991), ILM once again raised the bar with the liquid metal T-1000.

It was the digital dinosaurs in Jurassic Park that marked a true turning point – not just in terms of spectacle – but as a clear signal that traditional methods like stop-motion and go-motion were being eclipsed by a new era of photorealistic CG. ILM animator Steve Williams, who had previously worked with Mark Dippé on The Abyss and Terminator 2, pushed the idea of fully computer-rendered dinosaurs further. The results were astonishing: Steven Spielberg’s action-horror hybrid delivered creatures that felt real, animals that moved and breathed, with skin that stretched and muscles that flexed.

As a veteran stop-motion animator, Phil Tippett famously quipped at the time: “I’ve just become extinct.” The line – part joke, part reality – captured the profound shift unfolding across visual effects departments. Tippett’s line was even given to the film’s Dr. Ian Malcolm, played by Jeff Goldblum.

A computer-graphics Brachiosaurus seen with live-action actors in the foreground in Jurassic Park (1993) (Credit: ILM & Universal).

By the time Jurassic Park hit screens, the industry had begun pivoting decisively toward digital techniques, a shift witnessed firsthand by animator Rob Coleman.

“There were only 6 animators at ILM for Jurassic Park,” he tells ILM.com. “It was the film that inspired me to cut my reel and send it in. … And I came in as ILM’s animator number 9 in October of ‘93 (4 months after the film’s release) when it was still very early days for computer graphics.”

To bridge the gap between stop-motion and computer animation, the team developed a hybrid technique known as the Dinosaur Input Device, or D.I.D. This setup used a dinosaur armature fitted with sensors and encoders, allowing animators to physically manipulate the model while their movements were captured and translated into digital data. The goal was to combine the skill and experience of the traditional animators with the strengths of the computer artists and technicians. While the results weren’t always ideal – much of the animation still had to be keyframed in the computer – it marked a pivotal step. The future of filmmaking was taking shape, frame by frame.
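The core idea of the D.I.D. – sampling joint encoders on a physical armature and storing each pose as a digital keyframe – can be sketched conceptually like this. This is a simplified illustration, not ILM’s actual software: the joint names are hypothetical and a faked angle formula stands in for real hardware polling.

```python
# Conceptual sketch of a Dinosaur Input Device style capture loop.
# Real hardware would report encoder angles; here we fake them for illustration.

def read_encoders(frame, joints):
    # Stand-in for polling the armature's rotary encoders once per frame.
    return {joint: (frame * 3 + i * 10) % 360 for i, joint in enumerate(joints)}

def capture_pose(frame, joints):
    """Sample every joint encoder and store the pose as one keyframe."""
    return {"frame": frame, "angles": read_encoders(frame, joints)}

# Hypothetical joint list for a dinosaur armature.
joints = ["neck", "spine", "tail", "left_leg", "right_leg"]

# Capture one second of animation at 24 frames per second.
keyframes = [capture_pose(f, joints) for f in range(24)]
print(len(keyframes))  # 24 captured poses
```

Each captured pose would then be applied to the digital model’s skeleton, with animators refining or replacing the data via conventional keyframing, as the article notes.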

Animator Tom St. Amand (left) and lead animator Randy Dutra of the Tippett Studio pose with the Dinosaur Input Device (D.I.D.) used on Jurassic Park (Credit: ILM & Tippett Studio).

The Challenge of Digital Characters: The Star Wars Prequels

Following ILM’s work in the 1990s on films like The Flintstones (1994), Casper (1995), Forrest Gump (1994), and Jumanji (1995), George Lucas was getting ready to revisit the galaxy far, far away, this time with a vision that demanded unprecedented integration of digital characters and live-action performances. The Star Wars prequels would become a proving ground for ILM’s rapidly expanding digital animation capabilities.

Leading that charge was Rob Coleman, by then an animation supervisor at ILM. He found himself tasked with something the company had never fully tackled before: nuanced, verbal performances from fully digital characters who needed to share the screen – and emotional space – with real actors.

“It was all those things, plus we didn’t have a staff that actually had spent their time learning how to do nuanced performances,” Coleman recalls. As he told director Joe Johnston for Light & Magic Season 2, it was Dragonheart (1996) that really laid the groundwork. “That was a huge leap for us. George was watching, and when he saw Dragonheart, he said, … ‘We are ready to go.’”

Draco the dragon (voiced by Sean Connery) flies towards Bowen (Dennis Quaid) in Dragonheart (1996) (Credit: ILM & Universal).

“Most of the people at ILM had been flying spaceships and doing robots and maybe having dinosaurs smash around,” Coleman adds, “but they weren’t doing verbal performances where they were to hold their own with Natalie Portman and Liam Neeson and Ewan McGregor.” And to bring multiple CG characters like Jar Jar Binks, Watto, and Sebulba to life in Star Wars: The Phantom Menace (1999), Coleman had to shift the team’s mindset. His growing team of 65 animators needed to think less like technicians and more like performers.

“We videotaped our actors so we had what they were doing physically, and we could look at them speaking to work out the lip sync. But pretty early on in Phantom Menace, I knew that I wanted to get into the subtext, not just the text. What’s going on inside the heads of the characters. If we could achieve that, we were gonna have believable performances, and the audience would have a connection with Watto, Jar Jar, Sebulba, and Boss Nass in that first film.”

The next major test came with Star Wars: Attack of the Clones (2002) and the digital resurrection of a beloved character: Yoda. Unlike Jar Jar or Watto, Yoda had already been established in the original trilogy as a practical puppet, sculpted by Stuart Freeborn and brought to life by puppeteer Frank Oz. Coleman’s team needed to preserve that legacy while updating the character with a broader range of expression.

“I went back and looked at Empire and it was nothing like I remembered because I’d grown up. It had changed what we expected,” Coleman says. “So what I was trying to achieve is what I remembered Yoda doing in terms of expressiveness and honoring how Frank moved him. Frank actually came by ILM, held up his hand, showed me the position of his fingers inside Yoda’s head. I had him pantomime some Yoda with me so I could see what he was doing.”

To ensure authenticity, Coleman and his team rigorously tested Yoda’s new digital incarnation. He recalls the moment he shared the first test with George Lucas. “There is footage of me presenting the first digital Yoda on the From Puppets to Pixels [2002] documentary. That is the real footage of me doing that, even though I asked the documentary not to shoot it. I’m so happy they did. I was really nervous, and I presented three speaking shots and three non-speaking shots on purpose because I was trying to show them that we could maintain performance without the crutch of dialogue. That was a focused decision because I knew from watching countless movies and TV, editors routinely cut to their action shots – the non-verbal reaction shot. I wanted to earn one of those, and we did.”

Jar Jar Binks (right, Ahmed Best) performs opposite Queen Amidala (Natalie Portman) in Star Wars: The Phantom Menace (1999) (Credit: ILM & Lucasfilm).

That approach paid off. One of Yoda’s most effective digital moments came not during a battle or speech but in a quiet reaction. “There’s a shot of Yoda in Palpatine’s office where Palpatine says something, Yoda’s leaving, and he turns, and he looks over his shoulder, and you can tell he doesn’t trust him,” Coleman notes. “And that’s all in facial performance, all keyframe, frame-by-frame animation. It ended up on the movie poster.”

Coleman’s work continued into Star Wars: Revenge of the Sith (2005), by which point ILM had solidified its reputation as a pioneer in digital character animation. The scope of the prequel work, in retrospect, still feels enormous to the animation director.

“I kind of got swept up in it all. Jim Morris [ILM’s general manager from 1993 to 2005] had put me forward for the role. Jim had taken me aside, and he said, ‘I think you’ve got the right temperament to work with George.’ So he sent me over … and dropped me off in London for a two-week interview with George Lucas, which I passed.”

Decades later, Coleman is reflective about the experience. Even as ILM continued to push forward in its ability to mimic life, it was paradoxically the artists themselves who felt like impostors. “Twenty-five years on, it’s kind of surreal to think back that I actually did that. I know that’s me. There are pictures of a younger me doing it. And I have all the memories, but sometimes it feels like it was someone else.”

Animation director Rob Coleman at work on The Phantom Menace (Credit: ILM).

Cursed Flesh and Living Tentacles: The Pirates Breakthrough

When Pirates of the Caribbean: The Curse of the Black Pearl (2003) set sail, ILM faced a major challenge. Bringing the cursed crew of the Black Pearl to life wasn’t just about creating convincing skeletons – it was about making them believable next to live-action characters.

Animation supervisor Hal Hickel explains to ILM.com: “It was a really complicated problem because the idea was that under moonlight these guys are skeletons, but in shadow, they’re flesh and blood.” Each shot became a complex blend of live-action photography and animation, requiring seamless transitions between the two. “You couldn’t just cut to them and show them in full skeletal form under neutral lighting,” he says. “It all had to be motivated by the lighting in the scene.”

The work paid off, but it was only the beginning. For the sequel, Pirates of the Caribbean: Dead Man’s Chest (2006), director Gore Verbinski raised the bar with Davy Jones and his crew. These characters were fully digital – and fully expected to carry the emotional weight of their scenes.

Speaking about Bill Nighy’s portrayal of Davy Jones, Hickel notes that “Bill gave such a brilliant performance. We didn’t want to lose any of the little stuff. The slight squint of an eye, the tiny sneer.” Rather than relying solely on motion capture, the team blended Nighy’s reference footage with keyframe animation, ensuring that none of his subtle acting choices were lost. “We wanted the tentacles to feel alive but they had to support the emotion in his face, not steal focus.”

Davy Jones (Bill Nighy) in Pirates of the Caribbean: Dead Man’s Chest (2006) (Credit: ILM & Disney).

Animating Davy Jones’s tentacle beard posed its own technical challenges. “It was a mix of hand animation and simulation,” Hickel explains. “We animated parts of it for performance reasons, but we also let physics take over for the secondary motion, so it didn’t look fake or overly choreographed.” This approach required close collaboration between animators, rigging artists, and the simulation team to keep everything feeling realistic and responsive.

The complexity of Davy Jones and his crew pushed ILM to overhaul their pipeline. “We had to rethink a lot of how we built and rendered these characters,” Hickel says. Advances made for Pirates laid the foundation for ILM’s later work on projects like Transformers (2007) and The Avengers (2012).

Beyond the technical achievements, Pirates also marked a shift in how digital characters were treated on screen. As Hickel puts it, “It wasn’t just about creating spectacle. Gore trusted us to handle real character beats with these CG characters. It was an amazing opportunity.” Through a mix of performance, artistry, and cutting-edge technology, ILM helped create one of cinema’s most memorable digital villains. They had steered animation into entirely new waters.

The Leap to Full-Length Animation: Rango

After working with Industrial Light & Magic on three Pirates of the Caribbean films, director Gore Verbinski approached the studio with an ambitious proposal: to produce a fully animated feature. He had been particularly impressed by ILM’s work on Davy Jones and believed the studio could bring that same level of sophistication to Rango – a surreal Western populated by anthropomorphic desert creatures.

“We approached Rango the way we approach live-action visual effects,” visual effects supervisor John Knoll told Cinefex, “building out environments with a cinematic mindset rather than adhering to the rigid, modular workflow of conventional animated features.”

A defining innovation was the film’s approach to lighting and cinematography. Renowned director of photography Roger Deakins consulted on the project, bringing principles of real-world filmmaking into the animated space. “We lit Rango the way we’d light a live-action film, with practical principles of cinematography in mind,” Deakins told Cinefex.

Rango‘s (2011) namesake, as voiced by Johnny Depp (Credit: ILM & Paramount).

ILM’s animation director, Hal Hickel, emphasized that they wanted the characters to inhabit their world with mass and texture. “We didn’t want our characters to feel overly polished or weightless,” he told Cinefex. “Gore wanted them to move with a slight awkwardness as if they truly existed in this dusty, unpredictable world.”

“He didn’t want to go head to head with Pixar or Disney or DreamWorks or Illumination. If they’re all over here, he wanted to go over there, aesthetically, in every way,” Hickel tells ILM.com. “Gore understood that the look of the film that he wanted to do was what we ended up calling ‘photographic.’ So not photoreal, but definitely not cartoony – the shot glass with whiskey in it, those kinds of things all had this patina of realism. So that seemed like a really good fit with us at ILM.”

Rather than using motion capture, Verbinski shot sessions with the actors performing together in a theatrical setting simply to inspire the animation. “It wasn’t about mapping motion one-to-one,” says Hickel. “It was about understanding the rhythm, the beats, the subtle mannerisms that would inform the final animated characters.” The result was a film that felt authored – visually distinct and emotionally resonant. For ILM, Rango marked another turning point.

“We knew this was an experiment,” said Knoll, “but we also knew it was an opportunity to redefine what ILM could do. Looking back, I think we did just that.”

Lead animator Maia Kayser at work on Rango (Credit: ILM).

Having left ILM before production on the film, Rob Coleman is still captivated by Rango. “It came about because John Knoll and Hal Hickel built a fantastic relationship with Gore Verbinski,” he says, “and they demonstrated to him through Pirates of the Caribbean that ILM had acting animators, and Gore is an actor’s director. They needed the right director with the right focus and the right mixture of talents and just bravado to say, ‘Yeah, we’re going to do this.’ And to hit ILM at the right time to make it, I think it’s still a marvel. I went back and watched it a couple years ago. It’s incredible what they did and what they achieved.”

“Every animator I know who worked on Rango had a ball and tells me continuously, ‘Gosh. Let’s get another Gore film going,’” says Hickel. “Yeah, they ate it up. He just really wanted people to feel like we were all filmmakers. You’re not the visual effects people up there, and I’m the filmmaker down here. We’re all filmmakers. We’re making this movie together.” That sense of collaboration was an ethos that ILM started in 1975 and continues to carry forward to this day.

Rango went on to win the Academy Award for Best Animated Feature in 2012.

Follow ILM’s continued journey in animated feature filmmaking in part two of this installment of ILM Evolutions.

Read more stories from our 50th anniversary series, “ILM Evolutions”:

ILM Evolutions: Pushing the Boundaries of Interactive Experiences

Jamie Benning is a filmmaker, author, and podcaster with a lifelong passion for sci-fi and fantasy cinema. He hosts The Filmumentaries Podcast, featuring twice-monthly interviews with behind-the-scenes artists. Visit Filmumentaries.com or find him on X (@jamieswb) and @filmumentaries on Threads, Instagram, and Facebook.