
Visual effects supervisors Jay Cooper, Andrew Roberts, Charmaine Chan, and Ian Comley take us behind the scenes of an unusual visual effects challenge.

By Lucas O. Seastrom

Ever since George Lucas and John Dykstra sat down in 1975 to discuss the former’s vision of capturing dynamic aerial dogfights between miniature spaceships in Star Wars: A New Hope (1977), Industrial Light & Magic (ILM) has made an art of solving creative problems in close partnership with filmmakers. Just as Lucas’ vision challenged ILM’s capabilities nearly 50 years ago, The Creator (2023) writer/director Gareth Edwards proposed an unconventional approach to filmmaking that would keep the visual effects crew on their toes.

Proof of Concept

Edwards first collaborated with ILM on Rogue One: A Star Wars Story (2016), channeling the same rebel spirit of Lucas’ A New Hope. Envisioning his own science-fiction tale in The Creator, he would also channel Lucas’ audacity for pushing the limits of ILM’s capabilities. It began some years ago when he asked ILM’s executive creative director John Knoll (who supervised the visual effects for Rogue One) if the company could help with a test reel that would demonstrate Edwards’ vision for a movie about a futuristic Earth where humans and artificial intelligence lived side by side.

“Gareth and his producer [Jim Spencer] went to Asia on what he described as a scout, but he also brought a camera along,” explains Jay Cooper, who would become The Creator’s overall visual effects supervisor for ILM. “He shot in a number of different locations to create a sort of think-piece, very documentary-style footage. Then he came to us asking to put some 50 shots together, which John supervised.”

Edwards provided his footage only. There was no accompanying data, no lidar scans, no HDRI captures of environments, none of the usual resources that visual effects artists rely upon. The challenge was to integrate digital elements – characters, vehicles, and locations – into the existing footage, including the replacement of real people, or components thereof, with robotic technology. “We tried to create rapid prototypes of what shots could look like by doing them in a more heavily 2D way,” Cooper explains. “We’d take frames, do a draw-over with James Clyne, who became the film’s production designer, and with a bit of fast projection work, get them into shots. We got a really convincing look with a modest amount of effort. Gareth explained that shots that usually take two or three months of work could be seen in three or four days.”

The proof-of-concept not only sold Edwards’ backers on making the film, but gave ILM a model for developing effects on a feature-length scale in this unusual, after-the-fact method. The Creator would be shot primarily on location in Thailand with a small crew and fewer resources. “Gareth wanted to shoot this ambitious movie,” says Cooper. “The artwork was phenomenal, but the catch was that we’d be really uncomfortable because we wouldn’t be given the things we were used to. We wouldn’t stop for a clean pass every time. We wouldn’t always know what all the shots were going to be because those would be determined in the edit. There were enormous designs for the scope of the movie. This was a big swing, it was sink or swim. So off we went.”

On the Ground in Thailand

Edwards remained committed to maintaining a fast, improvisational shooting style, often handling the camera himself. He did not want to inhibit his ability to engage in the moment with his actors, who included John David Washington as the protagonist Joshua, a world-weary soldier in search of his lost love, and Madeleine Yuna Voyles as Alphie, an artificial simulant in the form of a young girl who acts as both the story’s heroine and MacGuffin. Instead of the usual small team of visual effects personnel, ILM would send just one representative to Thailand, visual effects supervisor Andrew Roberts.

Roberts would be responsible both for consulting with Edwards and the crew, including cinematographers Greig Fraser and Oren Soffer, and for capturing as much data for each respective shot as he possibly could. “I was there to help make sure that things were filmed in a way that would give ILM the best chance of producing great, photoreal work,” Roberts says. “I wasn’t going to get in Gareth’s way.

“Early on, we had scenes with robots and humans existing together,” Roberts continues. “I asked Gareth which of the actors would be made into robots so I could mark them. Even if we’re not putting them in the motion-capture suits, I could take measurements and make a turntable, all to give the team information. Gareth looked at me and said, ‘Don’t know.’ It wasn’t something he wanted to focus on. He would pick actors to make into robots later. I wasn’t sure how aggressively he was going to create negative space with these characters. It turned out that their bodies were more or less the same, and you’d mainly see the mechanism when it came to their arms and their heads. But I still didn’t know at the time, so I recorded where Gareth was pointing the camera and determined what backgrounds I needed to capture to reconstruct a clean plate.”

Another major challenge involved the simulants, A.I. characters who appear human, save for the aft portion of their heads, which feature a bold mechanical structure. Edwards and Clyne had created initial concept art, but it was left to Roberts and Cooper to determine the best means of tracking the live actors’ facial movements on set in order to integrate digital components during post-production.

“I think the movie doesn’t work at all if you can’t get a convincing Alphie,” says Cooper. “It’s where your eye is looking. There are 400 shots of her. In prep, I pitched the idea of putting a sock over her head and dressing the edges of where the contours are so that we know exactly how to define the delineation point between where her mechanical components connect to her skin. I asked about doing makeup to address the edges, and we could fill in the rest. Gareth said, ‘No, we’re not going to do that because I need her.’ When you’re working with a child actress, there’s only so many hours you can work. He wanted her on set for every minute she could be.

“Then we had to figure out what we could do in terms of tracking dots that were low impact and didn’t interfere with the acting,” Cooper says. “I explained the ask to [layout supervisors] John Levin and Tim Dobbert, and said that I didn’t know exactly what the designs were going to be, but they said, ‘Well, let’s put some tracking dots on the bridge of her nose, one on the temple, a couple on her neck, and we think we can figure that out.’ So that’s what we did! [laughs] It’s a leap of faith.” Roberts then collaborated daily with the makeup department to place tracking dots on the simulant actors, each of whom required a unique arrangement because of their varying physiques.

“The benefit of having someone like Gareth is that he used to be a visual effects artist and he has a clear idea of what the end result will be,” explains Roberts. As an example, he explains how Edwards shot an early moment in the film when Joshua watches the suborbital ship NOMAD launch a missile at a group of small vessels just offshore. “Gareth knows that he wants NOMAD to be in frame, so he’ll frame for it and then tilt down to Joshua watching from the beach. Another director might be focused on the action in front of them, and in post they’ll ask if we can extend the frame and create a digital move. The majority of directors don’t think about those things in advance. So when I’m observing a shot like that where Gareth is tilting the camera, I’ll wait for the cut and ask him, ‘During that tilt, what are you seeing?’ Then I’ll make notes.”

After crisscrossing Thailand, often covering multiple locations in a single day, cast and crew traveled to Pinewood Studios in the United Kingdom, where ILM had constructed a StageCraft volume as part of its virtual production toolkit. There, two major sequences were captured for the end of the film, when Joshua and Alphie board NOMAD. “It takes a lot of work to do StageCraft correctly,” notes Cooper, who used the tool for the first time on this show (as was the case for Edwards). “You have to be very careful that it’s the right fit. If I think about our goal as a movie, which was to always find real locations, there were only a couple of places where there was no equivalent location, and that is space. It made a lot of sense to use StageCraft for the NOMAD’s Biosphere environment and the Air Lock, where either the scope is so large that it would be cost-prohibitive to build a physical set, or the aesthetic goals would push you into doing a full bluescreen shot.”

A few smaller scenes were shot on an adjacent Pinewood stage equipped for traditional bluescreen or greenscreen, but as Roberts points out, the crew took the chance to innovate some distinct techniques. “We had a scale portion of the missile that Joshua climbs on,” he explains. “We created interactive lighting for that by taking portions of the real-time NOMAD model from Gareth’s virtual production scouts, and animated them to enable these mechanisms pushing missiles into place. I had this little animated sequence, which I then rendered out as a black-and-white texture that had different layers of structure moving past, which imagined that the sun was out in space and these things were casting shadows. We connected that from my laptop to a 12K projector that was mounted on the set. So when John David is hanging on the exterior of the missile, we have real light interacting with him in the close-ups. That evolved quite organically.”

Altogether, principal photography in 2022 lasted some 80 days, not including an additional round of element shoots and pick-ups led by Edwards with an even smaller crew across multiple Asian countries.

A Global Collaboration

ILM’s studios in San Francisco, London, and Sydney would each make significant contributions to The Creator, with additional support from the Vancouver studio and an array of vendors. In October of 2022, Edwards came to ILM San Francisco to screen a three-hour cut of the film. “Everyone came out recognizing that it was something different and special,” recalls London-based visual effects supervisor Charmaine Chan. “It was a lot more than we thought it was going to be. Originally, we estimated around 700 or 800 shots. Watching that cut, we knew there were more than twice as many. So the question was how to handle that and deliver on time, on budget, and at the quality we always want at ILM. We had to set guidelines with Gareth about how we’ll be able to get this film across the finish line, and he was very receptive to it.”

Cooper’s proposal was an unusual “three-strike system,” where Edwards would be given three opportunities across the life of a given shot to provide notes, allowing ILM to iterate with as much focus as possible on the key elements of that shot. “That’s the optimal structure to ensure that all the money goes into getting a clear direction for the shot,” Cooper notes. “We were probably only successful doing that about 70% of the time, but there were a healthy number of shots where, after we solved the questions about the simulants, for example, Gareth knew that if we kept to those standards, we wouldn’t be chasing really small details.”

The design evolution for the simulant head mechanics resulted in an elegant approach that felt almost human. Building on techniques first employed by ILM for The Irishman, the team was able to seamlessly blend the movements of the actor’s skin with the rigidity of the rear components. “We’re trying to empathize with these simulants and understand what they’re going through,” explains Chan. “When you first see one, it’s just another human being, then they turn to profile and you realize it’s something else. Because the performances are so good, whether it be Madeleine or Ken Watanabe [Harun], you’re focused on them and feeling their emotions and you forget about all that gear.”

Many subtleties were incorporated into the headgear to complement the performances, including character-specific details, such as the battle-worn tech of Harun’s components. The animation team were responsible for creating tiers of almost subliminal movements that reflected each simulant’s emotional state. “When Alphie stops the bomb robot, for example, that’s full pelt as the mechanics whir up, which includes wonderful sound effects,” explains Ian Comley, also a London-based visual effects supervisor. “For everything else, it’s a kind of Swiss watch, cogs and gears ticking, something always active, but in a more gentle way.”

Throughout post, the ILM crew enjoyed an extraordinary level of direct access to the director and production designer. “It can be pretty rare to feel like you’re a core member of the filmmaking team,” says Comley. “I can’t think of another film where the production designer has stayed on until the very last shot. To build every robot, simulant, vehicle, prop, and structure required a lot of design. Gareth knew what he wanted and had a great relationship with James, and we were a part of that. We took initial concepts from James, tried to riff off them, and fleshed them out into assets. We could then share back directly with James, who could go in and do paintovers. With direct access, there’s no diffusion of ideas. Instead, it’s collaborative filmmaking.”

To create the full-body A.I. characters as replacements for select live actors, ILM developed seven distinct robot designs following Edwards and Clyne’s visual methodology, which combined a 1980s technology aesthetic with organic, natural influences. Each design could be made unique with specific flairs, often informed by the individual character, such as with Amar Chadha-Patel’s performance as Satra.

“Amar is very expressive in his face,” explains Chan. “When he’s talking or thinking, his eyes and eyebrows say a lot. We captured that in the actor, but how do we present that in a robot? [Animation supervisor] Chris Potter was brilliant in suggesting all of these fine details, like in the eyes, which are very tiny on Satra. You can see a slight pupil and see eye darts when he’s thinking. The mouth was also slightly hinged, so all these little characteristics of Amar’s performance can come into this robot to show his emotions.”

Comley points out the “masterstroke” of Edwards’ decision to not decide on the robot characters while filming. “Even when it came to background characters, a typical film would decide who would be a robot and kit them up in mo-cap pajamas,” he explains. “None of that on this show. Gareth got naturalistic performances because people were just moving as people. If it was a scary scene, they acted scared with those fluid motions. No one had been told they were a robot and then acted twitchy or jittered, the kinds of things you might do.

“It also gave ILM license to switch out anyone,” Comley continues. “If he had picked someone onset to be the robot, it might turn out that that person isn’t located where your eye goes in the shot. The real person you want to be a robot is on the other side. We had the freedom to do that, which was a real challenge, but we could decide with Gareth after the fact which ones to choose. As shots changed, we could keep adjusting. The matchmove and paint teams did a fantastic job. The performances were so grounded, and we did very little to change that. The last thing Gareth wanted was for us to take a brilliant natural performance and turn it into a stereotypical robot. It was mostly heads and arms. There are instances with full-body robots, but by and large, they were additions instead of replacements.”

The London crew under Chan and Comley’s supervision spent considerable time on act three aboard the NOMAD, where the key challenge was to create fully-CG assets and environments that felt akin to Edwards’ naturalistic shots on real-world locations. On a typical show, ILM often incorporates grain or lens flares to match the source photography, but for The Creator, such choices also fed into creative decisions that helped bridge the divide between Earth and space, including moving the NOMAD into a lower orbit where more diverse colors and atmospheric elements could be incorporated. “It helped marry the story points where people on the ground are able to see NOMAD above,” Comley notes.

Even in a traditional CG scenario, ILM found ways to empower Edwards’ freeform shooting style. “Besides the real-time rendering and LED walls, the StageCraft suite also includes virtual cam sessions,” explains Chan. “The whole exterior of NOMAD was pure CG, so Gareth was able to hold an iPad and look around to see the different sections of the ship and frame his shots, from the wings to the central section that we called the ‘bunny teeth.’ We saved so much time with Gareth being able to do that, rather than having us propose specific framing ideas. With Gareth being a visual effects artist, he just grabbed it and started making choices.”

At times, Edwards even embraced the most ordinary of methods to convey his vision. For the sequence when Joshua attempts to climb onto one of NOMAD’s towering missile silos, the director provided reference footage by “taking a wastepaper bin with a water bottle inside for the missile and a little LEGO figure taped on,” as Comley explains. “He shot it all with his iPhone. It had the same principles of photography that he’d applied in the v-cam. You have to feel like there is an operator discovering the events as they unfold. Gareth’s philosophy was often to think that the operator was hanging out of a fast-moving plane because the NOMAD is so big, that’s the only way you could do it.”

By the spring of 2023, ILM had completed some 1,700 shots for The Creator (a handful of which came from Edwards’ original test reel). “We made some good choices in terms of how to build this whole train set,” explains Cooper. “Maybe the most important one was that James Clyne had a concept team all through post-production. In visual effects, where it gets expensive is when you don’t know what you want, and you iterate multiple times and change directions. Normally there’s a bunch of concept art and you spend your time chasing that. We had existing concepts, but once the movie was shot, James kept reinterpreting it. When we’d land on an idea, we already knew the shot, the camera work, and we could deploy our resources accordingly. Sometimes it’s a 3D asset that we build because it’s going to be in 40 shots. Other times we take the art model from James’ team, put it into the shot, they paint on top of it, put it back in the shot once more, and it’s done. Not standard procedure at all. It’s all about looking for those opportunities.”

Looking for the Next Challenge

The Creator’s unconventional production methods were successful not only in terms of the efficiency of its budget and resources, but in the ability of the artists on every level to make genuine contributions to the story. That came from Edwards’ example and leadership. “Everyone wanted to be on this project to the point where someone would roll off the show and keep asking if they could do one more thing on a shot, just to make it a little better,” says Chan. “Sometimes you can feel like a cog in a machine, just pushing buttons, but this was the opposite. Everyone on every level felt that they could be creative and suggest ideas.”

ILM was established to create solutions that respect the integrity of a filmmaker’s original vision. For an artist like Comley, the willingness of the filmmaker to include ILM in that visionary process is much more important than the actual problems to be solved. “One way or another, we can paint out that thing, track that thing, come up with a creative solution,” he notes. “Throw us anything you have. I’d rather have that and the vision and richness of photography than a clinical greenscreen and a question mark.”

It was a refreshing experience for everyone, but one critically dependent on the filmmaker. “You have to be willing and able to take this gamble, and it’s hard to find both things together,” says Cooper. “There are a lot of filmmakers that are willing but because of the studio constraints around them, they’re not able. And there are others who have the money and space to do it, but don’t necessarily have the amount of knowledge required. So if you can clone Gareth, you’re in a great place! [laughs] I think there will be opportunities to work like this again. Filmmakers will come to us and say, ‘I know what my movie is, I have so many dollars, and we don’t have to hit everything that I want, but I want to hit as many as I can – can we work together?’ As a company, we’d respond well to that.”

Lucas O. Seastrom is a writer and historian at Lucasfilm.

On February 23, 2024, the Academy of Motion Picture Arts and Sciences will recognize 16 technologies for their impact on filmmaking. Two technologies that ILM played a key role in helping to develop will be among those recognized.

SciTech Awards committee chair Barbara Ford Grant said, “This year, we honor 16 technologies for their exceptional contributions to how we craft and enhance the movie experience, from the safe execution of on-set special effects to new levels of image presentation fidelity and immersive sound to open frameworks that enable artists to share their digital creations across different software and studios seamlessly.”

Former ILM engineers Christopher Horvath and Joe Ardent are being recognized alongside Lucas Miller and Steve LaVietes for the Alembic Caching and Interchange system. Alembic began as a collaborative effort between ILM and Sony Pictures Imageworks to develop algorithms for storing and retrieving baked, time-sampled data, enabling high-efficiency caching across the digital production pipeline and the sharing of scenes between facilities. The two companies open-sourced the project and interchange library in 2011. Since then, Alembic has seen widespread adoption by major software vendors and production studios.
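Alembic’s real interchange library is a rich C++/Python framework, but the core idea described above, baking animation into discrete time samples that can be written once and retrieved efficiently at any frame, can be sketched in plain Python. The names below are hypothetical illustrations for this sketch, not the actual Alembic API.

```python
import bisect

# Toy sketch of time-sampled caching (hypothetical names, not the Alembic API):
# animation is "baked" into discrete time samples up front, and any later
# request for a frame becomes a cheap nearest-sample lookup.
class TimeSampledCache:
    def __init__(self):
        self._times = []    # sorted sample times, in seconds
        self._samples = []  # baked data per sample (e.g. point positions)

    def write_sample(self, time, data):
        # Keep samples sorted by time so reads can binary-search.
        i = bisect.bisect_left(self._times, time)
        self._times.insert(i, time)
        self._samples.insert(i, data)

    def read_sample(self, time):
        # Snap the requested time to the closest stored sample.
        i = bisect.bisect_left(self._times, time)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(self._times)]
        best = min(candidates, key=lambda j: abs(self._times[j] - time))
        return self._samples[best]

cache = TimeSampledCache()
cache.write_sample(0.0, [(0.0, 0.0, 0.0)])         # frame 1 of a 24 fps bake
cache.write_sample(1.0 / 24.0, [(0.1, 0.0, 0.0)])  # frame 2
print(cache.read_sample(0.02))  # closest sample is t=0.0: [(0.0, 0.0, 0.0)]
```

Because the samples are precomputed, playback or rendering never re-runs a simulation or rig; it simply looks up stored data, which is what makes caching fast and scene interchange between facilities practical.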

ILM’s Dan Bailey is recognized alongside Jeff Lait and Nick Avramoussis for the continued evolution and expansion of the feature set of OpenVDB. Core engineering developments contributed by OpenVDB’s open-source community have led to its ongoing success as an enabling platform for representing and manipulating volumetric data for natural phenomena. These additions have helped solidify OpenVDB as an industry standard that drives continued innovation in visual effects.
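OpenVDB’s defining trait is a sparse representation of volumetric data: only “active” voxels (say, the density of a smoke plume) are stored, while the vast empty space around them costs nothing. The real library uses a hierarchical tree structure in C++; the dictionary-based Python sketch below uses hypothetical names, not the OpenVDB API, and only illustrates the sparsity idea.

```python
# Conceptual sketch of a sparse volumetric grid (hypothetical names, not the
# OpenVDB API): unstored voxels implicitly hold a "background" value, so an
# effectively unbounded volume only pays for the voxels that matter.
class SparseGrid:
    def __init__(self, background=0.0):
        self.background = background  # value implied for every unstored voxel
        self._voxels = {}             # (i, j, k) -> value, active voxels only

    def set_value(self, ijk, value):
        self._voxels[ijk] = value

    def get_value(self, ijk):
        return self._voxels.get(ijk, self.background)

    def active_voxel_count(self):
        return len(self._voxels)

# A wisp of smoke density in an otherwise empty, kilometer-scale volume:
grid = SparseGrid()
grid.set_value((10, 42, 7), 0.8)
grid.set_value((10, 43, 7), 0.5)
print(grid.get_value((10, 42, 7)))      # 0.8
print(grid.get_value((500, 500, 500)))  # 0.0, the background value
print(grid.active_voxel_count())        # 2
```

The same principle is what lets production tools simulate and render clouds, fire, and explosions at scales where a dense voxel array would be prohibitively large.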

Unlike other Academy Awards® to be presented this year, achievements receiving Scientific and Technical Awards need not have been developed and introduced during a specified period. Instead, the achievements must demonstrate a proven record of contributing significant value to the process of making motion pictures.

Before this announcement, 34 ILM technological achievements had been recognized with Scientific and Technical Achievement Awards. This latest recognition continues a legacy of technical innovation dating back to the mid-1970s.

Earlier today BAFTA announced the nominations for the 2024 EE BAFTA Film Awards, celebrating the very best in film over the past year. ILM contributed to four of the five films recognized with a nomination in the Special Visual Effects category. 

Jay Cooper, Charmaine Chan, Ian Comley, and Jonathan Bullock were each nominated for Gareth Edwards’ sci-fi thriller The Creator, while Alex Wuttke, Simone Coco, Jeff Sutherland, and Neil Corbould received nominations for Mission: Impossible – Dead Reckoning Part One. ILM also contributed effects work to Ridley Scott’s historical epic Napoleon and James Gunn’s Guardians of the Galaxy Vol. 3.

Anna Higgs, Chair of the BAFTA Film Committee, said, “It has been an outstanding year for filmmaking as represented by the 38 films nominated today. They showcase ambitious, creative, and hugely impressive voices from independent British debuts to global blockbusters. From complex moral issues through to joyful journeys of self-discovery, they all ultimately explore human connection. Which is why we go to the cinema: to be transported into new worlds, to laugh, cry, to be entertained and to be challenged. The films nominated today deliver all that and more – we hope people up and down the country, and around the world, are inspired to watch them. Congratulations to all the nominees.”

The winners will be announced on 18 February from the Southbank Centre’s Royal Festival Hall in London, as part of an unmissable celebration of film hosted by David Tennant.     

The EE BAFTA Film Awards will be broadcast on BBC One and iPlayer in the UK, on BritBox International in the USA, Australia, Canada, Denmark, Finland, Norway, Sweden and South Africa, as well as BBC Australia in Australia and New Zealand, NOVA Bulgaria, NOVA Greece, Turner Spain, and Canal Plus, with more territories to be confirmed.

The EE BAFTA Film Awards voting takes place over three rounds: Longlisting, Nominations, and Winners, by BAFTA’s global voting membership, comprising over 7,800 creatives and film industry practitioners.

Gareth Edwards’ The Creator leads the feature film field with seven nominations, six of which are for ILM work.

In all, ILM visual effects artistry was recognized with 19 nominations, including those for The Creator, with Indiana Jones and the Dial of Destiny and Dungeons & Dragons: Honor Among Thieves joining it in the top category, Outstanding Visual Effects in a Photoreal Feature. Napoleon and Killers of the Flower Moon were each nominated for Outstanding Supporting Visual Effects, and Ahsoka and The Mandalorian were each nominated for Outstanding Visual Effects in a Photoreal Episode. Darren Aronofsky’s Postcard from Earth received a nomination for Outstanding Visual Effects in a Special Venue Project, while Willow, Indiana Jones and the Dial of Destiny, The Creator, Napoleon, Loki, and The Mandalorian also received craft category nominations.

“We are seeing best-in-class work that elevates the art of storytelling and exemplifies the spirit of innovation. The VES Awards is the only venue that showcases and honors these outstanding artists across a wide range of disciplines, and we are extremely proud of our nominees,” said VES chair Kim Davidson.

The VES is a global honorary society dedicated to “advancing the arts, sciences and applications of visual effects and to upholding the highest standards and procedures for the visual effects profession.”

Awards will be presented at the 22nd Annual VES Awards on Feb. 21 at The Beverly Hilton Hotel in Los Angeles. 

ASIFA-Hollywood announced nominations today for its 51st Annie Awards™ recognizing the year’s best in the field of animation. The ILM team which included Rick O’Connor, Mike Beaulieu, Stewart Alves, Kevin Reuter, and Wai Kit Wan received a nomination for Best Character Animation – Live Action for its work on Lucasfilm’s hit Disney+ series, Ahsoka.

The Annie Awards™ cover 36 categories and include Best Animated Feature, Best Animated Feature-Independent, Special Productions, Sponsored Films, Short Subjects, Student Films, and Outstanding Individual Achievements, as well as the honorary Juried Awards. Created in 1972 by veteran voice talent June Foray, the Annie Awards™ have grown in scope and stature for five decades.

The awards will be presented on Saturday, February 17, 2024 at UCLA’s Royce Hall.

ASIFA-Hollywood is the world’s first and foremost professional organization dedicated to promoting the Art of Animation and celebrating the people who create it. Today, ASIFA-Hollywood, the largest chapter of the international organization ASIFA, supports a wide range of animation activities and preservation efforts through its membership. Current initiatives include the Animation Archive, Animation Aid Foundation, animated film preservation, open-source software support, special events, classes, and screenings.              

The end is only the beginning. KISS have been immortalized and reborn as avatars to rock forever. Created by Industrial Light & Magic (ILM) in collaboration with the band and Pophouse Entertainment Group, the avatars portray each of the four band members in an idealized, and at times superhuman, form. Months before the supergroup’s final show, which would take place on December 2, 2023, KISS joined ILM’s visual effects team at its San Francisco headquarters to get measured, scanned, and photographed before slipping into sleek motion capture suits so the crew could record every nuance of their final performance. ILM’s StageCraft virtual production team would then simultaneously capture each band member’s performance, from their facial expressions to their fingertips, as they played “God Gave Rock ’N’ Roll to You II” in unison.

The KISS Avatars

The KISS avatars showcase ILM’s unique creative expertise and artistry using its advanced performance-capture technology. The team was led by Academy Award®-nominated Visual Effects Supervisor Grady Cofer, who has over 20 years of experience supervising groundbreaking visual effects projects. Cofer is currently nominated for an Emmy Award for Outstanding Special Visual Effects in a Season for his work on The Mandalorian. Prior to that, he served as Overall Visual Effects Supervisor on Space Jam: A New Legacy and earned Academy Award and BAFTA nominations for his contributions to Steven Spielberg’s Ready Player One. Cofer’s three-year collaboration with Spielberg utilized cutting-edge virtual production tools to bring the OASIS, the project’s vast virtual world, to the big screen.

“This is the sneak peek as the band crosses over from the physical world to the digital. We want to give fans a sense of the many forms this band could take in the future.”

Grady Cofer, ILM visual effects supervisor

Cofer’s ILM team leveraged the company’s decades-long experience to push the capabilities of performance capture, gathering every nuance of the KISS band members’ facial and body performances in exacting detail. This data would in turn become the basis for the motion of the band’s virtual avatars. The raw facial capture data was processed in real time via ILM’s advanced machine learning algorithms for instantaneous feedback on stage, and later passed through the ILM pipeline to be augmented by the artists, ensuring the resulting performances were exactly as the band intended for their new digital personas and enabling KISS’s creative output to continue to enthrall audiences well into the future.

As the band’s final concert drew to a close, lead singer Paul Stanley’s avatar proudly exclaimed, “KISS Army, your love, your power has made us immortal! A new KISS era starts now.” The digital group then performed its hit single “God Gave Rock ’N’ Roll to You II” to the delight of the concertgoers who filled the sold-out Madison Square Garden.

We are proud to announce Guardians of the Galaxy: Cosmic Rewind will be honored with a prestigious Thea Award for Outstanding Achievement – Attraction in 2024 by the Themed Entertainment Association (TEA). Internationally recognized, the Thea Awards acknowledge exceptional achievements in the themed entertainment industry and celebrate the creative teams who bring immersive experiences to life.

Under the guidance of Walt Disney Imagineering, Industrial Light & Magic created the immersive visuals that guests are treated to as they experience the attraction. Built on Disney’s first Omnicoaster ride system, Cosmic Rewind keeps guests immersed in the action as the vehicles make controlled rotations. “It’s always exciting to push the bounds of storytelling and technology, and that’s what both ILM and Imagineering are known for,” said Jeanie King, VP, Production at ILM, adding, “We are thrilled to continue our amazing partnership with Imagineering that began back in the 1980s and continues to flourish today.”

The filmmaker and Lucasfilm legend talks to ILM.com to reflect on what drew him to tell the story of the hit Disney+ series, “Light & Magic.”

Screenwriter and director Lawrence Kasdan.

How did you get involved with Light & Magic?
Several years ago my wife and I made a short documentary about a little diner that we used to eat at all the time that suddenly closed. Making that documentary with her, and cutting it with terrific people, made me realize how much I liked the documentary format. I had never done that. We set out to meet some documentary people and I met Justin Wilkes at Imagine Entertainment. He asked me what I was interested in doing and I suggested a history of visual effects, because even though I had been around visual effects throughout my career, it occurred to me that I didn’t know much about them. The second thing that interested me was the people of Industrial Light & Magic, whom I had been working around for over forty years. So we both agreed that that would be a great story to tell: the history of visual effects, and the personal stories of these people. What drove these people, what was their life like, what made them want to stay at ILM as long as they did? Everyone loved the idea, so we went to work.

Lawrence Kasdan, center, on the set of Star Wars: The Empire Strikes Back.

What was your vision for the documentary?
From my very first film until today, I’ve always considered myself a humanist filmmaker. I’m interested in what happens between people, and why people make certain decisions in their lives. What chance is involved? What fate? What luck? So from the very beginning of this I was interested in learning what brought these people to this work. What were the relationships that they made when they arrived? Why did they continue to work there much longer than they expected, some for nearly half a century? What has all that meant to these amazing advancements in technology? It’s about people, and their gifts, and out of those gifts came technological advancements that boggle the mind.

Dennis Muren, left, and Phil Tippett, right, review images with Joe Johnston.

Why did you think this story should be told?
Because it’s great to see artists at work. The commitment of great craftsmen. I love to see people that have mastered a skill, and try to make it better, and don’t settle. I think it’s great to see expertise and this pure devotion to discipline, and that is always a good story to see.

John Dykstra and a fleet of miniature TIE, X-wing, and Y-wing starfighters.

How did you approach the research, and what resources did you use?
We had a fabulous team that Imagine Documentaries put together, some internal to the company, and some that were freelancers. They really knew their stuff, so it was a great luxury for me as a director. There were so many things that I wanted to ask during interviews, but the input from this incredible group of producers and writers and editors stimulated me all of the time to go in different directions during interviews.

ILM’s Paul Huston and Larry Tan on the set of Star Wars: Return of the Jedi.

For those that have yet to watch it, can you tell readers what the timeline of the series is?
Over the six hours we see the very birth of ILM, what happened as it came together during the production of Star Wars: A New Hope, and then, on the success of that film, how it was launched into a nearly fifty-year enterprise. We mainly follow it chronologically, but we do jump around a bit to serve the story. Part of the kick for me was that we had such a trove of archival footage, so these people might be talking about something from forty or fifty years ago, and we had stills from that moment in their career. It was incredible to be able to cut from one to the other across time, to hear them talking about a problem, and then see footage of them finding a solution. A huge part of ILM’s legacy is finding solutions to problems.

Peter Kuran, Rose Duignan, and George Lucas review effects shots for Star Wars: A New Hope.

How did you select the filmmakers that were featured in the documentary?
They are all giants, and they have all used ILM in the most expressive and innovative ways. They put pressure on themselves and then turned to ILM and said, “Can you do this? Can you create something for me that I have never seen before?” ILM would always say yes. And sometimes it might be a struggle, and sometimes it might be a long process, and sometimes it might be an instantaneous solution where one of these genius people that work there would say, “I know what we could do.” These are major filmmakers that have contributed to the zeitgeist. Jim Cameron, Steven Spielberg, Bob Zemeckis, J.J. Abrams, and at the heart of it, of course, is George Lucas.

Lawrence Kasdan and J.J. Abrams on the set of Star Wars: The Force Awakens.

What was the most interesting thing you learned throughout the process of creating Light & Magic?
I think I learned what goes into creating something new, working with people you respect and depend on, and how those personal relationships then impact the professional work. There is something beautiful about the generosity of the people that work at ILM, and through that generosity they are able to discover new frontiers and break new ground in ways no one has before.

All episodes of Light & Magic are streaming now on Disney+.

ILM | A legacy of innovative and iconic storytelling.