2000s

50 Years | 500+ Film and TV credits | 135+ Awards

SINCE 1975

For James Cameron’s 2009 blockbuster Avatar, ILM was brought in late in the postproduction schedule to help complete a few sequences and get the film to the finish line. ILM created the visual effects for many of the specialized vehicles in the film, including the Valkyrie, a large shuttle used to move people and equipment, and several different types of helicopters, as well as the landscapes and environments in which those vehicles appeared.

ILM also did the effects work on the film’s final battle scene, taking responsibility for the shots of all the vehicles taking off, as well as the sequence’s cockpit interior shots. The team was also responsible for shots featuring the attack on the giant “home tree,” where the Na’vi, the humanoid alien race in the film, live. Specifically, ILM handled the shots in which the camera looks back toward the aircraft flying and ultimately firing on the tree.

The film would go on to win the Academy Award® for Best Visual Effects.

The sequel to the 2007 blockbuster Transformers, Revenge of the Fallen is notable for its massive action set pieces and incredibly complex animated characters.

2005 Academy Award® nominee for Best Visual Effects.

As Earth is invaded by alien tripod fighting machines, one family fights for survival.

2007 Academy Award® nominee for Best Achievement in Visual Effects.

As the lead visual effects house on the film, ILM created over 450 shots of robot mayhem.

As the Clone Wars near their end, Obi-Wan Kenobi pursues a new threat, while Anakin Skywalker is lured by Chancellor Palpatine into a sinister plot for galactic domination.

2002 Academy Award® Nomination for Best Visual Effects.

The focus of ILM’s work for Pearl Harbor was on the creation of panoramic battle scenes that tied together computer graphics, miniatures, and practical effects with first-unit footage. These scenes were unlike many of the fantasy-oriented projects ILM was used to dealing with because Pearl Harbor called for a seamless blend of visual effects with a familiar historical reality. The resultant scenes contained thousands of elements. The complexity of any one shot was equal to the work that would typically go into 10 isolated event shots.

A “sailor management” tool was created to help insert computer-generated people developed from motion-capture performances. Wherever a sailor was missing in a frame, the VFX artists would use this tool to select an actor and an appropriate uniform to put the CG character right where they needed to be. By the end of the development of this sequence, not even Michael Bay could tell the difference between the real and CG sailors.

Part of recreating such an important historical event is accurately depicting reality. ILM researched the movement of aircraft and ships in reference footage so they could create models of real battleships and CG airplanes from the World War II era. When it came time to film the battle scenes, ILM utilized advances in dynamic simulation techniques to realistically portray airplane crashes and other explosions at sea.

ILM’s Hayden Landis, Ken McGaugh, and Hilmar Koch adapted concepts originally developed for the studio’s work on Speed 2, known as reflection occlusion, and developed a new production technology for Pearl Harbor called ambient occlusion. This led to the three of them being awarded a Technical Achievement Award at the Academy’s 2010 Sci-Tech Awards. You can read more about the history of the development of ambient occlusion and the technologies that built upon it in FXGuide’s article on the subject.
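The core idea behind ambient occlusion is simple to sketch: darken a surface point according to how much of the hemisphere above it is blocked by nearby geometry. The Python below is a minimal, illustrative Monte Carlo version (spheres as occluders, with invented sample counts and distances), not ILM’s production implementation:

```python
import math
import random

def ray_hits_sphere(origin, direction, center, radius, max_dist):
    """Standard ray/sphere intersection test, limited to max_dist."""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return False
    t = -b - math.sqrt(disc)
    return 0.0 < t < max_dist

def ambient_occlusion(point, normal, occluders, samples=256, max_dist=5.0):
    """Estimate ambient occlusion at a surface point by casting random
    rays over the hemisphere around the normal and counting hits.
    `occluders` is a list of spheres given as (center, radius) pairs.
    Returns 1.0 for a fully open point, approaching 0.0 as it becomes
    fully occluded."""
    hits = 0
    for _ in range(samples):
        # Sample a uniform random direction, flipped into the hemisphere
        # around the surface normal.
        d = [random.gauss(0.0, 1.0) for _ in range(3)]
        length = math.sqrt(sum(c * c for c in d))
        d = [c / length for c in d]
        if sum(a * b for a, b in zip(d, normal)) < 0:
            d = [-c for c in d]
        if any(ray_hits_sphere(point, d, c, r, max_dist) for c, r in occluders):
            hits += 1
    return 1.0 - hits / samples
```

A point with open sky above it returns 1.0; placing a large sphere directly overhead drives the value down, which is exactly the darkening that makes CG objects look grounded in their surroundings.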

Working on a Steven Spielberg film always carries high expectations, and ILM was up to the task, developing new techniques for Minority Report to enhance the CG elements and environments in the futuristic world of the film.

During the Napoleonic Wars, a brash British captain pushes his ship and crew to their limits in pursuit of a formidable French war vessel around South America.

Much of the movie takes place on the high seas, often in stormy weather. ILM had proven its ability to create digital water for The Perfect Storm, but for this film, visual effects supervisor Stefan Fangmeier (who also supervised the effects on The Perfect Storm) had another idea. Because he had received real ocean footage shot in extreme conditions, he relied primarily on his compositors. To create the ocean surrounding the naval ships, compositors assembled bits and pieces of real water, using digital water only to glue the pieces together.

A decidedly odd couple with ulterior motives convince Dr. Alan Grant to go to Isla Sorna (the second InGen dinosaur lab), resulting in an unexpected landing…and unexpected new inhabitants on the island.

Billionaire industrialist and genius inventor, Tony Stark, is kidnapped and forced to build a devastating weapon. Instead, Tony builds a high-tech suit of armor and escapes captivity.

ILM worked on over 400 shots, many of them dedicated to creating Iron Man’s suit, and along with director Jon Favreau set flawless photorealism as the standard for the film’s visual effects.

Robert Downey Jr. donned ILM’s patented Imocap suit during principal photography, which allowed the visual-effects team to create the suit of digital armor featured in Stark’s mansion and during captivating battle sequences with Iron Monger and digital F-22 fighter jets.

Iron Man’s brushed metal Mark II suit and trademark gold and red Mark III suits came together with a marriage of CG animation and technical wizardry that garnered both a BAFTA and Academy Award® nomination.

Indiana Jones teams up with the industrious Mutt Williams to find the Crystal Skull of Akator.

In an effort to keep with the themes and feeling of the original trilogy, ILM worked closely with director Steven Spielberg to capture the essence of the action-adventure genre: stunt work, special effects, and visual effects blended together seamlessly to facilitate the filmmaker’s vision.

The crew combined practical effects gags, digital matte paintings, advanced photomapping, CG animation, and large-scale miniatures to achieve the desired results.

An unusually intense storm pattern catches some commercial fishermen unaware and puts them in mortal danger.

Director Wolfgang Petersen’s vision for The Perfect Storm was going to require a seamless integration of complex computer-generated simulations and imagery with blue-screen photography that utilized elaborate practical effects. Because the film’s story is largely based on a real event, Petersen emphasized the importance of creating a highly realistic depiction of a severe storm at sea.

The creation of a storm of this magnitude had never been attempted with computer-generated imagery.

The first film to blend real water with CG water, The Perfect Storm needed a significant amount of work on the placement of a digital ocean behind a gimbaled boat. Many of the other shots relied on complex computer-generated boats, characters, and water, including the simulation of several hundred-foot waves. This work was accomplished with a proprietary plug-in tool used with Maya.

In dailies, the ILM crew would often find themselves saying “this doesn’t look right, but why?”

For the answer to questions like these, ILM almost always goes back to reference material, but there is almost no reference point for a storm like this. So the ILM R&D team, led by Habib Zargarpour, identified the essential visual details that needed to be represented and the techniques that had to be developed to achieve the required realism.

Over a six-month period, new software was written for the water surface itself, as well as for the extremely complex particle simulations used to model elements such as spray, crest mist, crest foam, and splashes. John Anderson developed basic ocean-simulation software that was imported into a commercial 3-D package via proprietary plug-ins.

With more than 80 basic ocean states, this simulation allowed the animation team to select the ocean conditions, position a specific boat within them, and automatically generate accurate boat motion.
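As a miniature illustration of what such a library of ocean states might look like, the sketch below models a heightfield as a sum of travelling sine waves and reads the surface height under a boat’s position. The state names, wave parameters, and functions are invented for illustration and bear no relation to ILM’s proprietary tools:

```python
import math

# Each wave component: (amplitude, wavelength, direction in radians, phase speed).
# These particular "ocean states" are hypothetical examples, not production data.
CALM = [(0.2, 12.0, 0.0, 1.5), (0.1, 5.0, 1.1, 2.0)]
STORM = [(4.0, 90.0, 0.0, 9.0), (2.5, 40.0, 0.6, 6.0), (1.0, 15.0, 1.3, 4.0)]

def ocean_height(x, y, t, components):
    """Height of the water surface at position (x, y) and time t,
    modeled as a sum of travelling sinusoidal waves."""
    h = 0.0
    for amp, wavelength, direction, speed in components:
        k = 2.0 * math.pi / wavelength              # wavenumber
        dx, dy = math.cos(direction), math.sin(direction)
        phase = k * (dx * x + dy * y) - k * speed * t
        h += amp * math.sin(phase)
    return h

def boat_heave(x, y, t, components):
    """A boat floating at (x, y) simply rides the surface height;
    its pitch and roll could be derived from the local surface gradient."""
    return ocean_height(x, y, t, components)
```

Selecting an ocean state and sampling it under the hull is the same basic move the text describes: pick the conditions, place the boat, and let the surface drive the boat’s motion automatically.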

Blacksmith Will Turner teams up with eccentric pirate “Captain” Jack Sparrow to save his love, the governor’s daughter, from Jack’s former pirate allies, who are now undead.

The Curse of the Black Pearl introduced new uses of motion-capture technology to create computer-generated characters.

During various sword fight scenes where actors fight undead pirates, ILM made it possible for Director Gore Verbinski to direct the CG characters’ performances by shooting each scene twice. First they’d shoot a reference take with actors fighting stuntmen standing in for the soon-to-be-skeleton characters, and then they’d do a clean take with the actors fighting no one. They would insert the CG pirates into these clean takes after working with the stuntmen to duplicate the appropriate choreography.

The digital costumes in Curse of the Black Pearl were elaborate and required a lot of work. With 23 people doing nothing but costume simulation, ILM was able to use the same clothing-simulation software to create the realistic-looking garb of the undead pirates.

In the past it was extremely difficult to interweave visual effects with the freestyle of handheld camerawork, but by the time Curse of the Black Pearl was produced, match-move tools had evolved to a point where it was no longer necessary to restrict the camera movement for shots requiring visual effects. As a result, ILM “went completely free” on Curse of the Black Pearl, allowing Verbinski to focus more on story. This would put ILM’s team to the test in the scenes where the actors walk into the moonlight and become completely CG skeleton characters.

Believably transitioning from real-life Geoffrey Rush to CG Geoffrey Rush as he steps into the moonlight, revealing his cursed form, was one of the most important sequences in the film. ILM accomplished a seamless transition by removing some, but not all, of Rush’s features: specifically, his real eyes remained with the otherwise CG character for just a beat.

Jack Sparrow races to recover the heart of Davy Jones to avoid enslaving his soul to Jones’ service, as other friends and foes seek the heart for their own agenda as well.

After their experience with motion-capture technology in Pirates of the Caribbean: The Curse of the Black Pearl, ILM knew it was time for something newer and better.

In an effort to allow all the performers to be present on set, ILM developed Imocap, a lightweight, low-footprint, robust, and filmmaker-friendly motion-capture system that could be used anywhere. This technology made it possible for actors to perform motion-capture on location during principal photography. As a result, director Gore Verbinski was able to work with his actors (who would later be replaced by CG characters) on set without having to worry about performing those same scenes at a later date on a mocap stage.

The work didn’t stop there, as Davy Jones’ tentacle beard presented another set of challenges — a big fleshy group of appendages that had to perform like a living creature. The creature development artists put in wonderful animation controls that allowed animators to move the tentacles in very specific ways, and a program was written that would drive the individual joints between the segments of the beard with a whole variety of parameters for high level control that would dictate emotional changes.

Behind the scenes, ILM’s revolutionary new Zeno pipeline moved fully into action, giving artists easy access to more tools than ever before.

In the end, digital Davy Jones was a huge breakthrough in the VFX industry and his realistic complexity garnered ILM the 2006 Academy Award® for Best Visual Effects.
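A procedural joint driver of the kind described for the beard can be sketched in a few lines: per-joint angles computed from a wave travelling down each tentacle, with a high-level parameter standing in for emotional state. Everything below (function names, parameters, ranges) is a hypothetical illustration, not ILM’s actual rig:

```python
import math

def beard_joint_angles(num_joints, t, agitation=0.0, curl=0.0):
    """Compute a rotation angle (radians) for each joint along one
    tentacle at time t.  `agitation` speeds up and deepens the wave
    travelling from root to tip; `curl` adds a bend that accumulates
    toward the tip.  A single slider like `agitation` lets an animator
    shift the whole beard's mood without keying individual joints."""
    angles = []
    for i in range(num_joints):
        # A wave travels from the root (i = 0) toward the tip.
        phase = t * (1.0 + 3.0 * agitation) - i * 0.6
        wave = (0.05 + 0.25 * agitation) * math.sin(phase)
        # Curl accumulates slightly toward the tip.
        bend = curl * (i / max(1, num_joints - 1)) * 0.4
        angles.append(wave + bend)
    return angles
```

With `agitation` at zero the tentacles barely sway; turning it up produces the larger, faster motion an angry Davy Jones would need, while hand animation layers on top for specific performance beats.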

Survivors must fight for their lives when the luxury ocean liner Poseidon capsizes after being swamped by a rogue wave.

Size Matters. Director Wolfgang Petersen wanted to film a 200-foot wave capsizing a 1,200-foot cruise ship and aimed to build the shot in a way that had never been done before: he wanted a dynamic destructive wave hitting the boat from many angles. The boat’s size meant that the ILM crew couldn’t use a miniature boat and real water; they had to create the shot digitally.

Agreeing to do two mind-blowing sequences for this film took a lot of confidence, but ILM delivered those shots and more. In the opening sequence, the camera follows an actor running around the deck of the Poseidon for three minutes in bold, broad daylight. Remarkably, the ship is completely computer generated even in close-ups.

Later in the film, ILM tosses the ship inside a massive digital wave thanks to new multi-processor fluid simulation technology developed by ILM’s R&D department (which worked with Stanford University). As the ship shatters inside the giant wave, it interacts with the roiling water, a feat never before accomplished on such a massive scale. In the end, Petersen got the shot he wanted.

Led by VFX legends Dennis Muren and Scott Farrar, ILM sought to create the insidious Rouge City using a combination of practical and CG effects. Utilizing a commercial game engine, ILM developed the first real-time on-set pre-visualization system, allowing director Steven Spielberg to see an approximation of the CG set in his viewfinder. The result was the expansive and futuristic world envisioned by Spielberg and the late Stanley Kubrick.

To create the half live-action actors/half CG Mecha, ILM used the Motion and Structure Recovery System (M.A.R.S.) to activate a highly automated camera tracking solution which would enable a CG camera to match its live-action counterpart.

The brash James T. Kirk tries to live up to his father’s legacy with Mr. Spock keeping him in check as a vengeful, time-traveling Romulan creates black holes to destroy the Federation one planet at a time.

As always, ILM was sensitive to the director’s particular vision. In this case (keeping the spirit of the original Star Trek series), much of Director J.J. Abrams’ focus was on the story’s characters and realism. In fact, ILM developed animation tools to replicate Abrams’ style of photography on the set. Animators even applied camera shake by use of a small rotational-motion-capture sensor on a tripod at their workstations.

The black-hole sequence was one of several in the film that combined various visual effects techniques: CG space, elements shot on partial sets at Paramount Studios, and extensive digital-set extensions. On top of the imagery of the black hole itself, the ILM team built layer upon layer of detail into the shots, including the Vulcan planetary destruction which required extensive use of ILM’s simulation software.

There’s a huge history to the Star Trek franchise that people are very connected to, and ILM’s team of artists tried to match the style and color palette of the old show for the final shot of the Enterprise flying off into warp.