Spoiler Alert! ILM VFX Supervisor Kim Libreri Spills the Film's Secrets

An enviable group of award-winning visual effects supervisors powered Super 8’s post-production. At Industrial Light & Magic, the team included Kim Libreri (Oscar nomination, Poseidon; Academy Sci-Tech award), Russell Earl (two Oscar nominations, for Star Trek and Transformers), and Dennis Muren (six Oscars, 13 Oscar nominations, two special achievement awards from the Academy, and one Sci-Tech award). At Scanline, it was Stephan Trojansky (Oscar nomination for Hereafter). We talked with Kim Libreri about VFX production for what could be this summer’s favorite family film.

But beware … SPOILERS follow!

In Super 8, a group of friends making an amateur zombie film find themselves in the middle of a sci-fi monster movie adventure starring, in addition to the young cast, spectacular and finely integrated visual effects. Writer-director J.J. Abrams set Super 8 in 1979, the year he turned 13 and the year Ridley Scott released Alien. It was two years after Super 8 producer Steven Spielberg’s Close Encounters of the Third Kind and George Lucas’s first Star Wars, and one year after the classic zombie movie Dawn of the Dead. To call this movie a multi-layered homage filled with Easter eggs is an understatement.

“When we were filming it, I’d see these plastic models in the kids’ rooms that I made when I was a kid,” says Kim Libreri, visual effects supervisor at Industrial Light & Magic. “In 1979, both J.J. and I were the same age as the kids are in the movie. And the funny thing is that [Super 8 cinematographer] Larry Fong and J.J. used to make Super-8 movies when they were kids.”

ILM, which had created effects for Abrams’ Star Trek, became involved early, helping the director create a teaser trailer about 14 months before the film’s release. “Shortly after the completion of the teaser, the script was ready and we started shooting the real movie in late September 2010,” Libreri says. “We shot for three months, until Christmas [2010], and then post pretty much happened over a five-month period. Incredibly quick. J.J. is very visual-effects savvy; he had planned the whole movie to work within the constraints of a super-short schedule. We supervised all the visual effects and designed the heavy scenes here at ILM. Then we completed some of the work with help from a group of partner companies. Scanline helped with the train crash and Pixomondo worked with us on the bus attack sequence.”

In an early scene, Joe Lamb and his filmmaking friend Charles sneak out with a few other kids in the middle of the night to shoot a scene for a zombie movie at a tiny, abandoned train station. Charles is excited to see a train on the horizon, hoping it will add production value to his scene. He shoots as the train rolls past. Abrams’ camera, however, follows Joe’s eyes as he watches the train move on. We see a white pick-up truck drive onto the track, straight at the train. In the massive crash that ensues, Charles drops the camera, which keeps filming as the kids barely escape.

“Initially, I was going to deal with the train crash and the finale,” Libreri says, “and [ILM vfx supervisor] Russell Earl, the creature stuff. But in the last couple months, Dennis [Muren] came in to help us find that classic 1979 visual-effects look. Obviously, we didn’t have real trains crashing. Every time you see a train carriage moving, it’s computer-generated. And the film definitely has J.J.’s style. But we tried to pay homage to the Spielberg look of the classics like Close Encounters and E.T. We referenced classic miniature photography of the period to see how the lighting and lens artifacts such as flares and highlight glows used to look. It wasn’t apples-to-apples. It was more trying to capture what you remember it looking like. We wanted that vibe.”

In addition, Abrams and the effects crews often used practical effects to help the young actors deliver authentic performances. “The children weren’t super-experienced, so J.J. would shoot with three cameras at once and let off a lot of practical bangs and flash bangs – the more the better – so he would get a believable feel to the sequences. We would insert CG to tie in with the practical effects.”

Train Crash

The crew filmed the train sequence at Firestone Ranch in Agua Dulce near Los Angeles, an open area surrounded by dry California hills. The production designer brought in grass seeds and watered a 600-by-600-foot area to emulate the grassy look of an Ohio landscape in summer. Then, the production crew built a length of train track and a train station. “When the kids prepare to shoot their movie, that’s mostly in camera,” Libreri says. “A year ago, for the teaser trailer, the pick-up truck and train tracks were practical. But in the movie, the pick-up and the train are all CG, and when you look off into infinity, the background is digimatte [paintings and photographs projected onto simple 3D geometry].”

That was phase one. In the second phase of the train-crash sequence, a train car crashes through the station. “The train-station destruction is practical,” Libreri says. “Our special effects team led by Steve Riley rigged a green sled on rails to explode through the station.” In post, effects artists added a computer-generated train carriage to replace the green sled.

The train has 50 carriages that jackknife off the tracks, fly into the air, and create never-ending explosions. “The train crash goes on forever,” Libreri says. “Scanline created a majority of these shots. Every time you see a train carriage moving, it’s computer generated, and one of the challenges was animating 65 feet of train with pure animation. We roughed out the sequence with simulation and then augmented it with keyframe animation to get a signature movie look, which is bigger and more exciting than it would be in the real world.” Keyframe animation moved hero elements that had to play to camera; simulation handled secondary debris and shrapnel.

In phase three, the after-crash, the art department dressed the area with debris, train carriages, shrapnel knocked off trees, and so forth. The special-effects team rigged explosions, poppers, and sparks. “J.J. had the kids safely run their paths through the crash site whilst the pyrotechnics were being triggered,” Libreri says. “The explosions terrified the kids (in a fun way) and we got some great performances.”

Then, Abrams picked his angles and assembled the cuts, and the effects crews replaced static elements with more exciting CG mayhem, rotoscoping the kids out of the dark environment as needed. “Two-thirds of the explosions and smoke tied to a train carriage exploding are CG, created at Scanline,” Libreri says. “The rest are tied to practical effects in the plate or practical elements shot at ILM.”

[Image caption: The movie projector presented a VFX challenge]

First Reveal

The first time we see the creature, it climbs onto a train carriage in the footage Charles’ abandoned Super-8 camera shot during the train crash melee. The creature doesn’t fully appear until the end of the film, a decision made to emulate the way filmmakers used animatronics sparingly three decades ago. “When he does appear, he looks more real than an animatronic, but we get that same vibe,” Libreri says. The creature never talks, but in a scene near the end, it delivers a modern-day emotional performance.

The tricky part of this scene, though, is that as the kids watch the footage, they walk between the projector and the screen. Distorted images play across their faces as they move, an effect created at ILM. “As soon as they walk into the projector’s beam, the shot becomes really complicated,” Libreri says. “Way more complicated than you might think.”

When Abrams and Libreri considered how to shoot the scene, they knew the creature wouldn’t be finished in time, and it would be many months before the train crash was done. Since they wouldn’t have any footage to run in the projector, they couldn’t capture the scene in camera. “J.J. asked, ‘How the hell can we do this?’,” Libreri says. “So, I came up with a technique.”

Before introducing the idea to Abrams, Libreri tested it using a still camera at ILM. “We know how light interacts with objects,” he says. “What it does. I knew that white light multiplied by the color of the light from the projector would produce the correct result for light being projected on a surface.”

Libreri had electronics guru Steve Switaj replace the tungsten lamp in the Super 8 projector that the kids would use on set with a bright LED light source that could switch on and off. “Instead of shooting the sequence at 24 fps, we shot it at 48 fps,” he says. “On alternating frames – the even frames – the projection LED light would go on, the screen would be illuminated, and if the kids walked into the beam, they would be illuminated.” On odd frames, the LED light was off and only ambient light lit the room.

On the projection screen, the crew put a green card that the compositing team would replace with a projected image in post. Then, they filmed the sequence using an extra pair of witness cameras. Match modelers would later use footage from these cameras to create 3D models of the kids for per-frame animation.

“Everyone on set said, ‘We don’t understand,’” Libreri says. “I’d say, ‘Trust me. It will absolutely work.’”

In post, the visual-effects crew split the 48 fps sequence into two 24 fps sequences, one with the projector on (the LED light), and the other with the projector off (only ambient light). “That made it possible to subtract one image from the other mathematically,” Libreri says. “And that gave us the light coming from the projector alone.”

“Imagine looking at a kid standing in front of the projector,” he continues. “In one frame the light is ambient. In the other, the projected light is on. If you subtract the ambient from the frame with the projected light, you get pure white light coming from the projector.”
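As a rough illustration of the arithmetic Libreri describes (not ILM’s actual compositing setup), the split-and-subtract step might look something like the following sketch, assuming linear-light frames stored as NumPy arrays and that the even-numbered frames are the LED-on frames:

```python
# A minimal sketch of the split-and-subtract step described above. Names and
# the on/off frame parity are illustrative assumptions, not ILM's pipeline.
import numpy as np

def isolate_projector_light(frames_48fps):
    """frames_48fps: list of HxWx3 float arrays shot at 48 fps, alternating
    projector-LED-on and projector-LED-off exposures."""
    lit = frames_48fps[0::2]      # LED on: ambient light + projector light
    ambient = frames_48fps[1::2]  # LED off: ambient light only

    projector_only = [
        # Light adds linearly, so subtracting the ambient-only frame leaves
        # just the (white) light contributed by the projector beam.
        np.clip(on - off, 0.0, None)
        for on, off in zip(lit, ambient)
    ]
    return ambient, projector_only  # two 24 fps sequences
```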

Then, match-imation animators replaced the children in the footage with grayscale 3D models. “The model didn’t need texture because we projected the image from the fake footage onto the models,” Libreri says. “We’d render the image projected onto the 3D model from the point of view of the camera shooting the kids.” The features on the faces of the models, which matched the actors’ faces, distorted the projected image.

“So then, we take the white light hitting the kids from the projector and multiply it by the distorted image on the kids’ faces,” Libreri says. “We know that light is an additive process in the real world. The base color is the white light. The color comes from the CG projection. It’s a really neat idea that, although super-complex to explain, produced seamless results in the final movie.”
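Put together, the final composite reduces to a simple expression. The sketch below is again purely illustrative, not ILM’s comp graph: it tints the isolated white projector light with the CG render of the Super-8 footage cast onto the grayscale models, then adds the ambient plate back on top.

```python
# Illustrative recombination step. Each argument is a list of HxWx3 float
# arrays in linear light, one per 24 fps frame; projected_renders are the CG
# renders of the Super-8 footage cast onto the grayscale stand-in models,
# seen from the shooting camera's point of view.
def composite_projection(ambient_frames, projector_frames, projected_renders):
    return [
        # Multiplying tints the white beam with the footage's colors;
        # adding restores the room's ambient lighting (light is additive).
        amb + beam * render
        for amb, beam, render in zip(ambient_frames, projector_frames, projected_renders)
    ]
```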

[Image caption: VFX played a large role in the film's climax]

Spaceship

The second complicated sequence that Libreri singles out happens during the finale, in which the creature builds a giant spaceship. In an earlier scene set in “Area 51,” we see scientists crowded around a ship that disintegrates into tiny, intricate cubes. “We did that simulation in PhysBAM with the help of Ron Fedkiw and Mike Lentine at Stanford,” Libreri says.

The alien’s giant spaceship is formed from millions of cubes that fly through the sky and land on the rough surface of the town’s old water tower, which has become encrusted with bits of old TV sets, cars, pieces of cars and other stuff. All these cubes and pieces blend into a smooth-surfaced, fully-formed ship. As the creature enters the spaceship, the machine powers up and prepares for launch.

“Bruce Holcomb led the modeling team that built the structure,” Libreri says. “They created all these components – the thruster unit, wings, doors, cockpit, all the mechanical components to create a fully formed ship. Then, they disassembled it in reverse order so we ended up with floating components. Imagine the ship pulling apart in zero gravity.”

Using shattering algorithms, effects artists broke each of those pieces into two-inch cubes. “Depending on the complexity of the shot, some cubes could exhibit dynamics and hit each other,” Libreri says. “In most shots they are treated as particles. But they aren’t single dots in space. They have rotation and motion. We simulated them moving backwards through space into the containers that they came from, and then reversed the whole thing. In the finale, the cubes sort of flock together. Some are on a crazy, jittery path. Some have a fluid path. We used procedures in Zeno [ILM’s proprietary pipeline software] to animate the distant stuff randomly based on size and distance with noise on top so they’d look agitated. Hilmar Koch led that effort, and Paul Kavanagh’s team animated the pieces close to camera. It was an incredibly complicated orchestration.”
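Zeno is proprietary, so the details aren’t public, but the layered motion Libreri describes (a smooth path toward each cube’s slot in the ship, with size- and distance-dependent jitter on top for the distant pieces) could be sketched along these lines; every name and scale factor here is an invented assumption, not ILM’s setup:

```python
# A speculative, simplified sketch of noise-on-top-of-a-path cube motion.
import numpy as np

rng = np.random.default_rng(1979)

def cube_position(start, slot, t, size, dist_to_camera):
    """start, slot: 3-vector float arrays; t in [0, 1] is assembly progress.
    Distant, small cubes get more agitation, which fades as they dock."""
    base = (1.0 - t) * start + t * slot                  # smooth flight path
    agitation = 0.02 * dist_to_camera / max(size, 1e-3)  # more jitter far away
    jitter = rng.normal(scale=agitation, size=3) * (1.0 - t)
    return base + jitter
```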

In addition, modelers created all the debris that flew onto the space ship as it formed. “We keyframed anything close to camera because J.J. wanted it to feel like the creature selected the things he needed and discarded pieces he did not,” Libreri says. “Sometimes a thing would drop down because he didn’t need that piece.”

The spaceship builds itself on top of a water tower at a practical location, a large environment with lighting that varied from one end of the street to the other. “We shot light probes, HDRI, for the street lighting, but because of the dynamic nature of the set lights they wouldn’t work out of the box,” Libreri says. “We had to decide on a shot-by-shot basis where to place all these animating lights.” By the time the lighting team added global illumination and ray tracing on a massive amount of geometry that included the ship, they ended up with shots that took up to 24 hours to render on 12-core machines.

“We have tons of CG in the final result, but this isn’t one of those movies where we’re trying to break the sound barrier on visual effects,” Libreri says. “It’s all in support of the story. It was a lovely movie driven by the story of its characters. We tried not to get in the way. The kids’ performances are the stars of the show.”