4K Was Just Too Small for Those Large-Gauge IMAX Sequences

Creating seamless, realistic digital effects for director Chris Nolan is always a tall order, but doing so at IMAX resolution was an added challenge for the post-production studios that worked on Warner Bros.’ box office hit The Dark Knight. “Chris [Nolan] has a very astute eye,” says Nick Davis, the film’s visual effects supervisor. “It was important for the visual effects to look as though they were shot on set. He didn’t want the film to take on a gimmicky visual-effects look.” In keeping with that philosophy, Nolan shot footage for the major visual-effects sequences with IMAX cameras, making The Dark Knight the first feature film shown in IMAX with footage originated in the large-gauge format rather than blown up from 35mm.
“By shooting in IMAX format, we could fill the whole frame, and the images are sharper and have more detail,” Davis says. “But the vendors had to work at much higher resolution and, logistically, it was a difficult movie to finish.”

Visual-effects post-production houses Double Negative, Framestore CFC, BUF, and Cinesite worked on the film’s roughly 700 visual-effects shots. Framestore CFC’s 200-plus shots centered on a major IMAX sequence set amid the tall buildings of Hong Kong, and on digital makeup for Harvey Dent, who appears in 112 shots as Batman’s nemesis Two-Face, none of them IMAX shots. BUF created the opener in IMAX resolution and Batman’s “sonar vision,” which appears primarily during the Prewitt Building sequence. Cinesite helped with rig removal, wire removal, and 2D cleanup for the complicated mechanical effects.

The lion’s share of the effects, including all the sequences in Gotham City, went to Double Negative. Of their 370 shots, 170 take place during three major IMAX sequences: the opening bank heist, the extravagant armored-car chase, and a hostage rescue in the Prewitt Building. In addition, Double Negative worked on a few IMAX shots during the Gotham City evacuation. “The visual effects go into the cracks, forming backgrounds and foregrounds,” says Paul Franklin, visual effects supervisor at Double Negative. “They’re always there to support the images in a scene. Never the reason to do the scene.”

To support the IMAX scenes, the studios could not work in full IMAX resolution, which is theoretically 18K; instead, the target resolution was approximately 8K, the practical maximum for film scanning. Even that was difficult. “A single 8K frame requires 200 MB of data,” Franklin says. “So we had to upgrade our whole infrastructure. We needed faster network speeds to move data around, massively beefed-up servers, and – the most important thing – a new compositing solution.”
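
Franklin’s figure is easy to sanity-check. The sketch below runs the arithmetic, assuming 10-bit log DPX storage (three 10-bit channels packed into one 32-bit word) and IMAX 15/70’s roughly 1.43:1 aspect ratio; neither detail is specified in the article.

```python
# Back-of-the-envelope check on the "200 MB per frame" figure.
# Assumptions (ours, not the studio's): 10-bit DPX packing, ~1.43:1 aspect.

BYTES_PER_PIXEL = 4      # three 10-bit channels packed into one 32-bit word
IMAX_ASPECT = 1.43       # approximate IMAX 15/70 frame aspect ratio

def frame_megabytes(width_px: int) -> float:
    """Uncompressed size of one full frame, in megabytes."""
    height_px = round(width_px / IMAX_ASPECT)
    return width_px * height_px * BYTES_PER_PIXEL / 1e6

print(f"8K frame: ~{frame_megabytes(8192):.0f} MB")
# ~188 MB, in the ballpark of the quoted 200 MB; at 24 fps, a single
# second of IMAX footage is already on the order of 4.5 GB.
```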

Shake Up

The studio had upgraded its machines to 64-bit operating systems about a year earlier, but Double Negative’s compositing software of choice, Apple’s Shake, still operated in 32-bit memory space. “Shake falls over with images at much more than 3K resolution,” Franklin says. “Fortunately, we have a strong R&D team.”

Because the studio had Shake-experienced artists on the crew and existing Shake-based software and workflows, the R&D department created an additional application that worked in conjunction with Shake to get around the memory limitations. “It’s a separate, standalone set of tools, not a plug-in,” Franklin says. “A kind of overlay. Users could transition to the new software easily.”
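
The article doesn’t reveal how the overlay tools worked internally. As a generic illustration only, one common shape for this kind of workaround is a memory-limited host handing full-resolution operations to a separate 64-bit worker process; every name below is hypothetical.

```python
# Hypothetical sketch of the "overlay" pattern: a 32-bit host delegates
# full-resolution image work to an external 64-bit helper and reads back
# the result. This is NOT Double Negative's actual tool.
import subprocess

def run_in_worker(operation: str, input_path: str, output_path: str) -> None:
    """Hand one heavy image operation to a stand-in 64-bit helper binary."""
    subprocess.run(
        ["hires_worker", operation, input_path, output_path],  # name invented
        check=True,  # raise if the helper fails, so the host can recover
    )
```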

New blade servers increased data-storage space and improved data transfer rates. “In 1999, when we worked on Pitch Black [released in 2000], we needed to access 2 TB of data,” Franklin says. “This show used over 100 TB of data.”

Additional acreage in the render farm amped up the processing power. “For Batman Begins, we had 200 RenderMan licenses, each of which ran on dual-processor machines,” Franklin says. “For this film, we beefed up to 900 licenses, each running on four processors – 3600 processors. The limiting factor was how much power we could get into the building.” The new processors’ lower power consumption requirements released enough of the energy budget to accommodate the additional servers. However, a larger crew and more artist workstations still tipped the studio toward a power wall. “We tripped the fuses twice, but there were no other outages.”

With the new infrastructure, the 64-bit compositing solution, and efficient load-distribution techniques for the 3D pipeline in place, the crew wrangled a small number of shots for the opening bank-heist sequence, which they delivered in October 2007, six months ahead of the rest of the film. Warner Bros. released that sequence as a prologue in front of IMAX screenings of I Am Legend.

“It was our test sequence for the pipeline,” Franklin says. “We did those shots at 8K. After that, we went down to 5.6K except for high-contrast shots where the camera lingers on vehicles and architecture.” The lower resolution reduced the data requirements to approximately 160 MB per frame.

As for the content in the shots, to support high-resolution IMAX close-ups of Gotham City, Double Negative artists overhauled all the buildings they had picked up from Batman Begins, adding geometric detail and repainting all the textures. In addition, the studio modeled the Prewitt Building from scratch for IMAX shots, using reference filmed on location at the Trump International Hotel and Tower, then under construction in Chicago.

“We had to create more environments from scratch and had more matte paintings than in Batman Begins,” Franklin says. “We also built the digital Batpod and Batmobile, and all those assets had to hold up in full-frame, harsh lighting conditions. Things we could get away with at 2K wouldn’t cut the mustard on Dark Knight.” In addition to the higher-resolution format pushing the need for detail, the camera moved closer to the digital architecture, vehicles, and stunt doubles.

Ground Work

“I think Chris [Nolan] had greater confidence this time,” Franklin says. “For example, they attempted to do the shot of the Batmobile turning into the Batpod as a physical effect with rigs. But Chris wasn’t getting the speed and energy he wanted, so we did a digital version.” In shots with the Batpod racing straight ahead, the bike and rider are real; when Batman turns the bike, lays it down flat, or climbs a wall with it, both are digital, as is (sometimes) the road beneath the bike’s wheels.

“We tracked the Batpod into live-action plates filmed in Chicago, but in some cases we needed a digital road for new camera moves,” Franklin says. “The only way was to photograph the road in excruciating detail. We recreated a quarter mile of LaSalle Street and 250 yards of Lower Wacker Drive to one-centimeter resolution.”
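
Those numbers imply a hefty texture budget. A rough calculation, with the street width assumed since the article gives only the length and the one-centimeter sample density:

```python
# Rough texel budget for a road captured at one-centimeter resolution.
# The 20 m street width is an assumption for illustration.
MILE_M = 1609.34
road_length_m = MILE_M / 4          # "a quarter mile of LaSalle Street"
road_width_m = 20.0                 # assumed width of the street
texels_per_m = 100                  # one-centimeter resolution

length_tx = round(road_length_m * texels_per_m)    # ~40,000 texels long
width_tx = round(road_width_m * texels_per_m)      # 2,000 texels across
print(f"{length_tx} x {width_tx} texels, "
      f"~{length_tx * width_tx / 1e6:.0f} megatexels for the surface alone")
# An 8K map covers only ~82 m of road at this density, so the quarter mile
# needs on the order of five 8K-wide texture strips.
```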

For tracking, the studio uses boujou, PFTrack, 3DEqualizer, and a proprietary photogrammetry toolset. “We can give it point tracks from any source and it will give us a solve for the camera,” Franklin says of the proprietary software. To match the IMAX camera lenses, Double Negative primarily characterized Zeiss lenses from Hasselblad cameras. “The difference is that the IMAX camera uses proprietary mounts to fit the lens to the camera body,” Franklin says. “We shot lens-specific distortion grids, but did all the other characterizations with the still lenses.”
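
A distortion grid is shot to characterize how a lens bends straight lines. As a sketch of the idea, and not of Double Negative’s proprietary solver, here is the generic Brown-Conrady radial model such grids are typically fitted to:

```python
# Generic radial lens-distortion model of the kind a grid chart is used to
# characterize. Not the studio's solver; coefficients come from fitting
# observed grid points against their known ideal positions.
import numpy as np

def distort(xy: np.ndarray, k1: float, k2: float) -> np.ndarray:
    """Map ideal points (normalized around the lens center) to where the
    lens actually images them."""
    r2 = np.sum(xy**2, axis=-1, keepdims=True)   # squared radius per point
    return xy * (1.0 + k1 * r2 + k2 * r2**2)

# Fitting k1 and k2 to grid observations characterizes the lens; applying
# the same distortion to CG renders makes them sit correctly in the plate.
```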

To create the road, they photographed the surface at 5K resolution with Canon EOS-1DS Mark II still cameras and LIDAR-scanned the surface to produce a terrain map. “With the 1DS Mark II, we could shoot many times faster than before,” Franklin says. “We’d wait for a gap in traffic or have the police stop traffic for 10 minutes. For Batman Begins, we had to wait for nighttime. It was the same for photographing the buildings. We could wait for the light to be flat rather than wait for the buffer to empty.”

Matte painters worked in 8K resolution, and the artists painted texture maps in either 8K or 16K resolution, depending on the view. “That was a bottleneck,” Franklin says. “Photoshop doesn’t handle images above 4K very efficiently and it’s a closed tool, so we couldn’t get in there and add stuff to it. Working with Photoshop was possible, but slow. It took three or four times longer than usual to paint the textures.”

Rendering was less of a problem. To optimize render times, the studio rewrote shaders and created a new tool called Spangle to preview RenderMan renders in realtime. “It renders on NVIDIA cards,” Franklin says. “It doesn’t impact the render farm.” In addition, a 3D relighting system allowed compositors to change the direction of lights, if needed, without going back into the 3D pipeline.
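
Relighting in the composite generally works from extra render passes. A minimal Lambertian sketch of the idea, assuming per-pixel normal and world-position passes are available (the article doesn’t describe the studio’s actual system):

```python
# Minimal diffuse relighting from render passes (AOVs): given per-pixel
# normals and world positions, a compositor can move a point light without
# a new 3D render. Illustrative only.
import numpy as np

def relight(albedo, normals, positions, light_pos, light_color):
    """Return a diffuse pass for a point light at `light_pos`."""
    to_light = light_pos - positions                       # H x W x 3
    to_light /= np.linalg.norm(to_light, axis=-1, keepdims=True)
    n_dot_l = np.clip((normals * to_light).sum(-1, keepdims=True), 0.0, 1.0)
    return albedo * light_color * n_dot_l
```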

What You See (Not)

A second big bottleneck, though, was in viewing the images. “Our biggest monitors are 2K,” Franklin says. “You can’t realistically buy a 5.6 x 3.6K monitor, and the highest-resolution digital projector is about 4K.” So the studio wrote a set of tools that extracted 2K tiles from the images for the artists to view, but for dailies, they sent files from their London-based studio to DKP 70mm, the IMAX-subsidiary post facility in Los Angeles, for recording onto film stock.
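
A tile extractor like the one described can be sketched in a few lines; the version below is a generic illustration, not the studio’s code:

```python
# Carve 2K tiles out of an oversized frame so artists can inspect it at
# 1:1 pixel scale on a 2K monitor. Generic sketch, not production code.
import numpy as np

def view_tiles(frame: np.ndarray, tile: int = 2048):
    """Yield (x, y, crop) covering the whole frame in 2K pieces."""
    height, width = frame.shape[:2]
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield x, y, frame[y:y + tile, x:x + tile]
```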

“It was a minimum of 10 days before we saw the shots back in the UK,” Franklin says. What’s more, to view the shots, they had to book time at London’s only IMAX theater. “And you can’t rock and roll on IMAX projectors,” he adds. “You have to rewind and go again, so we had only a couple chances to view the output.”

As a result, instead of doing IMAX dailies, they did tests to establish the end result and made assumptions based on those tests.

“It was like 10 or 15 years ago, when it was difficult to view 2K frames on digital monitors,” Franklin says. “But this is what 3D artists in particular have always done. We’re used to looking at low-res images during the process.”

Framestore CFC VFX supervisor Tim Webber echoes Franklin’s analysis. “Most of our problems came from the process,” he says. “We expected it to be a little harder, but it was way harder. These days, we work with people all over the world, send images digitally and get feedback. But with physical film, there’s a big delay. And matching the color space was harder because they didn’t do a digital grade, so we had to match the IMAX output recorders. They still use CRT recorders, which have a different color response than the laser output recorders most people use today, so we had to put work into that.”
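
Matching one recorder’s response to another typically comes down to measured lookup tables. A simplified per-channel sketch, with invented response curves standing in for real calibration data:

```python
# Simplified recorder matching: measure both devices' responses to the same
# ramp of code values, then remap. The power curves below are invented
# stand-ins for real measured data.
import numpy as np

code = np.linspace(0.0, 1.0, 17)   # ramp of code values sent to each device
crt_response = code ** 1.1         # stand-in: measured CRT recorder output
laser_response = code ** 0.95      # stand-in: measured laser recorder output

def remap_for_crt(values: np.ndarray) -> np.ndarray:
    """Adjust values targeted at the laser recorder so the CRT prints alike."""
    desired = np.interp(values, code, laser_response)  # what laser would give
    return np.interp(desired, crt_response, code)      # code the CRT needs
```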

Rather than a DI, director Nolan and DP Wally Pfister used their preferred photochemical process. “Wally would go into the lab and call the lights and they’d print it,” says Franklin. “There was no real-time, hands-on grading.”

Scoping It Out

In addition to the IMAX shots, the studios had to produce versions of their shots for anamorphic 35mm prints as well. [See “Marrying IMAX and 35mm in The Dark Knight.”] For the all-digital sonar-vision shots, BUF created separate IMAX and scope images. “Chris wanted two different shot values,” says BUF visual effects producer Alain Lalanne. “They tell the same story, but because of the size of the images in IMAX, we had to do things differently. So we had to find technical ways to make all the versions and compute them. It took a lot of organization.”

Double Negative and Framestore CFC, however, could lift scope images from the IMAX footage. “We worked at IMAX resolution all the way through and then did a 2.40 extraction,” Franklin says, “taking the cinemascope slug out of the larger picture, and then downresing to 4K. We initially started working in 2K for cinemascope, but decided that since we had the resources to do IMAX at 5.6K, we’d do the scope at 4K.”
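
The extraction itself is simple arithmetic; the frame sizes below are assumptions, since the article gives only “5.6K” and “4K” as working widths:

```python
# A 2.40 cinemascope extraction from an IMAX-proportioned frame.
# Dimensions assumed for illustration; racking is assumed centered.
imax_w, imax_h = 5616, 4096        # assumed 5.6K working frame
scope = 2.39                       # the "2.40" cinemascope aspect ratio

crop_h = round(imax_w / scope)             # height of the scope band: ~2350
y0 = (imax_h - crop_h) // 2                # top row of a centered extraction
print(f"extract {imax_w}x{crop_h} at row {y0}, "
      f"then downres to 4096x{round(4096 / scope)}")
```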

Even though editorial provided extraction guides, cutting the correct shot out of the IMAX scans was tricky. “The telecine operators set up the racking for the color cine transfers to give editorial a safe image for projection, but we got a scan of the whole negative, so there was a discrepancy,” Franklin says. “Tiny things like that wouldn’t normally make much difference in 2K resolution, but everything in IMAX gets magnified.”
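
The magnification Franklin mentions is easy to quantify: the same framing error covers proportionally more pixels at scan resolution. The pixel counts below are illustrative:

```python
# The same physical racking offset spans ~2.7x more pixels at 5.6K than at
# the 2K telecine frame editorial worked from. Numbers are illustrative.
telecine_w, scan_w = 2048, 5616
offset_2k_px = 4                    # hypothetical offset seen in the 2K frame
offset_imax_px = offset_2k_px * scan_w / telecine_w
print(f"{offset_2k_px} px at 2K becomes ~{offset_imax_px:.0f} px at 5.6K")
```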

Franklin believes the lessons learned on this film will help the studio in the future – even if it never again works at 5.6K or 8K resolution. “Filmmakers increasingly want us to work in 4K, and that used to be a big deal,” he says, and then laughs. “After working at 5.6K and 8K resolution, 4K is proxy resolution.”