Shooting the New Nissan Maxima on the Red, at 4K - and In a Bubble

When M2G Media in Irvine, CA, was hired by a global marketing firm to create a video piece introducing the new design for the 2009 Nissan Maxima at the New York Auto Show, M2G producer Jeff Granbery saw Red. It was the Red camera system shooting 4K imagery, he decided, that would give M2G the best shot at delivering a compelling, picture-perfect video on a tight deadline. The Red-centric workflow brought advantages to the shoot that went beyond the obvious resolution win – the extra pixels were crucial to the project’s success, but having RedCine on set meant everyone’s confidence level went up during production.

Watch the video and a behind-the-scenes presentation provided to us by M2G Media, below, then read about how they did it.

“Our number-one problem was that we had seven and a half weeks to deliver this, and that included shooting,” Granbery tells F&V. “We had to come up with a list of tools very quickly. If we’ve only got one shot at this, what’s going to give us the best ability to get through this?”

The project, shot and edited at 23.98 fps and helmed by director/DP Craig Barker, illustrated a dual personality – the Maxima is positioned as part sports car, part luxury sedan. Granbery knew about a company called Photobubble, which makes what are essentially large, inflatable soft boxes, illuminating a subject with diffused light without distracting background objects or reflections. Granbery asked Photobubble’s Allan Wachs about building one that was half white and half black.

“So we had an L-shaped white and an L-shaped black, which, when you think about it, gave me four studios,” Granbery says. “I have a black cove, a white cove, 200 running feet of white and 200 running feet of black. It’s four stages in one. My entire world was a smooth, uncornered, no-trees, no-buildings, cylindrical unit. If I had done that up in L.A., my costs would have been through the roof. We had a decent budget – but we didn’t have an ad budget.” Set up in a hangar at the old Tustin Air Base in Tustin, CA, the Photobubble also provided an added layer of security, on the off chance that a photographer was hanging out nearby, hoping to grab an illicit snap of the top-secret design. The smooth, seamless environment reduced touch-up work on the finished project – removing reflections or light casts on the car – by almost 60 percent.

The project was destined for HD exhibition on a 40-foot wide video screen. Because time was of the essence, Granbery knew the 4K footage from the Red camera would offer the most visual flexibility. With the right technical know-how and a bit of artistry, it would be possible to crop, zoom and pan across areas of the 4K image at HD resolutions, easily creating virtual camera moves that would require lots of time and finesse to set up on location. “Let’s say you only have enough time to shoot the whole dashboard, and then the client says, ‘Boy, I really need a close-up shot of the navigation system,’” Granbery explains. “We could crop in 200 percent and have no [quality] loss, because our actual output is in HD resolution. So that became very attractive.”
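Granbery’s arithmetic checks out. Here is a minimal sketch of the crop math, assuming the RED ONE’s 4096×2304 16:9 raster and a 1920×1080 deliverable (the article doesn’t spell out the exact recording format):

```python
# Back-of-the-envelope check on the "crop in 200 percent" claim.
# Assumes a 4096x2304 16:9 acquisition raster and a 1920x1080 deliverable;
# neither number is quoted by M2G directly.

SOURCE_W, SOURCE_H = 4096, 2304   # 4K source frame
TARGET_W, TARGET_H = 1920, 1080   # HD output frame

def max_lossless_punch_in():
    """Largest zoom factor at which the HD window is still cut from
    original pixels rather than upscaled ones."""
    return min(SOURCE_W / TARGET_W, SOURCE_H / TARGET_H)

def crop_window(zoom, center_x, center_y):
    """(x, y, w, h) of the source crop that simulates a punch-in at `zoom`,
    clamped so the window never leaves the 4K frame."""
    w, h = SOURCE_W / zoom, SOURCE_H / zoom
    x = min(max(center_x - w / 2, 0), SOURCE_W - w)
    y = min(max(center_y - h / 2, 0), SOURCE_H - h)
    return x, y, w, h

print(f"max lossless punch-in: {max_lossless_punch_in():.2f}x")   # ~2.13x
print(crop_window(2.0, 2048, 1152))   # a 200% punch-in centered on the frame
```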

Robb Hart of VFX firm An Ideal World in Santa Ana, CA, was on set to serve as VFX supervisor, but he and co-worker Sharon Diaz were also managing the Red workflow, making sure work proceeded smoothly and efficiently. Hart was sold on using the Red camera from square one. “The idea of being able to shoot 4K for this kind of project was staggering,” he says. “We were shooting from a camera car with a Russian arm beside this Maxima as it raced around inside the bubble. The shots were meant to be wild and aggressive. And you might get a shot that is beautiful, but really needs to be reframed.”

Hart remembers a shot the director wanted where the camera would move in, close up, to the front of the car, and then pan all the way to the rear wheel as it’s racing past. Even if it had been possible to execute that move in camera, it would have burned up time that the production didn’t have the luxury of wasting. “We made that shot out of a wide shot,” Hart says. “We had the whole car, and we were able to punch in to the front wheel, create our pan to the back wheel, and even make it look a little bit more like a real camera by overshooting it a little bit and then coming back. Bringing cinematography into post is radical. And it’s only possible when you’ve got that much extra resolution to play with.”
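The “overshoot and settle” Hart describes maps naturally onto a keyframed crop center driven by an eased curve. A hypothetical sketch, with made-up wheel positions and frame counts rather than An Ideal World’s actual numbers:

```python
# Hypothetical sketch of a post-built pan: animate the HD crop window's
# center from the front wheel to the rear wheel, deliberately overshooting
# the end mark and settling back so the move feels hand-operated.
# All coordinates and frame counts are invented for illustration.

SOURCE_W, SOURCE_H = 4096, 2304
CROP_W, CROP_H = 1920, 1080

FRONT_WHEEL_X = 1200   # crop-center x at the start of the move
REAR_WHEEL_X = 3000    # crop-center x at the end of the move
DURATION = 48          # frames (about two seconds at 23.98 fps)

def ease_out_back(t, overshoot=1.70158):
    """Standard easeOutBack curve: rises slightly past 1.0, then settles."""
    c3 = overshoot + 1
    return 1 + c3 * (t - 1) ** 3 + overshoot * (t - 1) ** 2

def crop_x(frame):
    """Left edge of the HD crop window at a given frame of the move."""
    t = frame / (DURATION - 1)
    center = FRONT_WHEEL_X + (REAR_WHEEL_X - FRONT_WHEEL_X) * ease_out_back(t)
    # Clamp so the window never leaves the 4K frame.
    return min(max(center - CROP_W / 2, 0), SOURCE_W - CROP_W)

for f in (0, 12, 24, 36, 47):
    print(f"frame {f:2d}: crop x = {crop_x(f):7.1f}")
```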

It was also possible in part because the crew had immediate access to the actual footage from the camera. Flash memory cards from the Red camera were ingested by Diaz, who made two copies of the footage on FireWire drives and was able to apply color-correction looks or rough zooms and pans on the spot using a MacBook Pro loaded with RedCine. She could play back 1K proxy footage or take a look at 4K frames to evaluate focus and crispness of the image. The team used bicycles to carry material around the hangar.
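The article doesn’t detail Diaz’s offload procedure beyond the two FireWire copies, but the practice it reflects is straightforward to sketch; the paths, drive names and checksum verification below are assumptions for illustration only:

```python
# Minimal sketch of a two-copy card offload with verification, in the
# spirit of the on-set procedure described above. Mount points and the
# checksum step are assumptions; the article only says two FireWire
# copies were made.

import hashlib
import shutil
from pathlib import Path

CARD = Path("/Volumes/RED_CF_001")                        # hypothetical mounted CF card
BACKUPS = [Path("/Volumes/FW_A"), Path("/Volumes/FW_B")]  # two FireWire drives

def sha1(path, chunk=1 << 20):
    """Hash a file in 1 MB chunks so large clips don't sit in memory."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def offload(card, backups):
    """Copy every file on the card to each backup drive and verify it."""
    for src in card.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(card)
        source_hash = sha1(src)
        for drive in backups:
            dst = drive / card.name / rel
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            # Re-hash the copy; only a verified copy counts as a backup.
            assert sha1(dst) == source_hash, f"verify failed: {dst}"

offload(CARD, BACKUPS)
```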

“We actually took the footage from the camera, walked back to Robb and Sharon, put the data in, and did color-correction and post correction on the set in real time,” Granbery says. “Instant client satisfaction. It was amazing. It was like we picked up the lab and put it in a computer.”

The system also immediately revealed any problems with the footage. On the first day, the team realized that the very nice zoom lens it had rented – a favorite on 35mm shoots – suffered from older glass, lacked fluoride coating, and added a tinge of green to the footage. (It was replaced with a newer prime lens, to great effect.) The gaffer (and owner of the Red camera), Rich Schaefer of High Impact Pictures, was able to make lighting decisions based on a precise knowledge of what highlights looked like in the footage. If there was too much motion blur in a given shot, the driver would slow the car down for the next take. And so on.

Hart says that’s a far cry from the norm on previous shoots, where he relied on video taps to evaluate 35mm images, or SD downconversions from HD cameras. “Now that there’s the AJA Io HD, we’ll consider that for HD shoots,” he says. “But I get the feeling that we’re pretty much sold on Red. The cost differential of shooting Red over HD is not that great, especially when you look at the fact that you don’t need to rent a $100,000 HD deck to digitize the stuff. We walk in with a couple of FireWire drives, and we’re ready to get going.”

Of course, you don’t want to shoot with a Red and then make stupid mistakes in post that end up reducing the effective resolution or color fidelity of your footage. Hart maintained a DPX pipeline until it was time to create the final, HD deliverable in Final Cut Pro. “Working in trillions of colors is ironic for a job that was almost entirely in black and white,” he says, “but when you looked at the ruby red of the speedometer in Rec. 709 [color space] you could see how incredible the colors really were. In RGB color space, it was a little disappointing.”
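“Trillions of colors” is the usual shorthand for a 16-bit-per-channel pipeline (DPX frames can carry 10 or 16 bits per channel); a quick bit of arithmetic, offered as context rather than a detail confirmed by Hart, shows where the wording comes from:

```python
# Colors representable at common per-channel bit depths. "Trillions"
# points at 16 bits per channel; whether these DPX files were 10-bit
# or 16-bit isn't stated in the article, so treat this as context only.
for bits, label in [(8, "8-bit"), (10, "10-bit DPX"), (16, "16-bit DPX")]:
    print(f"{label:12s}: {(2 ** bits) ** 3:,} colors")
# 8-bit       : 16,777,216             (millions)
# 10-bit DPX  : 1,073,741,824          (billions)
# 16-bit DPX  : 281,474,976,710,656    (trillions)
```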

So the offline edit was done using 2K proxies – Hart says he was a little nervous that somebody would see how good the proxy files looked and decide that it wasn’t necessary to go back to 4K for the color grading, so he circulated small QuickTimes for approvals. “Before, you would look at a rough cut or dailies, but there was a nice wow factor when people saw the final,” he says. “With this pipeline, it looks pretty ‘wow’ from the day of the shoot.”

The project was moved back into RedCine using Crimson, a $190 tool expressly created for bringing an XML file from Final Cut into RedCine. The whole project was graded twice – an “extreme” grade that really made the shots pop, and then a more conservative pass at the material. Next, the DPXs were taken into Shake for compositing, where relatively simple matte techniques could be used to combine the two different grades in a single shot. If a shot still didn’t look quite right, it was taken back into RedCine for a third pass. “We didn’t have to persuade anybody to let us go back to telecine, since we had control of it here,” Hart says. “To have no inhibition about being able to move backward and forward in the production pipeline, because you’re not incurring huge costs – to me, that’s the most revolutionary thing. You don’t necessarily move in a linear way. The old Hollywood model is so old-fashioned – you see dailies, you do your edit, and then go back to your selects – I couldn’t believe that was the way people still wanted to work. This blows that away.”
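Shake is long retired, but the matte technique Hart mentions amounts to a per-pixel mix between two graded versions of the same frame. A rough NumPy sketch, with placeholder file names and formats rather than An Ideal World’s actual comp:

```python
# Illustrative stand-in for the Shake comp: a grayscale matte decides,
# pixel by pixel, how much of the "extreme" grade versus the conservative
# grade ends up in the finished shot. File names, the TIFF format and the
# use of imageio are placeholders; the real work was done on DPX frames in Shake.

import numpy as np
import imageio.v3 as iio

extreme = iio.imread("shot010_extreme_grade.tif").astype(np.float32) / 255.0
conservative = iio.imread("shot010_conservative_grade.tif").astype(np.float32) / 255.0
matte = iio.imread("shot010_matte.tif").astype(np.float32) / 255.0   # 1 = extreme, 0 = conservative

if matte.ndim == 2:                     # promote a single-channel matte to RGB
    matte = matte[..., np.newaxis]

combined = extreme * matte + conservative * (1.0 - matte)   # simple per-pixel mix
iio.imwrite("shot010_combined.tif", (combined * 255.0 + 0.5).astype(np.uint8))
```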

From Shake, HD DPX files were output, using Glue Tools QuickTime components to import them into Final Cut, where the final deliverables were created. For the big screen at the New York Auto Show, the output was in 1920×1080 ProRes HD. But the project was also shown on a 1920×800 screen. “For that one, we output sequential TIFFs, which was an interesting delivery format,” Hart explains. “The ProRes looked great, but we had a Final Cut timeline with the final HD DPX frames, which was as clean as could possibly come from the original source. Whichever way it went, even to SD or small QuickTimes, it looked outstanding.”

The mission was accomplished, Granbery agrees, recalling some feedback he received after an especially high-level screening of the project. “There was an inter-agency meeting where they showed it, almost in its entirety, with Omnicom, Designory – all the big players. And they all said, ‘That matches what we’re doing on the global scale. Congratulations.’

“I got that phone call, and that was a good day. Here we are, a smaller B-to-B-type communications group working on a limited budget for a major player for a trade-show booth, and yet we’re able to output visuals that are agency-comparable at 80 percent less. That was cool. And had it not been for the Red – and our crew – we could not have done it.”


For more information: M2G Media; An Ideal World; Photobubble; Glue Tools; Crimson