All About Crank's 4:2:2 DS Nitris Workflow

The evolution of digital moviemaking continued this year with the end-of-summer release of Crank, a sex-drugs-and-violence-pumped thrill ride starring Jason Statham (The Transporter), who is quickly becoming an icon of this sort of physically blunt, no-stuntman B movie. Injected with an exotic Chinese poison, hit man Chev Chelios (Statham) races across Los Angeles in a hunt for his would-be murderers. The writer/director team of Mark Neveldine and Brian Taylor (billed as Neveldine/Taylor) also operated the cameras, opting for a handheld style and giving editor Brian Berdan plenty of room to move. Working toward a 4:2:2 Avid DS Nitris conform (executed at LaserPacific) that encouraged experimentation in the creative-editing process, Berdan crafted a witty vision of Chelios's desperate adventure, complete with cutaways to Google Maps, reverse-angle views of subtitles, and even an unprintable expletive briefly plastered across the protagonist's forehead. On a more practical note, he color-tweaked some Canon XL2 footage so it wouldn't look radically different from the rest of the movie's HDCAM footage. We talked to him about workflow, technique, and the continuing evolution of film editing.
FILM & VIDEO: First, because it looks like there was a lot going on in the edit on this film, I wanted to ask you about the line between your job and the realm of visual effects.

BRIAN BERDAN: There was a grey area. We knew we were going to do a lot of split screens. And with an HDCAM-shot show, we knew we were going to finish electronically at 4:2:2. I wanted to work in a way that eventually translated from offline to online. I even hate to use the word online, because it sounds so video. That meant going with the Avid DS Nitris or the Symphony Nitris.

Was Final Cut Pro ever considered?

The directors had cut their own stuff on Final Cut, and they intended to set up Final Cut for an HD conform and for previews. But I felt there was no big advantage to that, and there were certain things an “official” video house could help with at the high end of the scale, where a conform would work a lot better. There was no doubt that we wanted to keep everything that happened in the offline consistent with the online. There would be no recreating moves and blow-ups. We chose the DS Nitris rather than the Symphony Nitris so we’d have a little more tweaking ability. Everything translated really smoothly from the offline to the online.

So how much of what you were doing effects-wise in the edit was planned in advance?

We didn’t plan ahead of time how to do any of the bells and whistles. In the script, it said “split screen.” Those were laid out in the offline. Moves don’t always translate to the DS, so some of those had to be recreated. The Google Maps thing was something we came up with when I was on the Mac and found a beta version of Google Maps. I recorded those off my Mac screen using [Ambrosia Software] Snapz Pro. Those are just pure screen captures at quarter-HD resolution. I recorded them in slow motion, so I got lots of frames and no stuttering, and then sped them up in the Avid.
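
For readers who want to rough out that capture-slow-then-speed-up trick outside an Avid, here is a minimal sketch using ffmpeg as a stand-in for Snapz Pro plus the Avid timeline; the speed_up helper, the 4x factor, and the 24 fps target are assumptions for illustration, not Berdan's actual settings.

```python
# Sketch of the "capture slow, then speed up" retiming idea, using ffmpeg as a
# stand-in for the Snapz Pro capture + Avid retime that Berdan describes.
import subprocess

def speed_up(src: str, dst: str, factor: float = 4.0, fps: int = 24) -> None:
    """Retime a screen capture by `factor`, conforming the result to `fps`."""
    subprocess.run([
        "ffmpeg", "-i", src,
        # setpts divides each frame's timestamp, so playback runs `factor` times
        # faster; the surplus frames from the slow capture keep motion smooth.
        "-vf", f"setpts=PTS/{factor}",
        "-r", str(fps),   # conform to the project frame rate
        "-an",            # a screen capture carries no useful audio
        dst,
    ], check=True)

if __name__ == "__main__":
    speed_up("maps_capture.mov", "maps_capture_4x.mov")
```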

It sounds like you had a lot of freedom on this project.

The idea that everything we did in the offline could go into the online seamlessly opened a whole new palette for us. This is the way I’ve always wanted to cut, without worrying about the VFX budget and ordering opticals. I’m used to, “Oh, God, I’ve gotta do a flop here – that’s another $1000.” It was really liberating not to think about that at all.

What, exactly, did you use for your edit?

I was using Media Composer Adrenaline, and also cutting a bit on Avid Xpress Pro.

What about the shots that were heavily color-corrected, or where we see Chev’s environment warping around him?

I used [GenArts] Sapphire plug-ins for a lot of the effects. Some of the footage was shot on the Canon XL2, which is a 24-frame DV camera, and it stood out as lower resolution. On a lot of those, I applied a Sapphire layer look and tweaked the color and contrast to give it some intensity. The shots looked very flat compared to HD, and when he’s running through the city in his hospital gown, the Sapphire plug-ins were invaluable. So many times they gave a zing to the scene. There’s a shot from inside a microwave oven where we added a grid [over the picture] and warped it. The warps were Sapphire warps. There were some freeze-frames of Carlos with a kind of glow look done with the Sapphire. Those all translated perfectly in the conform. I got a list from Sapphire of who in Los Angeles is licensed for the plug-ins, and LaserPacific was one of them.
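
The Sapphire looks themselves live inside the Avid as plug-ins, so the following is only a rough numpy sketch of the kind of contrast-and-saturation lift Berdan describes applying to the flat XL2 footage; the punch_up function and its default amounts are hypothetical, not the settings used on the film.

```python
# Minimal illustration of punching up flat footage so it sits better next to HD:
# push contrast around mid-grey and push each channel away from its luma.
import numpy as np

def punch_up(frame: np.ndarray, contrast: float = 1.2, saturation: float = 1.15) -> np.ndarray:
    """frame: (H, W, 3) float RGB image in [0, 1]. Returns a boosted copy."""
    rgb = frame.astype(np.float32)
    # Contrast: scale distances from mid-grey.
    rgb = (rgb - 0.5) * contrast + 0.5
    # Saturation: move each channel away from the per-pixel luma.
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
    rgb = luma[..., None] + (rgb - luma[..., None]) * saturation
    return np.clip(rgb, 0.0, 1.0)
```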

Did all your edits conform without hassles?

We ran a little test beforehand. We had a couple of vendors conform a couple-minute sequence with a variety of different split screens and other effects to see how well they translated. Ninety-five percent of the stuff was fine. The animattes had to be redrawn, but each was usually just a rough swipe or a highlighted section of the frame, so those were pretty easy to do. The 3D moves didn’t always translate with everything intact; some of them did and some of them didn’t.

Did you test the image quality, too?

Just to see how things looked on the big screen. This was shot on HDCAM SR, which has a 4:4:4 color space. We were concerned that the DS was only 4:2:2, and we were curious how much resolution we’d lose. So we conformed in both color spaces, went out to film, and saw very little difference between the two. That meant we could use the DS Nitris without trying one of the new Bluefish cards, whose workflow isn’t seamless: you have to hand-conform a lot of things because it’s frame-based instead of timecode-based. So we were lucky we were happy with the look of the 4:2:2! There’s a little color fringing on intense red lights; otherwise, it looked great. Having set up that workflow and tested it, we felt confident that anything we did in the offline would show up in the end.
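
To see what that 4:4:4-versus-4:2:2 test was measuring, here is a small numpy illustration of 4:2:2 chroma subsampling: luma keeps full resolution while the two color channels are halved horizontally, which is also where the fringing on intense reds comes from. The subsample_422 helper is purely illustrative, not part of any of the tools named above.

```python
# Rough illustration of what 4:2:2 costs relative to 4:4:4: the Y plane is
# untouched, while Cb and Cr are averaged in horizontal pairs and repeated back.
import numpy as np

def subsample_422(ycbcr: np.ndarray) -> np.ndarray:
    """ycbcr: (H, W, 3) float array. Returns a 4:2:2 round-tripped copy."""
    out = ycbcr.astype(np.float32).copy()
    h, w, _ = out.shape
    w_even = w - w % 2
    for c in (1, 2):  # Cb and Cr planes only; Y (index 0) keeps full resolution
        chroma = out[:, :w_even, c]
        # Average each horizontal pair of chroma samples (the 4:2:2 encode)...
        pairs = chroma.reshape(h, -1, 2).mean(axis=2)
        # ...then repeat them back out (the decode); hard saturated edges fringe here.
        out[:, :w_even, c] = np.repeat(pairs, 2, axis=1)
    return out
```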

How did you watch dailies?

Unfortunately, dailies were just seen on DVD by most of the crew; there were no screenings. I worked with a projector in my edit room, off an eight-foot screen, with dailies coming out of the Avid. We worked at 14:1 compression, which is normal. It was kind of a tight budget, and the directors felt very confident about what they were shooting. They were the operators as well; each of them had a camera almost all the time, and they didn’t feel they had to screen dailies.
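
For a rough sense of why a 14:1 offline resolution is the normal choice on a tight budget, here is some back-of-the-envelope arithmetic on data rates; the numbers assume 8-bit 4:2:2 NTSC-rate SD offline media and are first-principles estimates, not Avid's published figures.

```python
# Back-of-the-envelope math on what a 14:1 offline ratio buys in storage.
BYTES_PER_PIXEL_422 = 2            # 8-bit luma plus shared chroma samples
WIDTH, HEIGHT, FPS = 720, 486, 29.97

uncompressed = WIDTH * HEIGHT * BYTES_PER_PIXEL_422 * FPS   # bytes per second
offline_14_1 = uncompressed / 14

print(f"uncompressed SD: {uncompressed / 1e6:5.1f} MB/s")
print(f"14:1 offline:    {offline_14_1 / 1e6:5.1f} MB/s "
      f"(~{offline_14_1 * 3600 / 1e9:.1f} GB per hour of dailies)")
```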

Did they look over your shoulder to see what you were cutting during the shoot?

Not very often. They were pretty busy, because it was a fast shoot – 33 or 34 days. On the rooftop scene [at the end of the film], they were running out of light and had to reshoot, and they didn’t feel they had gotten complete coverage. I cut it together and got it to them, and they were quite happy with it. After that, they didn’t come in to see stuff consistently. I was having a lot of fun putting it together, playing around. A lot of things were goofy, off-the-top-of-the-head ideas, like the backwards subtitle or the subtitle on Chev’s head. The attitude was, we’re having fun here, it’s a very in-your-face and over-the-top film, so let’s see how far we can push the editing on it. It was all done in a good spirit of experimentation.

Did you make a conscious decision to push the envelope on Crank?

I approach the film, and the film says how it should be cut. This film said, “Go fast, have fun, be tongue-in-cheek at times, even in the editorial sense.” But I don’t think of it as pushing any boundaries other than the fact that the workflow freed up some creativity and allowed us to experiment more, doing things you might want to do but couldn’t afford on a lower-budget film. It starts with the camerawork, which is very loose and handheld. [The directors] abandoned their one Steadicam shot after shooting it because it was too smooth. All of the handheld work might offend some people’s sensibilities, but it inspired a lot of the editorial style. They had an incredible ability to whip-pan to something and land at the right moment and be perfectly still for the things you needed to see. I don’t think the editing is really experimental or too far out there. It was just having fun with things that had all been done before. Split screens go way back. But it’s the ability to do whatever you want and refine it while you’re working. Instead of turning it over to VFX, you’re living with it week to week.

So technology has influenced your style.

On Natural Born Killers [Berdan co-edited NBK with Hank Corwin], we were working on the Lightworks, and I remember high-speeding through the cut as you would on a KEM. I was watching the film go backward and forward at high speed and I thought it would be neat to do that in the movie – zip forward, time compression. I didn’t try it because … there goes another optical. But it’s easy to do that now. And if it works, great.

I found an interview on the Web with Mark Pellington, for whom you edited The Mothman Prophecies, where he mentioned your influence on the sound of that film. Is sound design something you try to engage with as an editor?

I actually thought I would take the path of trying to be a sound mixer, a dubbing mixer. When I first started, I was working in the Bay Area, which has a pretty good toehold in the sound-design market, and that’s the route I thought I would go. But I got my first break as a picture apprentice, and sound is very important to me. Most editors try to pack their tracks as full as they can. This was my second show on the Adrenaline, and it’s great to have 16 channels; I thought eight was kind of liberating. I tried to design scenes [in Crank] with pretty good temp effects, and some of the effects I cut in were used in the final mix. Danetracks [in West Hollywood] was our sound-design house; they provided us with some of their own libraries, and I used some of that to shape scenes. We didn’t have a long final mix schedule, so it was good that we had worked out some of the sound design ahead of time. It was a pretty dense temp mix. I put a lot of whoosh-bangs into scenes. Most of the split screens had a sound attached to them to give them some weight, and that stuff was in pretty early on. The shootout on the roof – where we go out of the full gunfight and into the TV announcer, and everything gets squashed as if you’re hearing it through a TV – was done right from the beginning. Sound is half the show.
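
That “heard through a TV” squash is essentially a small-speaker futz; here is a minimal sketch of one way to fake it with a band-pass filter and some soft clipping, using scipy. The tv_futz function, its frequency range, and the clipping amount are assumptions for illustration, not the film's actual temp-mix chain.

```python
# Minimal "small TV speaker" futz: band-limit the mix, then squash it gently.
import numpy as np
from scipy.signal import butter, sosfilt

def tv_futz(audio: np.ndarray, sample_rate: int = 48000) -> np.ndarray:
    """audio: mono float array in [-1, 1]. Returns a band-limited, squashed copy."""
    # Keep roughly 300 Hz to 4 kHz, about what a cheap TV speaker reproduces.
    sos = butter(4, [300, 4000], btype="bandpass", fs=sample_rate, output="sos")
    futzed = sosfilt(sos, audio)
    # Soft clipping stands in for the "squashed" feel of the broadcast chain.
    return np.tanh(futzed * 2.0) * 0.5
```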

What are you working on next?

The film I’m doing right now is a small, independent film in Seattle. We’re doing a film finish. It’s a very traditional process. I feel like, “Wow, it’s kind of limiting.” I’m doing this on Xpress Pro – cutting the whole feature on my laptop.
