Shooting and Posting Scourge With the GY-HD250, Final Cut, and Color

For Scourge, a low-budget horror film with the tag line “you are what it eats,” cinematographer Corey Robson employed a pair of JVC ProHD camcorders to capture high-quality, high-definition images – his producer's GY-HD250 was the main camera, and his own GY-HD100 served as B camera – and then broke out a copy of Apple’s Final Cut Studio, including its Color grading application, to edit and finish the movie. We talked to him about the good, the bad, and the buggy.

The trick to keeping image quality as high as possible was using the HD250’s HD SDI output to bypass the heavy HDV compression applied when recording to MiniDV tape. Whenever possible, Robson used footage recorded over HD SDI to a Wafian HR-1 direct-to-disk recorder and treated the camera’s HDV tape recordings only as backups. This wasn’t always possible – sometimes the crew wouldn’t be carrying the Wafian unit but would still want to steal a quick shot, so the on-board HDV recording capability was welcome. And the HD100 camera always recorded to tape only, which means a lot of HDV footage ended up in the movie.
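To put that compression gap in perspective, here's some back-of-the-envelope arithmetic. The figures are ours, not Robson's: we assume a 720p24 signal at 4:2:2 10-bit over HD SDI, and the roughly 19 Mbps video portion of the HDV stream.

```python
# Rough data-rate comparison: uncompressed 720p24 over HD SDI vs. the HDV tape stream.
# All numbers are illustrative assumptions, not measurements from the production.
WIDTH, HEIGHT, FPS = 1280, 720, 24
BITS_PER_SAMPLE = 10
SAMPLES_PER_PIXEL = 2  # 4:2:2: one luma sample plus one (alternating Cb/Cr) chroma sample per pixel

uncompressed_mbps = WIDTH * HEIGHT * SAMPLES_PER_PIXEL * BITS_PER_SAMPLE * FPS / 1e6
hdv_mbps = 19  # approximate HDV 720p video bit rate
ratio = uncompressed_mbps / hdv_mbps

print(f"uncompressed: {uncompressed_mbps:.0f} Mbps, HDV: {hdv_mbps} Mbps, "
      f"~{ratio:.0f}:1 compression")  # roughly 442 Mbps vs. 19 Mbps, about 23:1
```

Even allowing for the efficiency of MPEG-2, squeezing the signal by a factor of twenty-plus is exactly the kind of loss the HD SDI-to-Wafian path avoids.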

Before the shoot, Robson ran tests comparing the image quality of footage recorded to the HR-1 in the CineForm Intermediate format against footage recorded to tape and then captured into the CineForm codec. “CineForm smooths out some of the 4:2:0 [chroma subsampling] you get with ProHD,” Robson told F&V. “If you don’t use it, the chroma sampling is not great. It does a nice job, and it was close enough that we could have gone either way – we could have shot to tape and skipped the Wafian. But it did help in some lower-light scenes. It was an asset, but it’s a big, 90-pound unit. And it’s an extra piece of gear that you always have to have plugged in.”
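For readers unfamiliar with the term in Robson's quote: 4:2:0 chroma subsampling keeps full-resolution brightness but stores only one color sample per 2x2 block of pixels. The toy sketch below is our own illustration, using a simple box average, which is just one of several ways an encoder can derive the shared sample.

```python
# Toy illustration of 4:2:0 chroma subsampling: each 2x2 block of a chroma
# plane collapses to a single averaged sample, quartering the color detail.
def subsample_420(chroma):
    """Average each 2x2 block of a chroma plane into one sample."""
    h, w = len(chroma), len(chroma[0])
    return [
        [
            (chroma[y][x] + chroma[y][x + 1] +
             chroma[y + 1][x] + chroma[y + 1][x + 1]) // 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A 4x2 patch of Cb values: two flat color regions side by side.
cb = [
    [100, 104, 200, 204],
    [100, 104, 200, 204],
]
print(subsample_420(cb))  # [[102, 202]] - a quarter of the chroma samples survive
```

That lost color detail is what makes hard edges between saturated colors look blocky, and it is the roughness a gentler intermediate codec like CineForm can help smooth over.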

Those gains in low-light shooting were important because Robson used a Redrock Microsystems M2 cinema lens adapter that helped preserve film-like depth of field but reduced light hitting the sensor by about a stop. (Robson rated his camera at ASA 200 for the shoot.) “It was a very tight schedule – 24 days of principal photography and a very ambitious shooting style – so we just couldn’t use the adapter some of the time,” Robson explained. “We would go with a stock lens and bite the bullet and match the look in post. I’d add a little bit of film grain and use a plug-in look to match the ground-glass look the M2 gave us. We used three different film-grain settings, depending on how much light was in the scene, for any shot on the B camera or on the A camera without the M2.”
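The adapter's one-stop light loss maps directly onto the exposure rating Robson mentions. As a hedged sketch (the ASA 400 native rating here is our assumption for illustration, not a figure from the article):

```python
# Exposure-index arithmetic: each lost stop halves the light reaching the
# sensor, which halves the exposure index you can honestly rate the camera at.
def effective_asa(native_asa, stops_lost):
    """Return the usable exposure index after losing the given number of stops."""
    return native_asa / (2 ** stops_lost)

# Hypothetical native rating of ASA 400 with the M2's ~1-stop loss:
print(effective_asa(400, 1))  # 200.0 - consistent with rating the camera at ASA 200
```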

With the added film grain smoothing the image, the only telltale variance is in depth of field from shot to shot – much less noticeable in wide-angle shots than in close-ups. “You do what you can,” Robson says. “But it’s a one-third-inch chip, and there’s no getting around that.”

There were some gotchas, too. Because virtually the entire shoot was handheld, the camera took some abuse, which caused the lens adapter to vignette in a few shots. (Robson cropped them in post.) Also, the M2 was very sensitive in the backfocus department. “If you’re just a millimeter out, your image is going to be soft, because the adapter is already softening up the image,” he explains. “A few shots didn’t get used because they were soft, and it wasn’t the focus-puller’s problem. I probably slammed the adapter and knocked it out of whack – and you can’t see fine detail in the viewfinder.”

Recording Sync Audio in Two Places
The production workflow for audio was fairly bulletproof. “We fed the audio into the camera from our mixer, who was also recording a back-up to a hard drive,” he explains. “We had the SDI cable and two audio cables coming into the camera. We decided to accept that tether – I was operating, too, so it was a pain in the butt, but it gave us sync audio on our HDV backups, and the audio gets embedded in the SDI stream so that it ends up on the Wafian, synced with camera-generated timecode.”

Creature effects were executed by Artifex Studios in Vancouver without the benefit of green-screen shots. After consultation, the studio told Robson its artists could rotoscope the little buggers into frame effectively enough. “It was their choice,” he says.

With all of the footage captured to – or transcoded from HDV into – the CineForm codec, Scourge was ready to go into post at Artifex. Using Apple’s Color application was a classic good-news/bad-news situation. On the plus side, Robson says, are the application’s powerful power-window and vignetting functions, along with robust motion tracking. The downside was that Color still feels very much like first-generation software. “It was really buggy,” Robson says. “It’s counter-intuitive. We couldn’t have had a more high-end system, and we still couldn’t work around some of the problems Color presented. I probably ended up re-timing parts of the movie three or four times because stuff got lost and I couldn’t reconnect. It would just crash.”

Asked for specifics, Robson cited a particularly maddening bug that appeared when he tried to import multiple video tracks from Final Cut. During the export back into Final Cut, Color was apparently adding a “horizontal stripe” of leftover material from the previous shot across the bottom of the first frame of random clips. To fix those problems, Robson and his editor had to go back to the CineForm source files and bring the problematic clips into Color directly.

“We did the entire movie in Color,” he recalls. “In hindsight, I would have done basic color-correction and color-balancing in Final Cut, and then gone into Color for just a few scenes where we radically changed the look from what we shot. We couldn’t have done that in any other program.”

Lessons Learned
On his next project, what else will Robson do differently? For one thing, he hopes to use a new 16mm lens mount from JVC that could obviate the need for the M2 adapter. He’d also like to leave the bulky Wafian recorder behind. He’s working on a compact, custom-built hard-disk recorder to capture CineForm Intermediate video directly. Finally, he’s looking at workflow possibilities that might use BitJazz SheerVideo. “The best-case scenario right now would be to capture to CineForm and then convert all that into SheerVideo for smooth editing on the Mac,” he says. “I’m also going to do some tests with ProRes HQ.”

But, given a restrictive budget, he appreciates the JVC ProHD cameras on a practical, bang-for-the-buck level. “My day job is DIT [digital imaging technician], and all the DPs seem to be buying these [HDV] cameras as little crash-cams and stuff,” Robson says. “When Scourge came up, our camera budget was really tight, so we didn’t have enough money to rent an F900 or anything like that. This was the next best thing.”