CafeFX's 3D Gore Fest for The Final Destination
How The Foundry's Nuke and Ocula helped push the film's FX over the top
“Nuke has a really strong stereo pipeline, which made it right for this project,” says Williamson. “Nuke’s been in the building here since version 4.0. We’d experimented with it on various shows, especially for shots that seemed to lend themselves toward it. We still used a little bit of Fusion on Final Destination, but it was more on a support basis for our dailies system.” Williamson says he used both packages to handle some spatialization work but decided, “after looking at the dual-view workflow and stereo capacity built into Nuke, and knowing what we’d be dealing with on this show, that it was our ultimate package of choice. And I don’t regret that decision one bit.”
For Williamson, it all came down to the brilliant way Nuke uses node-based control. “Compared to other systems, which are not elegant at all in terms of the workflow, the Nuke workflow is well thought out,” he says. “Nuke’s View paradigm, and the whole setup for working on Left and Right eyes with a single set of nodes that can control each channel independently, yet with shared capabilities, was the killer.”
Williamson says it was when he saw a demo of Ocula at SIGGRAPH last year that he solved the next piece of his workflow puzzle. “Once we got going on this film and I made the final decision to go with Nuke, I contacted The Foundry about getting into the beta program and was quickly accepted,” he says. “It was critical on this particular project. When it came time to use it, there was just no substitute. If I’ve got a vertical alignment to do, it can take me an entire day to correct a shot with the old brute force method. Ocula can do it in seconds.”
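Ocula's internals aren't public, but the "brute force" alignment Williamson describes amounts to finding the vertical offset between the two eyes and shifting one to match. A minimal Python sketch of that idea, assuming a single constant row offset and exhaustive search (function names and the toy list-of-lists image format are ours, not The Foundry's):

```python
# Toy vertical-disparity correction between stereo eyes -- an
# illustration of the alignment problem Ocula automates, assuming a
# single constant offset; not The Foundry's actual algorithm.
# Images are lists of rows of grayscale floats.

def vertical_offset(left, right, max_shift=3):
    """Exhaustively search [-max_shift, max_shift] for the row shift
    of `right` that best matches `left` (lowest mean squared error)."""
    h = len(left)
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        err, n = 0.0, 0
        for y in range(h):
            ys = y + s
            if 0 <= ys < h:  # only compare rows both eyes have
                for a, b in zip(left[y], right[ys]):
                    err += (a - b) ** 2
                    n += 1
        err /= n
        if err < best_err:
            best_err, best_shift = err, s
    return best_shift

def shift_rows(img, s, fill=0.0):
    """Shift image content by s rows (positive moves content up);
    rows pulled in from outside the frame are filled with `fill`."""
    h, w = len(img), len(img[0])
    out = [[fill] * w for _ in range(h)]
    for y in range(h):
        ys = y + s
        if 0 <= ys < h:
            out[y] = list(img[ys])
    return out
```

Applying `shift_rows(right, vertical_offset(left, right))` brings the misaligned eye back into register. In practice disparity varies across the frame, which is what makes hand correction a day-long job and an automated solver so valuable.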
The Ocula workflow inside Nuke 5
CafeFX used the beta version of Ocula during the initial part of post on the film, when effects artists at the facility had wider access to multiple seats. “But there was a limited capacity once we were on the official release,” says Williamson. “We only had two licenses for that second push of production, after they finished reshoots, so we had to share and be picky about what we used it with.” The reason, he says, is the plug-in’s cost: $10,000. “When it came time to buy it, we were a little shell-shocked by the price tag. I understand it’s a niche product, but if you have all your artists fighting over it or scrambling for the one site license you can afford, it ends up not getting used and becomes a liability.” The Foundry, he says, took their concerns seriously and has since set them up with a more practical site license.
Amid other 2D projects, CafeFX has yet another stereoscopic film in the pipeline right now. It wasn’t shot in stereo, however. “It was a single camera shoot, so we’re adding the depth to it,” says Williamson. Although Ocula won’t be needed for this particular film, Williamson says he’s sent some test comps to The Foundry with the hope that they can modify some existing tools “to make our work building out the 3D depth a little easier. I’m trying to come up with a way of adding depth to the shot using Nuke’s iDistort node, which is in essence a virtual Z channel, something created by us. We’re also experimenting with a re-projection technique. I don’t know yet if it’s something that’s going to be rendered in 3D and fixed in comp, or if we do the entire thing in comp. There’s also the possibility of bringing in the geometry and projecting it and doing the entire depth process in Nuke. We haven’t yet settled on which one is going to be our best workflow.” The facility is also bidding on another “major 3D project,” he adds, and “Ocula will play a huge role in that project if we get it.”
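The iDistort idea Williamson describes, driving a distortion with a hand-built “virtual Z channel,” boils down to shifting each pixel horizontally in proportion to its authored depth to synthesize the second eye. A simplified Python sketch of that parallax warp (our illustration, with nearest-neighbor gather sampling and edge clamping; not CafeFX’s actual node setup):

```python
# Rough sketch of depth-driven parallax: synthesize one stereo eye by
# displacing each pixel horizontally in proportion to a hand-authored
# depth map (the "virtual Z channel" idea). A simplification, not
# CafeFX's iDistort-based pipeline. Images are lists of rows.

def parallax_eye(image, depth, max_disparity=4):
    """Build a new view: each output pixel is gathered from the source,
    offset horizontally by depth * max_disparity pixels.
    depth values lie in [0, 1]: 0 = far (no shift), 1 = near (max)."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            shift = round(depth[y][x] * max_disparity)
            xs = min(max(x + shift, 0), w - 1)  # clamp at frame edges
            row.append(image[y][xs])
        out.append(row)
    return out
```

A flat depth map of zeros reproduces the original frame; near objects get pushed sideways, creating disparity between the eyes. The hard problems Williamson alludes to, such as filling the disocclusions the shift exposes and choosing between comp-only and projected-geometry approaches, are exactly what this toy version leaves out.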
Working in 3D brings with it a unique set of obstacles, often jarring to those trained in 2D workflows. “3D can add a lot more time to normal workflows,” Williamson admits. “Some of the things that are normally very, very simple, that you don’t have to worry about, like a simple paint out, suddenly become more complex in 3D. For Final, we had to paint out a wire that was in front of a sofa in the background, with a pattern all over it. The human eye is very good at picking up discrepancies like that. And the minute you have any kind of shift and are only one pixel off, it lifts it or pushes it back in stereo space. If you patched it on one eye, you’d never know, and someone can make that fix in an afternoon. But in stereo, it takes three days.” It works the other way, too, he says. “In 2D, you struggle for days to get some depth into the frame, by diminishing some parts of the comp and bringing others up. In stereo, you get it for free. You know what’s in front of what.”
As for the enduring success of The Final Destination in front of audiences worldwide, Williamson says that’s easy to figure out. “This film wasn’t about the immersive 3D experience, which we’re expecting with upcoming films like Avatar. But it’s definitely got a ‘wow’ factor,” he says. “If you’re selling tickets, is that such a bad thing?”
Read more about how The Final Destination was shot in 3D here. For more about CafeFX, visit their site here.