
Heavy-Duty VFX Management for Battlestar Galactica

Tracking 100s of Layers to Render Huge Space Scenes on Deadline

Sci-Fi Channel’s hit space opera Battlestar Galactica isn’t your ordinary VFX show. The big effects sequences are carefully integrated with the story, and the stylistic mandate is jumpy, handheld camera that imparts a sense of immediacy.
Watch the footage below to see Atmosphere’s work (with and without commentary by CG Supervisor Andrew Karr), and then read the article to learn more about what went into managing workflow for Battlestar’s biggest spectacle to date.


"That's been Battlestar's motif from the very beginning – that very kinetic, almost battlefield camerawork, with the whip pans and the snap zooms," says Jeremy Hoey, co-founder of Vancouver's Atmosphere Visual Effects, which handles duties on the show along with the production's in-house FX department. "It really brought a new energy to TV visual effects. Sci-fi visual effects had gotten stuck in a sort of rut of long, sweeping camera shots of spaceships flying just overhead, that kind of stuff. Battlestar, from the miniseries, set a new tone. The camerawork is largely responsible for that."

Along with matte painter Hoey, Atmosphere's other principals and co-founders are Andrew Karr, who supervises 3D, the company's largest department, and Tom Archer, who supervises compositing. Using Lightwave for 3D animation and Digital Fusion for compositing, the crew takes its cues from the show's VFX supervisor, Gary Hutzel, who specifies the look for each shot and provides detailed animatics illustrating the gist of every scene. For a Season 3 two-parter, "Exodus," which aired October 13 and 20, Atmosphere handled scenes that included a space battle on an unprecedented scale, executed entirely in Lightwave.

The animatics given to Atmosphere specified the basic camerawork and layout of each shot, even including some particle work. "But, obviously, their animatics didn't have all of the missiles and all of the explosions," Karr says. "Our shots have hundreds of raiders flying around, attacking the Pegasus, that were not in the animatics at all, and each individual raider was keyframe-animated. Some of the backgrounds could have been done using particles, but the foreground stuff was all hand-animated. So a lot of work had to be done, and those all had to get approval from the production."

The key to getting the work done on a TV schedule was keeping the compositing and CG teams working in tandem, Karr explains. "Compositing would jump ahead and work in some of their practical explosions," he says. "Then the shots would go back to CG, where we'd work on the lighting and try to enhance some of the composites. The CG department would put in some missiles and CG explosions, and comp would get those and enhance them with practical elements. It's about going back and forth, using multiple layers. We'd have as many as seven or eight CG artists and several compositing artists tackling different elements of a single shot."

Organizing the work requires foresight: the CG department decides which layers to render first based on which ones compositing needs in order to start on its own elements. "It's not just the time to composite and actually set up the renders," notes Hoey. "The rendering time itself is extraordinary. We have almost 200 CPUs of render nodes, and they were going 24/7 for weeks just rendering layers for this one episode. It's hundreds of gigabytes of data. And that adds another factor of uncertainty, because if there are any problems with a render, that brings the entire process to a grinding halt. Suddenly everyone has to stop and wait for the render farm to chew through re-rendering those frames. Careful management of all the layers becomes really critical, especially as we get toward the end of our deadlines."

Atmosphere uses Muster, a render manager from Virtual Vertex, to take frame-specific control of its renders. The system allows precise adjustments to be made, depending on how busy the overall render farm is. "We have to be very creative, sometimes, in what we render and in what order," Karr says. "Muster is actually new software for us. We were using a different render manager last season, but Muster works quite well. It has its issues because of the size and the amount of material going through, but we wouldn't have been able to get through the show if it was flaky."
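The scheduling problem Karr describes boils down to a priority queue: layers that compositing is waiting on must come off the farm before background passes, no matter when they were submitted. Muster's actual configuration isn't documented in the article, but the idea can be sketched in a few lines of Python (the layer names and priority values here are hypothetical, purely for illustration):

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class RenderLayer:
    # Lower number = rendered sooner. Layers that compositing is
    # blocked on jump the queue ahead of background passes.
    priority: int
    name: str = field(compare=False)

def schedule(layers):
    """Return layer names in the order the farm should render them."""
    heap = list(layers)
    heapq.heapify(heap)
    order = []
    while heap:
        order.append(heapq.heappop(heap).name)
    return order

# Hypothetical layers: comp needs the hero elements first; the
# background swarm can render whenever nodes free up.
queue = [
    RenderLayer(2, "bg_raider_swarm"),
    RenderLayer(0, "hero_raiders"),
    RenderLayer(1, "cg_missiles"),
]
print(schedule(queue))  # ['hero_raiders', 'cg_missiles', 'bg_raider_swarm']
```

A real render manager layers per-frame control, dependencies, and failure retries on top of this, but the core trade-off is the same: reorder work so downstream departments are never idle.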

All of the renders are at HD resolutions – which means, Hoey notes with a little laugh, some of the old shortcuts no longer apply. "Blur, add lens flare – you could really abuse the shot in NTSC," he recalls. "You can't get away with that kind of fudging in HD."

"You'll have some fella at home watching this in HD on a 60-inch television, and it has to hold up," agrees Karr. "HD is very clear, and every little detail shows up."

And HD isn't even the upper limit for an HD show. "It's not uncommon that sometimes if we're working in HD we'll render in double-HD resolution," notes Tom Archer. "If we have very fine detail of antennas or other very small things, and we find that we're getting buzzing in those details, we'll just double the HD resolution to solve that problem. The sky's the limit, as long as you've got the rendering power."
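Archer's trick is classic supersampling: render at double resolution, then average the frame back down so sub-pixel detail contributes fractionally to each final pixel instead of flickering on and off between frames. A minimal sketch of that downsampling step, on a toy grayscale frame (the function and data are illustrative, not Atmosphere's pipeline):

```python
def downsample_2x(pixels):
    """Average each 2x2 block of a double-res frame into one output
    pixel, softening the 'buzzing' in fine detail like antennas."""
    out = []
    for y in range(0, len(pixels), 2):
        row = []
        for x in range(0, len(pixels[0]), 2):
            block = (pixels[y][x] + pixels[y][x + 1] +
                     pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# A 4x4 "double-HD" patch of alternating hot pixels collapses to a
# smooth 2x2 "HD" patch instead of shimmering detail.
frame = [
    [0, 255, 0, 255],
    [255, 0, 255, 0],
    [0, 255, 0, 255],
    [255, 0, 255, 0],
]
print(downsample_2x(frame))  # [[127.5, 127.5], [127.5, 127.5]]
```

The cost, as Archer notes, is raw rendering power: doubling resolution in both dimensions quadruples the pixel count before the downsample throws three quarters of it away.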
