What We Did: Blew up a starship and its neighbors with a rogue nuclear warhead on the SciFi Channel’s Battlestar Galactica
How We Did It
Working from the script line, "The Ship (Cloud Nine) explodes in a nuclear fireball, destroying several other ships along with it," we blocked out a sequence of shots that we felt would tell the story within a reasonable time and budget. In this case, it was two shots. Producer Steve Kullback and I presented a bid to the production company based on our estimates and conversations with Gary Hutzel, the visual effects supervisor for the show. We made sure the proposed sequence hit all the story points, and Gary gave us preliminary notes from the show's producers. In this case, we learned the producers wanted a piece of debris to strike the camera, leading into a crucial transition point in the episode.
Once the sequence was awarded, we moved on to animatics. Using our notes from the conversations with the clients, I created two rudimentary shots in NewTek's LightWave. The ships, environments and effects are all to scale, and although the general lighting direction is correct, the animatics contain no final textures or elements, save the camera movement. The animatic is the first visual representation of the shot, and it includes general timing and frame counts for use in the rough cut of the show. While I hammered out the timing and camera in the animatics, modeler Steve Graves got to work ripping up the high-resolution Cloud Nine model. Steve broke the ship into jagged layers (again in LightWave), and created morph targets for bent ribbons of hull plating and a crumpled superstructure. Artists Sean Jackson, Gabe Koerner and Geoffery Mark each did the same to three surrounding ships, cueing each event to the moment my animatic shockwave hit their vessel.
When the production company signed off on the animatics, we were free to finish out the shots. Lead Artist Mark Shimer lit and rendered the Cloud Nine model, and added large-scale debris in LightWave as the ship broke apart. When they were done choreographing their destruction, Geoffery, Gabe and Sean lit and rendered their ancillary ships. All passes are split out into Raw RGB, Diffuse Shading, Specular Shading, Fill Shading, Self Lighting and Interactive Lighting, which gives latitude in the final image without a costly rerender. Artists also supply depth mattes for layering 2D elements into the 3D space of the shot.
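The latitude those split passes provide comes from recombining them additively at composite time, with a per-pass gain in place of a rerender. A minimal sketch of the idea, assuming linear-light float images (the pass names and `recombine` helper are illustrative, not the studio's actual Combustion setup):

```python
import numpy as np

def recombine(passes: dict, gains: dict) -> np.ndarray:
    """Additively merge lighting passes with a per-pass gain.

    `passes` maps pass names (e.g. diffuse, specular, fill) to
    linear-light float RGB arrays of identical shape. Changing a
    gain re-balances the final image without touching the 3D scene.
    """
    out = np.zeros_like(next(iter(passes.values())))
    for name, img in passes.items():
        out += gains.get(name, 1.0) * img
    return out

h, w = 4, 4
passes = {
    "diffuse":  np.full((h, w, 3), 0.30),
    "specular": np.full((h, w, 3), 0.10),
    "fill":     np.full((h, w, 3), 0.05),
}
# Double the specular contribution at composite time -- no rerender.
img = recombine(passes, {"specular": 2.0})
```

The same mechanism lets a compositor pull a single pass up or down to chase a note ("hotter highlights on the hull") in minutes rather than hours.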
Lead compositor Lane Jolly put the shots together using Autodesk Combustion. Because he rebuilds each frame from its component channels, Lane is able to make broad changes to the look directly in Combustion, and can bring out fine details in the renders that show up when viewed in high definition. While Lane was putting together the CG layers, I created 2D elements to augment the 3D effects passes. In Adobe After Effects, I layered several custom fire, pyrotechnic and CO2 filmed elements over the original animatic, scaling and motion-tracking each to match the element it would augment. For additional debris, smoke and fire, I created tracked elements with Wondertouch's particleIllusion. Because particleIllusion can generate large numbers of sprites very quickly, I could create details like the giant dome shattering into millions of glass shards without resorting to another costly 3D render. We render out the many 2D effects elements individually with animation and blur, and Lane imports them into the final composite as raw footage. This ‘pre-composite’ method serves two purposes: it removes some of the pressure on the 3D artist to generate realistic effects in 3D, as the elements get mixed in with real footage, and it lets the compositor focus on the final image, not on selecting and placing dozens of effects elements.
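Slotting those pre-rendered 2D elements into the 3D space of the shot relies on the depth mattes the artists supply: the element is held out wherever the CG render is nearer to camera than the depth the element is placed at. A hypothetical sketch of that hold-out logic, assuming a normalized depth matte where larger values mean farther from camera (the `insert_at_depth` helper and its conventions are illustrative only):

```python
import numpy as np

def insert_at_depth(bg, bg_depth, element, alpha, element_depth):
    """Composite a 2D element over a CG background, but only where
    the background's depth matte says the element is nearer to camera.

    bg            -- CG background RGB, shape (H, W, 3)
    bg_depth      -- per-pixel depth matte, shape (H, W), larger = farther
    element       -- 2D effects element RGB, shape (H, W, 3)
    alpha         -- element coverage, shape (H, W), values 0..1
    element_depth -- scalar depth at which to place the element
    """
    visible = (element_depth < bg_depth).astype(float) * alpha
    visible = visible[..., None]          # broadcast mask across RGB
    return element * visible + bg * (1.0 - visible)

# Toy example: CG geometry at depth 0.5 on the left, 0.9 on the right;
# a white sprite placed at depth 0.7 should only appear on the right.
bg = np.zeros((2, 2, 3))
bg_depth = np.array([[0.5, 0.9], [0.5, 0.9]])
element = np.ones((2, 2, 3))
alpha = np.ones((2, 2))
out = insert_at_depth(bg, bg_depth, element, alpha, 0.7)
```

The per-pixel test is what lets a flat sprite (smoke, glass shards) correctly disappear behind hull geometry without any extra 3D rendering.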
The final composite adds additional camera shake, focus pulls and film grain to match the live action surrounding the effects. The final frames get sent to our Frame Thrower real-time 2K playback system, where we view the shot on high-definition picture tube and LCD monitors to ensure proper gamma adjustment and note any additional changes. VFX Coordinator Liz Alvarez creates QuickTime versions of the shots, and sends them via FTP to the production company for approval. Once approved, Liz, Editor Dmitri Gueer and I/O Manager An Dang lay off the HD frames onto D5 tape for final delivery, and again for in-house archiving.
Culver City, CA