3D Workflow Featured Proprietary Software

The massive number of tapes resulting from multiple 18-camera shoots of U2’s 3-hour “Vertigo” concert tour was just the beginning. The production attempted to clone the original tapes as quickly as possible. Then the edit began. And 3ality Digital Systems got into gear to solve some of stereo filmmaking’s most persistent problems.

Related story: read about production on U23D.

“It was clear looking at all the footage that we had to do a serious pass at image correction, in terms of matching the left eye/right eye,” says 3D and digital image producer Steve Schklair. “We were very critical about zoom position, reflections, color and image size. We wanted pixel accuracy.”

Throw Out the Book
To fix the problems, the 3ality team was going to have to invent some new tools. “When we started U23D, there was an archetypal book of what you could and couldn’t do in 3D,” says 3ality Digital Systems CTO/COO Howard Postley. “Over the course of the movie, we pretty much disproved everything in that book.”

The verboten list that 3ality tackled included dissolves, fast cuts and visual effects like smoke and multi-layered composites. From the beginning, the goal was to build a toolset that would automatically correct as much as possible. “We used to do this frame-by-frame with a Quantel iQ or Autodesk Flame or Inferno,” says Schklair. “But we had to screen footage for the band, sometimes dailies, and we needed to correct as much as we could.” The result was software that allowed artists to do “3D leveling” on PC-based workstations running Windows XP. “By that, I mean it corrected aberrations,” says Schklair. “It didn’t correct everything, but a lot more than what ended up in the movie.”
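3ality hasn’t published how its leveling pass works, but one of the most common aberrations in a dual-camera rig is vertical misalignment between the eyes. A minimal sketch of how an automatic pass might find that misalignment, assuming a brute-force registration search (the function name and method are illustrative, not 3ality’s):

```python
import numpy as np

def estimate_vertical_offset(left, right, max_shift=8):
    """Brute-force search for the vertical shift that best registers the
    right eye against the left. An automatic "leveling" pass could roll
    the right eye by this amount before finer per-shot correction."""
    best_dy, best_err = 0, np.inf
    for dy in range(-max_shift, max_shift + 1):
        candidate = np.roll(right, dy, axis=0)
        err = np.mean((left.astype(float) - candidate.astype(float)) ** 2)
        if err < best_err:
            best_dy, best_err = dy, err
    return best_dy
```

For example, if the right-eye camera drifted three rows low, the search returns -3 and `np.roll(right, -3, axis=0)` re-registers the pair. Production software would use subpixel registration and also handle rotation and zoom mismatch, which this sketch ignores.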

Tweaking Convergence
Postley points out that the primary piece of software in the toolset, called 3action, focuses on 3D image alignment, including compositing and effects: anything that is “3D-critical.” “The tools allow you to change the convergence point in the shot,” he says. “Let’s say when you shot in 3D, Bono is at screen plane, Edge is behind and Adam Clayton is in front of screen plane. In post, the director may want to shift it so Edge and Bono are both behind screen plane and Adam is at screen plane. Our tools allow us to do that.”
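In the simplest case, moving the convergence point in post comes down to a horizontal translation of one eye: changing the relative horizontal offset between the eyes changes every object’s disparity, sliding the whole scene forward or back relative to the screen plane. A sketch of that basic operation (the function name and sign convention are illustrative):

```python
import numpy as np

def shift_eye(eye: np.ndarray, shift_px: int) -> np.ndarray:
    """Translate one eye's image horizontally by shift_px pixels, padding
    the exposed edge with zeros. Adjusting the horizontal offset between
    the two eyes moves the apparent screen plane of the shot."""
    out = np.zeros_like(eye)
    if shift_px > 0:
        out[:, shift_px:] = eye[:, :-shift_px]
    elif shift_px < 0:
        out[:, :shift_px] = eye[:, -shift_px:]
    else:
        out[:] = eye
    return out
```

A real tool would crop or scale to hide the padded edge; the point here is only that convergence is a relative horizontal offset, not a re-shoot.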

The toolset also allows for multiple convergence points. “This is something that doesn’t make sense at all in 2D,” says Postley. “You can have not only multiple 3D layers, but each one of the layers has a different focal plane or convergence point. If I take a shot of Bono, a shot of Edge and so on into editing, I can cut up the images and layer them to make them look like they’re standing at the same depth in the screen. It’s a 3D effect for which there is no 2D corollary.”
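The multi-convergence layering Postley describes can be pictured as an ordinary alpha composite in which each layer’s eye image carries its own horizontal offset. A sketch under that assumption (the helper below is illustrative, not 3ality’s tool):

```python
import numpy as np

def composite_at_depth(canvas, layer, alpha, shift_px):
    """Give one layer its own convergence point by shifting this eye's
    copy of it horizontally, then alpha-composite it over the canvas."""
    w = layer.shape[1]
    shifted = np.zeros_like(layer)
    a = np.zeros_like(alpha)
    if shift_px >= 0:
        shifted[:, shift_px:] = layer[:, : w - shift_px]
        a[:, shift_px:] = alpha[:, : w - shift_px]
    else:
        shifted[:, :shift_px] = layer[:, -shift_px:]
        a[:, :shift_px] = alpha[:, -shift_px:]
    return shifted * a + canvas * (1.0 - a)
```

Running this once per layer for the left eye, then again with different per-layer offsets for the right eye, is what lets separately shot subjects appear to share one depth, or sit at individually chosen depths.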

Massaging the Edits
Edits in a 3D movie can create real problems. As the cut moves in one frame, or 1/24th of a second, from an image on one plane to an image on another plane, the brain struggles to keep up. After dozens of these cuts, motion sickness or a headache results. 3ality Digital Systems had a plan to avoid this problem, and in the process also allow dissolves and fast cuts.

“We had a stereographer who worked much like a colorist does,” Schklair says. “He did 3D depth control, which allowed us to transition the depth across each edit. Within 12 to 14 frames, we move the depth to match a mid-point of the incoming shot. You don’t notice these depth changes because they’re so fast, but it allowed us to violate the rules of 3D editing and do fast cuts.” The “depth control” is a post-production process, part of the conform, and the stereographer controls it in real time to find the sweet spot for the dissolve or cut.
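One way to read Schklair’s description is a short per-frame interpolation of the convergence offset: over roughly 12 to 14 frames, ramp from the outgoing shot’s setting toward a value near the incoming shot’s depth. A minimal linear sketch (the target choice and the easing are assumptions; the real process is stereographer-driven):

```python
def convergence_ramp(start: float, target: float, frames: int = 13):
    """Per-frame convergence offsets easing depth across an edit.
    Linear for simplicity; a real tool might ease in and out, and the
    stereographer would pick the target and duration by eye."""
    return [start + (target - start) * (i + 1) / frames
            for i in range(frames)]
```

Applying each frame’s offset (for instance, as a horizontal shift of one eye) lets the depth settle faster than the audience can notice, which is what made fast cuts and dissolves workable.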

Color handling is another area 3ality treats as proprietary. “The way color is traditionally dealt with in 3D is that you take the dominant eye, color-correct that and then apply those changes to the other eye,” says Postley. “It turns out this is the wrong thing to do, because those two cameras aren’t in the same place, so the light doesn’t hit them the same way. You need to introduce the same color characteristics that the eye actually sees.” Postley won’t, however, reveal any details of how the 3ality system does that. “It’s too proprietary,” he says.

Other tools used included Assimilate Scratch, Iridas SpeedGrade, Shake, Nuke and After Effects. “One of the pieces we don’t make and don’t intend to make is a 3D editor,” says Postley. “We think it’s distracting. We believe that while it doesn’t make sense to edit in 3D, it does make sense to review what you’ve cut in 2D in 3D as often as possible. Our rapid conform tool will pull the 3D elements into place, so you can either work with them or just view them.”

The infrastructure to support post was as massive as the number of original tapes cloned. Storage was a one-petabyte Isilon cluster, with workstations connected via 10 Gb Ethernet or 8 Gb Fibre Channel. “We added more 10 Gb Ethernet towards the end,” says Postley.

Since there aren’t many 3D tools – or, to date, many 3D movies – there’s a dearth of skilled artists who can do in stereo what they’re used to doing in 2D. “It is an ongoing battle,” says Postley. “We have people who are very, very experienced in working in stereo with all the nuances. We have others who are very talented, who started off in 2D and have learned a lot about 3D from working on the movie. Getting them to the next, highest level is challenging. That’s why we’re building more automation into our tools, so the operator can focus more on what he wants to do rather than on how to do it.”
