Journey to the Center of the Earth Pioneers a Live-Action Stereo Workflow

Journey to the Center of the Earth, directed by Eric Brevig – a filmmaker with a VFX resume that includes Total Recall, Men in Black, and Pearl Harbor – might not have the same impact as The Jazz Singer (the first feature-length “talkie,” in 1927) or Becky Sharp (the first three-strip Technicolor feature, 1935), but as the first live-action VFX-driven feature film to be released in the 3D format, it can’t help but break new ground. Because the production shot with the Cameron/Pace Fusion 3-D camera system – essentially two customized Sony F950 HDCAMs outfitted to work in tandem for stereo photography – the editorial team had to figure out techniques for stereo playback of dailies, getting 3D footage out of a very 2D Avid system, and assembling and exhibiting a 3D director’s cut for the studio executives. F&V talked to editors Paul Martin Smith and Dirk Westervelt, 3D consultant and VFX editor Ed Marsh, and HD engineer Fred Meyers about pulling everything together in production and post. Read the story below, and check out our Q&A with Meyers about what it takes to make a 3D production work.

For more on Journey to the Center of the Earth, see Studio Monthly‘s story on the VFX work.

[Image: 3D workflow chart]

Breaking it Down: VFX Editor Ed Marsh provided this early workflow diagram to F&V. While the Journey to the Center of the Earth team was able to streamline and improve on this basic workflow, it shows the kind of extra thought that has to go into posting any 3D production.

Recording in Stereo
On the set, the production was recording left-eye and right-eye streams to separate HDCAM SR tapes. Those 4:4:4 masters were both fed into a Sony SRW-1 portable HDCAM SR deck that recorded and played back a 4:2:2 stereo image. “That’s perfectly fine for review purposes,” says Marsh. But there was one wrinkle in the process. The production used beamsplitter rigs – mirrored devices that let one camera point directly at the action while a second camera, mounted at an angle, captures the same action in a mirror reflection, as if it were positioned right alongside the first. It’s a way of getting the interocular – the distance between the right-eye and the left-eye lenses – below what’s physically possible with two cameras mounted side by side, given the size and bulk of the camera bodies and lenses. (Parts of two camera lenses can’t occupy the same space at the same time.)

For recording a 3D image to tape with the aim of on-location playback, that posed a problem, because the camera in the beamsplitter that sees into the mirror captures a reversed, or “flopped,” left-eye image. It’s not difficult to flop the image back in post, but it’s an extra step that adds time to the process. Instead, the production used what Marsh calls a “flop mix box” from Evertz Microsystems to flop the left-eye picture and record it, in sync, to the SRW-1. “I can’t take credit for its creation, but I was one voice asking for it,” Marsh says. “I worked on two [James] Cameron documentaries where we were field-shooting on a Russian research vessel, and engineering was five decks down from the screening room, so it simply wasn’t practical to see the live recording in 3D while also monitoring everything else. I got very good at judging 3D by fast-toggling the [left-eye and right-eye] images and seeing what the discrepancies were. You can’t do that when one eye is flopped. I sent an incredibly simple diagram to Evertz, and they were able to modify some existing gear to give us what we needed – a box that ‘normalized’ the eyes and kept the imagery in perfect sync, and that also provided a fast toggle or 50/50 DX of the images for 2D evaluation of the 3D images.”
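
Conceptually, the box’s job is straightforward: un-reverse the mirrored eye, then either mix the two eyes 50/50 or toggle between them so stereo discrepancies show up on an ordinary 2D monitor. The following is a minimal sketch of that logic using OpenCV on hypothetical still frames; it illustrates the idea, not the real-time HD hardware path the Evertz box provided.

import cv2  # illustrative stand-in for the real-time hardware

# Hypothetical file names; on set this happened live in the Evertz box.
left = cv2.imread("left_eye_flopped.png")   # left eye, reversed by the mirror
right = cv2.imread("right_eye.png")         # right eye, shot directly

# "Normalize" the eyes: un-flop the mirrored left eye with a horizontal flip.
left_fixed = cv2.flip(left, 1)

# 50/50 mix for 2D evaluation: stereo discrepancies show up as ghosting.
mix = cv2.addWeighted(left_fixed, 0.5, right, 0.5, 0)
cv2.imwrite("mix_5050.png", mix)

A fast toggle is the same check in time rather than in space: flip quickly between the corrected left eye and the right eye and watch what jumps.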

Dailies were played out from the SRW-1 and projected in HD. “The one frustration is that the SRW-1 was the only deck that could do it, so you had to play back your dailies from a portable deck that was just barely RS-422 controllable,” says Marsh. “To use it in post was not possible.”

Editorial worked in a 2D environment using right-eye footage only. (The right eye was chosen because it was never shot through the beamsplitter, which not only flops the image but can also introduce softening or warpage that would need to be corrected in post.) “I spoke early on to Charlotte [Huggins, producer] and to Eric, saying, ‘OK, I’ve heard all these rules for editing 3D,’” Smith recalls. “‘What do I need to worry about?’ And both of them said, ‘Nothing. Don’t worry about any of it. Cut for story.’”

It might sound a little scary to edit a 3D film in 2D, but Smith says one of the biggest confidence-boosters was the show’s ability to actually screen dailies on a daily basis. “We went back to watching rushes at lunchtime,” Smith recalls. “When was the last time you heard of a picture doing this? Eric said, ‘We have to watch rushes on a very big screen in 3D.’ That was essential. It dictated the use of certain shots that I might not have used if I had only seen them in 2D. It was great to be sitting next to my director talking about certain sequences, whether we needed the second unit to pick up shots or whatever. It’s been a while since I’ve been able to do that. On films these days, everybody gets their rushes on DVD and people don’t communicate any more. It’s a very sad state of affairs.”

[Image: under water]

A Director’s Cut in 3D
The film stayed in 2D until it came time to assemble a director’s cut to show the studio. For obvious reasons, Brevig was keen on that version of the film being screened in 3D, with some kind of VFX already in place. That’s a tall order on a film that’s full of blank blue-screen footage. “The budget on the movie was not huge, and it’s not like we had VFX people doing temps all the way along,” recalls Westervelt. “The director wanted to fill every background and have some temps comped in to a level I’ve never seen – even on a 2D VFX movie. When Narnia goes to preview, the creatures have green leggings on and lots of blue-screen backgrounds are still in there. For a director’s cut, you usually don’t fill in every single thing – but that’s what he wanted.”

To make it happen, Smith and Westervelt looked to the previs animatics created by David Dozoretz, the second-unit director and senior pre-visualization supervisor. Dozoretz re-rendered the backgrounds without characters in the foreground so they could be repurposed as background plates. “The first thing we tried to do was just split the backgrounds out a bit, putting them behind our blue-screen plates in 3D space,” Westervelt explains. “In most cases it worked. If there was something like the ocean with piranha fish in it, moving from the background to the foreground, it might be impossible to get it to work. But our foreground plates would have real 3D, and the background plates would look like a flat background, just far enough behind that they didn’t look like they were embedding into our foreground plates.”
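
The geometry behind that placement is simple: composite the same flat plate into both eyes, offset horizontally in opposite directions, and it reads as a card sitting some distance behind the screen plane. A rough sketch of the idea in Python follows; the function and parameter names are invented for illustration, and this is not how the temps were actually built in the Avid.

import numpy as np

def offset_flat_plate(plate, parallax_px):
    """Return (left_bg, right_bg): the same flat plate shifted in opposite
    directions so it appears behind the screen plane (positive parallax)."""
    # np.roll is a crude stand-in for a real composite with proper edge handling.
    left_bg = np.roll(plate, -parallax_px, axis=1)   # content shifts left for the left eye
    right_bg = np.roll(plate, parallax_px, axis=1)   # content shifts right for the right eye
    return left_bg, right_bg

Because the plate carries no internal disparity it stays flat, but the offset keeps it clear of the truly stereoscopic foreground, which is the “just far enough behind” placement Westervelt describes.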

[Image: Underground]

Faking Out the Avid
And then there was the question of how to make the existing 2D gear function as part of a 3D pipeline – this was all happening in the days before companies like Assimilate and Quantel started announcing support for stereo workflows. “We had no critical 3D screening in the cutting room, so we were ballparking a lot of it – just kind of guessing,” Westervelt says. “You could put the pictures side by side and look at your monitor through a Wheatstone viewer [a handheld mirror stereoscope that routes the side-by-side left-eye and right-eye images to the corresponding eyes so they fuse into a single 3D view]. Once we thought we were in the ballpark, we would output the right eye to a [Sony] SRW-5500 deck. Then we would load the left-eye timeline into the record side, and then do another digital cut from the Avid with the left eye – only this time you do a preview edit instead of a destructive edit.” The idea is that the Avid will roll the tape as if it were recording, but won’t actually write to it. This editorial fake-out allows you to simultaneously play the right-eye stream out of the edit-controlled SRW-5500 and the timecode-synchronized left-eye stream directly out of the Avid.

Then you need something to capture both streams. The SRW-1 can do the job, but Journey opted instead to use a digital-cinema server, the QuVIS Acuity. “That’s the best solution, because you can get it to trigger and capture the right timecode,” Westervelt says. “We would capture the two HD streams and remarry them into one 3D stereo video file that was available for playback. That’s how we built that first 3D assembly of the director’s cut.

“The only other way to do it would have been to go to an online facility, give them an EDL, and have them bring everything in and do the comps with the backgrounds. It would have been prohibitively expensive and time-consuming.”
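
Westervelt doesn’t specify the packing, but “remarrying” the eyes amounts to combining synchronized left-eye and right-eye frames into a single stream a stereo-capable player can run. A minimal sketch of one common packing, an over/under frame, is below; it assumes per-frame image files, whereas the show captured live HD streams straight into the server.

import cv2
import numpy as np

# Hypothetical frame files standing in for the two captured HD streams.
left = cv2.imread("left_eye_0001.png")
right = cv2.imread("right_eye_0001.png")

# Pack both eyes into one over/under frame: left eye on top, right eye below.
stereo = np.vstack([left, right])
cv2.imwrite("stereo_0001.png", stereo)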

The Acuity was an important piece of hardware in the VFX review process, too. “We started stitching together 3D assemblies fairly regularly,” says Marsh. “I got it down to a minor science using a FileMaker database and the Acuity. I made it jump through hoops it hadn’t been designed to jump through. But it worked, and it allowed us to get through our process very economically because we weren’t dependent on a bunch of other machines – DI systems that required three or four people to operate.” VFX vendors delivered QCC clips, which were dropped into the current cut of the film, married to audio, and reviewed in 3D.
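
Marsh doesn’t detail the database, but the bookkeeping problem it solved is easy to picture: track the newest delivered version of every shot, for both eyes, so the current cut can be rebuilt for each review. A toy sketch of that logic follows; every shot name, version number, and clip name is invented for illustration.

# All shot names, versions, and clip names below are invented for illustration.
current = {}  # shot -> (version, left-eye clip, right-eye clip)

def log_delivery(shot, version, left_clip, right_clip):
    """Record a vendor delivery, keeping only the newest version of each shot."""
    if shot not in current or version > current[shot][0]:
        current[shot] = (version, left_clip, right_clip)

def review_playlist(cut_order):
    """List the clips to load on the playback server, in cut order."""
    return [current[s] for s in cut_order if s in current]

log_delivery("shot_0410", 2, "shot_0410_v2_L.mov", "shot_0410_v2_R.mov")
log_delivery("shot_0410", 1, "shot_0410_v1_L.mov", "shot_0410_v1_R.mov")  # superseded
print(review_playlist(["shot_0410"]))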

How much more complicated did 3D make that process? “I’d say, on average, every shot went through at least three additional iterations just to dial in good 3D,” Marsh says. “Some vendors were better at that than others, and the ones who were better at it had taken some of our advice to heart and gotten large screening environments into their workflow. Working big reveals your errors early on. You might create something that, on a little screen, works perfectly in 3D. But then when you amplify that to the large screen, you realize just what your margin of error is. It’s micro-pixels. You might just need a tiny adjustment, but it makes all the difference in the world.”
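
Back-of-the-envelope arithmetic, with assumed numbers rather than figures from the production, shows why the big screen is so unforgiving:

# Assumed, illustrative numbers; not figures from the production.
h_res = 1920                    # horizontal pixels in the reviewed image
monitor_width_in = 20.0         # roughly a 23-inch desktop monitor
screen_width_in = 40.0 * 12     # a 40-foot theatrical screen

px_on_monitor = monitor_width_in / h_res  # about 0.01 inches of parallax per pixel
px_on_screen = screen_width_in / h_res    # about 0.25 inches of parallax per pixel
print(px_on_monitor, px_on_screen)

A viewer’s eyes are only about 2.5 inches apart, so a few pixels of stray divergence that are invisible on a desktop monitor can become genuine eye strain on a theatrical screen.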

In fact, the real problem with editorial wasn’t that it was taking place in 2D, but that it was happening in SD instead of HD. “We should have been cutting in HD. It would have made the whole post process easier,” says Westervelt.

Marsh agrees. “My one regret on Journey was that we chose to be in a standard-def metaphor for the show,” he says. “We had to go through two extra steps to see our work on a big screen in 3D. When we did the director’s cut, we conformed it in HD on a Nitris and then we brought in the other eye and basically redid all the temp visual effects work from one eye on the other eye. It was an extra step that had to be done to get us to screening. With the newer codecs from Avid and the cheaper storage, it isn’t necessary. If I were doing a 3D project now I would not do standard-def at all. In fact, I wouldn’t do any film in standard-def. We have the technology now so that HD is not a performance hit, and it gets you that much closer to reviewing your work on the big screen.”

[Image: A long climb]

Cutting for Story
Despite the intricate workflow – Marsh likes to quip, “3D is easy – just do everything perfectly twice” – Journey was cut, much like any other movie, on an Avid Adrenaline by editors who were thinking about story, not depth effects. There were a few considerations, however. “There are certain wide shots that you tend to hold on just a bit longer, because there’s so much more to look at,” Smith says. “Action sequences were cut just as fast [as in a 2D film]. If you put the image forward of the screen in one shot, when you cut to the next shot you wanted to make sure you weren’t flinging people’s eyes out to the side compared to when the image was behind the screen. There are ways of pulling convergence in post, but if you had a fast action sequence you became very gentle with all that stuff. Otherwise, people would be virtually throwing up.

“We do have the typical 3D trick shots, and someday we’ll all get over those,” Smith continues. “It’s very much like the train going toward the audience in the Lumières’ film. You do it once or twice and everyone is very excited, but they eventually get bored with it. We did play some of those games, believe me. But on the whole, the 3D experience is about the beauty and, more to the point, it’s unbelievable how – even me as the editor – you feel like you’re there. You’re out in the mountains of Iceland, and you feel like you’re there.”