JC Bond, Additional Editor, on Wrangling 10,000 VFX Shots for 2D, 3D and IMAX

Working in the editorial department on Alice in Wonderland, JC Bond was the first point of contact for footage from principal photography, assembling scenes as quickly as director Tim Burton could shoot them. He was also the gatekeeper for the film’s extensive visual-effects content, which he received and analyzed before it was cut into the movie. As part of a team led by film editor Chris Lebenzon, ACE, Bond made sure thousands of pieces of content fell into place as expected in the editorial pipeline – in stereo 3D, no less. Film & Video asked him about the challenge of managing massive quantities of footage, and found out why Alice might look ever so slightly different depending on where you see it.
FILM & VIDEO: You’re credited as an “additional editor” on Alice in Wonderland. What were your responsibilities?

JC Bond: On a project this long, my responsibilities changed over time. During the shooting phase, my initial responsibility was to assemble the scenes as they were being shot. One of the advantages we had on this project was that the movie was shot digitally for the most part. The bookends, which take place in “the real world,” were shot on film. The stuff in Underland, or Wonderland, was all shot digitally. We were connected directly to the shooting stage through servers, and we were working about 20 minutes behind camera. Right after they wrapped a particular set-up, we would load it into our [Avid] Media Composers and start cutting. My main responsibility during that phase was to assemble the dailies as they came in and have a version of the scene for Tim Burton to look at.

So you were making a rough assembly.

Sometimes that became the assembly. It depended on the scene. Some scenes got worked quite a bit more afterward, and for other scenes, what we got on day one was it.

How did your job change during post-production?

During post, my main responsibility was to receive the VFX shots from our facilities, analyze them, see if the artists were doing what we asked them to do, and cut them into the movie and have them ready to present to Tim Burton. That’s basically the breakdown between our two phases. On a project this long – for me, the job was 17 months, but the shooting was only two and a half months. The rest was all post-production.

So it was an extended process.

In reality, on a project of this size, that was short. We would have liked another year. We would have settled for another six months. We got neither.

How did shooting the film in 3D affect your work?

The reality is that our movie is not shot in stereo 3D. It was shot in 2D and then converted during the VFX process. The majority of our movie happens in Wonderland. In Wonderland, we used computer-generated backgrounds and computer-generated environments so all we were shooting was elements. Our entire main shoot was a gigantic green-screen shoot. The stereo process happened in post-production. We converted every single shot. Because the environment was computer-generated, it was created in 3D, and the elements that were [shot in] 2D were post-converted. We did have to manage stereo media from the moment we started getting VFX back from the facilities, and that was a little over a year of handling. But not during the actual shoot.

So when you made those first assemblies, did you composite different elements together to get an idea of what the shot would eventually look like?

We had a very rough background that was created and manipulated during the shooting, to give Tim Burton an idea of where he was. We used that as a reference for our cutting, but only for reference. We cut with the actual green-screen elements. We used a virtual set. The set was created in pre-production and then there were tracking markers and sensors on both the camera and the actors. As the camera moved around the green-screen set, the virtual set moved around [the actors]. So we might have a green-screen element of the Red Queen with a temporary, computer-generated background that we could use as a live composite.

You have visual-effects editing on your resume alongside conventional film-editing jobs. Did that experience help you deal with the scope of this project?

Definitely. This project was two and a half months of real photography and over a year of visual-effects work. It was more visual effects than anything else. Cutting with green-screen, you have to keep in mind what’s possible and what’s not. Having that background was certainly helpful.

When you started making those assemblies, how did the stereo depth come into play? When did you first see shots in 3D?

Our initial pass was 2D. As soon as we had a 2D version that we were happy with, or that was even going in the right direction, we started looking at material in stereo. The first pass in stereo was a complete pass over the entire scene, and it was primarily to decide depth information – how big we wanted the set to be, how separated we wanted the elements to be. We were looking at all that in stereo, using the ability of our Media Composers to keep the left eye and right eye together and display them on several stereo-capable monitors in the cutting room. Obviously, a monitor can take you only so far, and from there we went to screening rooms.


What was your workflow like in terms of the configuration of the cutting rooms?

We had two cutting rooms – one in L.A., and one in London. The VFX houses were in L.A., so we had material being loaded into our systems in L.A., replicated to London, cut and worked on in London, and then transferred back to L.A. for feedback to the VFX facilities. For the shoot we were all in L.A., and then for the first half of post we were all in London. In the second half of post, for at least six months, I was in London while the main editor was in L.A. And we all converged in L.A. for the last three months of the project. But we collaborated and worked like we were in the same cutting room all the time. We had a 100 megabit connection between us, so updating the media took less than 20 minutes every day. Once it was updated for the day, sharing cuts was instantaneous. The moment I would close a bin, they would have it in L.A.

The ability to have all our materials synchronized between both Unity servers was instrumental. Obviously, this set-up is overwhelmingly expensive for smaller projects, but it was part of ours from the beginning. Because of the eight-hour time difference between London and L.A., when one cutting room was shutting down and going home, the other one was in full swing. There were only about two or three hours a day we were completely shut down. The rest of the time, we were working.
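As a rough sanity check on those numbers: if the 100 megabit link could be fully saturated, a sub-20-minute daily sync implies an upper bound of about 15 GB of new media moved each day. The minimal sketch below works that out; only the link speed and the sync window come from the interview, and full link utilization is an illustrative assumption.

```python
# Back-of-the-envelope bound on the daily L.A.-London media sync.
# Only the 100 Mbit/s link and the ~20-minute window come from the
# interview; full link utilization is an illustrative assumption.

LINK_MBPS = 100        # link speed, megabits per second
SYNC_MINUTES = 20      # stated upper bound for the daily update

# megabits/s -> megabytes/s, times seconds in the window, -> gigabytes
max_gb = (LINK_MBPS / 8) * (SYNC_MINUTES * 60) / 1000
print(f"Upper bound: ~{max_gb:.0f} GB of new media per day")  # ~15 GB
```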

What was the range of equipment in your cutting rooms?

At the height of the project, when we had the most systems running, we had two Avid Symphony Nitris systems, eight Avid Media Composer systems, and four software-only Media Composers, for a total of 14 systems.

Were the software Media Composers used for working on laptops?

No, they were on regular workstations. We used them mostly for exporting QuickTimes for other departments, like sound.

Did you edit in HD?

Yes. We were using DNxHD 36 for the 2D material and DNxHD 145 for the stereo material. We used a higher resolution on the stereo material because in Media Composer the current solution for stereo is to put both eyes into a single frame and compress it. Since that effectively halves the resolution, we wanted to start from a higher resolution.
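A minimal sketch of the frame-packing trade-off he describes, assuming a side-by-side layout and a 1920×1080 raster (the interview only says both eyes share one compressed frame; the layout and dimensions are assumptions):

```python
import numpy as np

def pack_side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Squeeze two full-resolution eyes into one frame by halving the
    horizontal resolution of each and placing them side by side.
    (Side-by-side layout is an assumption; the interview only says
    both eyes go into a single compressed frame.)"""
    assert left.shape == right.shape
    half_left = left[:, ::2, :]    # crude 2:1 horizontal decimation
    half_right = right[:, ::2, :]
    return np.concatenate([half_left, half_right], axis=1)

# Two hypothetical 1920x1080 RGB eyes -> one 1920x1080 packed frame,
# i.e. each eye effectively survives at only 960x1080.
left = np.zeros((1080, 1920, 3), dtype=np.uint8)
right = np.zeros((1080, 1920, 3), dtype=np.uint8)
print(pack_side_by_side(left, right).shape)  # (1080, 1920, 3)
```

The halved per-eye resolution is why a heavier codec (DNxHD 145 rather than 36) made sense for the stereo media.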


What was the biggest impact of 3D on your editorial workflow?

The sheer volume of material. On your average movie that you’d consider VFX-heavy, you’d have somewhere around 800 or 900 VFX shots. On a major VFX movie, you’ll have around 1,400 shots. We were just shy of 2,000 VFX shots. And this is on a shorter movie – it’s only 110 minutes. To have 2,000 VFX shots in that amount of time is quite a feat. I talked to friends of mine working on other projects who would have a couple hundred shots in their entire film. We would get a couple hundred shots to review on a daily basis. I had 130 to 200 shots every single morning to look at, cut in, and analyze. It was only 2,000 VFX shots, but we had 2,000 in 2D, then 2,000 in stereo IMAX and 2,000 in the regular stereo-3D version. Within the stereo versions, we had left and right eyes to look at and QC. You look at the shots in stereo playback, but you have to QC the eyes separately. And the 2D version of the movie was not just one of the two eyes. It was a separate version on its own. At the end of the day, when you add up stereo IMAX, regular stereo 3D, and 2D, we had close to 10,000 shots.
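The arithmetic behind that 10,000 figure works out as a sketch, following the breakdown he gives: one standalone 2D version, plus separately QC’d left and right eyes for each of the two stereo versions.

```python
# How ~2,000 VFX shots become ~10,000 items to review, per the
# breakdown in the answer above.

vfx_shots = 2000

versions = {
    "2D": 1,            # its own version, not just one reused eye
    "stereo 3D": 2,     # left eye + right eye, QC'd separately
    "stereo IMAX": 2,   # left eye + right eye, QC'd separately
}

total = vfx_shots * sum(versions.values())
print(total)  # 10000
```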

Were there different aspect ratios in the different versions?

No. We kept everything at 1.85. But the approach some movies have taken is to make the 2D version just one of the two eyes [from the stereo version]. We saw that to get the best 2D movie, we had to look at a shot in 2D and just analyze it in 2D. To make the 3D version of those shots, sometimes it was more convenient for that to be one or the other of the two eyes, but sometimes it was neither. It had to be somewhere in the middle. So we have a 2D version and a stereo version that are very close to each other, but they are not the same movie.

But it’s minor tweaks in terms of the camera’s perspective of different objects, not changes to the length of shots or edit points.

The differences are not creative differences. They’re slight differences of angle or framing. One of the things we did in a handful of shots was to remove some foreground elements from the stereo version, because we found them to be disturbing or jarring from cut to cut. The 2D version would have those foreground elements to make the shot look better, but the 3D version wouldn’t have them. Every version of the movie was analyzed on its own, even the IMAX version. There are slight differences between the IMAX version and the regular 3D version. We did not just do one movie. We did three complete movies, making sure each one was the best it could be.


What would be an example of the differences in the IMAX version?

An IMAX screen is so large that you barely see the edges of the screen on the left and right. You’re focused on the center of the screen, and those edges are way out on the periphery. In the regular 3D version, the entire screen is in your field of vision. So the IMAX version may have additional elements on the periphery that are not in the regular-theater 3D version. Virtually every shot was created in a 3D environment, so we had the ability to take an element out or put an element in or to change it slightly, moving it a little farther to the left or to the right.
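A quick geometric illustration of why the edges land in the periphery; the screen widths and viewing distances below are assumptions chosen for illustration, not figures from the interview.

```python
import math

def horizontal_fov_deg(screen_width_m: float, viewing_distance_m: float) -> float:
    """Horizontal angle the screen subtends at the viewer's eye."""
    return math.degrees(2 * math.atan(screen_width_m / (2 * viewing_distance_m)))

# Hypothetical figures: a 22 m IMAX screen seen from 15 m versus a
# 12 m conventional screen seen from 20 m.
print(f"IMAX:    {horizontal_fov_deg(22, 15):.0f} degrees")  # ~72
print(f"Regular: {horizontal_fov_deg(12, 20):.0f} degrees")  # ~33
```

At roughly 72 degrees, the IMAX screen extends past comfortable central vision, so its left and right edges fall into the periphery, while a conventional screen at roughly 33 degrees fits entirely within the field of view.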

A major distraction in 3D movies is the way objects that are close to the audience seem to be cut off at the edges of the frame.

Yes. That is more of a problem in your regular 3D theater, but not as much in IMAX.

Did the use of 3D affect the way the edit took shape? For instance, were there multiple iterations where a sequence would go back for a re-edit after the editorial team saw what the cuts looked like in stereo?

We were very cautious at the beginning of the process. We initially thought cutting for 3D was going to be more intrusive. But we found in general, to our surprise, that the only consideration was that we wanted to look at some shots longer. Once you started seeing some of these environments in 3D, they looked so amazing you wanted to see more of them. Under other circumstances, we might have cut out of a master shot sooner, but we wanted to stay on the master to look at the place. Other than that, we didn’t find any issues with cutting quickly, or cutting from one side to the other. At least in our experience, you get used to the fact that you’re looking at something in stereo. And then you just cut it like a regular movie. There’s no major difference beyond that. There are minor considerations. You can do some cheats in 2D, like crossing the line, that are a bit more jarring in stereo. But from a creative standpoint, you should try to avoid those things – even in regular 2D.