Editors Joel Negron and William Goldenberg on Making a Michael Bay Film

Michael Bay knows what he wants from a Transformers movie: big effects, beautiful women, and a cocky kid who knows how to talk to giant robots. Working on that movie means getting into a Michael Bay frame of mind – understanding how all the disparate parts of the project come together, over time, into a two-and-a-half-hour burst of highly choreographed and very expensive metal mayhem. For editors, that means using the script and previsualizations as starting points, then building the picture scene by scene – even as evolving VFX shots open new directions in storytelling and the demands of 3D help dictate the editing rhythms.
“On other movies, the previs is pretty much a map showing how it all goes together,” says Joel Negron, who edited Transformers: Dark of the Moon along with William Goldenberg and Roger Barton. “On Michael Bay movies, it’s just an idea. He’ll go out and come up with ideas while he’s shooting plates, or while he’s looking at the footage. We put it together the best we can, gathering information from the script supervisor, from the VFX producers on set, and from Michael.”

Bay takes a look at the first pass at a scene and tells the team what he thinks, and then they tackle it again based on his feedback, but this time with some crude comps available from the VFX teams for cutting into the scene. Of course, it will change again based on that new material. Bay’s presence drives the process, which is updated with each new idea thrown into the mix. “One time,” Negron recalls, “we were looking at the blank plates and Michael said, ‘Do you see the robot there?’ And you kind of do, although you’re not really in his head. But [the scene] gets built as time goes by.”

Editors in Santa Monica, Director on Location

Negron and Barton have worked on various Bay projects for years; this was Goldenberg’s first outing with the director. Editorial work took place on Avid Media Composers at Bay Films in Santa Monica, even as Bay himself worked on location in Chicago. The whole project was shared from a single Avid Unity MediaNetwork across editorial operations in two different buildings, plus a third building housing the film’s 3D department, where the editors could look at sequences in stereo in a small screening room. First Assistant Calvin Wimmer “ran the [cutting] room,” said Goldenberg. “He is part post-production supervisor, part VFX editor, and part Michael Bay’s right-hand man. It’s hard to overstate how many hats he wears and how valuable he is to all of us.”

When Bay wasn’t in L.A., he was using a laptop with an external hard disk loaded up with the project media to stay tightly in the loop. “We used iChat screen-sharing a lot,” explains Negron. “We could share our screens, and Michael would run sequences and give us notes from Chicago. He could scroll around and we could see everything he was doing. It was so much better than talking on the phone. He would make his selects and we could transfer a bin instantaneously from iChat right into the Media Composer.”

“He was even able to do stuff on his plane,” says Goldenberg. “He has so many jobs to do as director on a movie of this size that we needed to save time and get the maximum use out of him that we could.”

Cutting for Stereo 3D

Michael Bay movies are known for a consistently fast cutting style – the director has publicly objected to that characterization, calling it an unfair stigma – but the use of stereo 3D for Dark of the Moon meant the editors were always aware of the need to slow things down a little. “We were constantly reminded: hold the shots, hold the shots,” Goldenberg recalls. “Instead of doing something in five cuts, do it in two cuts. We were very conscious of not having this movie be cutty. We knew it would be headache-inducing if it was.”

Truly breakneck editing speeds were out of the question, even in short bursts. “It takes about three seconds for your brain to realize [what it’s looking at on screen] and register the 3D effect,” explains Negron. “We were aware of that the whole time we were cutting. And we were always aware that people can get dizzy or get a headache with 3D. If it’s kind of half-assed, it hurts your eyes. So we dialed down a lot of the 3D to be almost 2D. You don’t really notice it, because you also have a lot of shots that are spectacular.”
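Negron’s three-second rule of thumb suggests a simple sanity check an assistant might run over a cut list. The sketch below is purely illustrative – it is not an Avid tool, and the function name and data shape are invented – but it shows the idea: flag any shot held for less than the minimum stereo-comfortable duration.

```python
# Illustrative sketch (not actual Avid tooling): flag shots in a cut list
# that are held for less than a minimum duration for comfortable stereo
# viewing. The 3-second figure is Negron's rule of thumb, quoted above.

MIN_STEREO_HOLD = 3.0  # seconds the brain needs to register the 3D effect

def flag_short_holds(shot_durations, min_hold=MIN_STEREO_HOLD):
    """Return indices of shots held for less than min_hold seconds."""
    return [i for i, d in enumerate(shot_durations) if d < min_hold]

# A hypothetical beat done in five cuts versus the same beat in two cuts:
five_cuts = [1.2, 0.8, 1.5, 0.9, 1.1]
two_cuts = [3.2, 3.1]

print(flag_short_holds(five_cuts))  # → [0, 1, 2, 3, 4] (every cut too fast)
print(flag_short_holds(two_cuts))   # → [] (both holds are stereo-safe)
```

In practice the editors applied this judgment by eye rather than by script, but the arithmetic is the same: fewer, longer cuts keep each stereo image on screen long enough to register.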

About 60 percent of the film was shot in 3D, but much of the footage needed to be post-converted and was coming into editorial in 2D. One of the editorial assistants wrote a plug-in that allowed 3D and 2D footage to be intercut seamlessly in the Avid’s 3D mode, which let the editorial team easily view scenes containing a mix of material on their monitors. “You really want to understand what’s happening 3D-wise so you know [whether] you want to use a particular shot, or hold it longer than you might otherwise, so having 3D monitors in the cutting room was really helpful,” Goldenberg says.
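The actual plug-in is proprietary, but the usual trick behind intercutting mono and stereo material is straightforward, and a minimal sketch can make it concrete: duplicate each 2D frame into identical left- and right-eye views, which places the shot at zero parallax so it plays back flat inside a stereo timeline. The function names and data shapes here are assumptions for illustration only.

```python
# Hedged sketch of the general technique (the assistants' actual Avid
# plug-in is not public): play 2D footage in a stereo timeline by
# duplicating each mono frame into identical left/right eye views,
# i.e. zero parallax, so it simply reads as flat.

def mono_to_stereo(frame):
    """Duplicate a single 2D frame into a (left, right) eye pair."""
    return (frame, frame)

def build_stereo_timeline(clips):
    """clips: list of ('2d'|'3d', payload) items, where a '3d' payload is
    already a (left, right) pair from the stereo rig. Returns eye pairs."""
    timeline = []
    for kind, payload in clips:
        if kind == "2d":
            timeline.append(mono_to_stereo(payload))
        else:
            timeline.append(payload)
    return timeline

mixed = [("3d", ("L1", "R1")), ("2d", "M2"), ("3d", ("L3", "R3"))]
print(build_stereo_timeline(mixed))
# → [('L1', 'R1'), ('M2', 'M2'), ('L3', 'R3')]
```

This is also consistent with Negron’s comment that much of the 3D was dialed down toward 2D: shots at or near zero parallax cut painlessly against spectacular deep-stereo shots.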

“Working in 3D on the Media Composer, there’s not much of a learning curve,” Negron says. “It’s just another option. The biggest thing is just thinking about the finished project and how it’s going to look in 3D. The workflow was really easy.”

Editing Around VFX That Change the Scene

Editing a VFX-heavy movie is one thing – you’re cutting action scenes that, often, don’t have any action in them. But editing a Transformers movie means the performances of your film’s most iconic characters don’t even exist yet. To put it another way, says Goldenberg, “You’re cutting an animated movie without any animation.”

Working with often-empty or close-to-empty live-action plates, the editors would do the best they could to figure out the action and spatial relationships in a given scene and cut the footage to match. But as shots came back from ILM and Digital Domain, the nature of the footage would change. “It’s like you’re still shooting” during the editing, Negron says. “You get things back from the VFX company and say, ‘That’s a new shot – where do I put it?'”

“You’re cutting the scene twice,” says Goldenberg, noting that the VFX houses have the status of creative collaborators on the Transformers movies. “They’re so clever about what they do and how they stage things. It’s not, ‘Put a robot here.’ They’re thinking of the story and how to make it better.”

As an example, Negron names the film’s Autobot character Sentinel Prime, voiced by Leonard Nimoy. In the early stages of editing, the team was working on rough shots with a temp voiceover recorded by one of the assistants on set. But once Nimoy started recording his lines, the VFX started showing up with his facial characteristics and movements incorporated into the face of the robot. “It was really, really cool to see how that character came to life,” he says.

Goldenberg cites the case of Bumblebee, one of the more popular characters in the franchise, partly because of his close relationship with the central human character Sam Witwicky (Shia LaBeouf). “The emotions are conveyed not so much in his facial expressions, but in what they can do with his eyes,” he says, remembering one close-up shot from late in the film where Bumblebee delivers a performance that, arguably, outstrips those of his human counterparts. “My brother-in-law really liked the movie. He called me from Chicago to tell me he teared up at the scene where Bumblebee almost gets shot in the head. To extract that much emotion from a scene like that is incredible.”

But it’s a fine line. At one point, the VFX artists experimented with adding a single tear to Bumblebee’s eye. It was too much – a sign it was time to dial the emotion back down.

Staying Flexible

Through the film’s relentless evolution, the editors found they had to be extremely flexible about which portions of the film they were working on. “We would take the scenes as they were ready,” Negron says. “Roger might say, ‘This scene’s ready. I’ll take it.’ And then I would take whatever else would come up. We had a folder called ‘To Be Cut,’ and you’d go to that folder and just take whatever scenes were there.”

Bay would occasionally assign a given scene to a particular editor, but the prevailing workflow was first come, first served. Even then, scenes would sometimes get handed off among the team. And that’s just one of the ways such a huge project came together coherently.

“Generally, when you do a multi-editor show, you get your scene and it’s your scene from beginning to end,” says Goldenberg. “With Michael, you always work in his room. You may do the first cut of a scene, and then you’ll hear the same scene playing in the other room – where somebody else is working on it. So everybody worked on everything. It took a little time to get used to, but we all trusted each other. You had different eyes on each scene, and ultimately it made the movie better.”

“That’s the only way when you’re working on a Michael Bay film,” agrees Negron. “The task is enormous. You have to just go with the flow and get in there. If you’re given a sequence, or someone else gets your sequence, that’s part of the process. It’s three guys using their combined experience to edit one movie, which is really great.”
