How Avid's ScriptSync Multiplies Storytelling Possibilities

Sometimes, obscure features of a piece of software can pay significant dividends to users who take the time to master them. In the case of nonlinear editing, one of the more powerful secret weapons is Avid Media Composer’s ScriptSync feature, announced with much ballyhoo at NAB 2007. Previously, Media Composer had a Script function, but it had to be painstakingly implemented by assistants who manually matched lines of dialogue with their corresponding takes. With ScriptSync, the software started doing the majority of the work, using speech-recognition algorithms to figure out which clips should be associated with different lines of dialogue from the script. With a new season of Showtime’s hit series Weeds set to premiere tonight, film editor David Helfand tells Film & Video how ScriptSync helps him tackle new episodes by tapping a six-year-strong archive of variant takes and line readings from previous seasons.

Film & Video: When did you start using script-based editing?

David Helfand: I was able to use it back in Media Composer v7 [introduced in 1998], when I was working on That ’70s Show, because the style of shooting for a multi-camera show is a little bit more conducive to manually setting up a script in a quick period of time. But starting with v2.7 [in May 2007], Avid incorporated Nexidia voice and dialogue recognition. When it became automated, it was suddenly practical to use on single-camera shows, where the scripts and coverage are more complicated. And that’s when I started to develop a workflow that fit the time constraints of episodic series television. Since I figured that out, I haven’t really used any other, more conventional approach to editing. Throughout most of my work on Weeds and any other shows I’ve done, multi-camera or single-camera, I’ve relied on ScriptSync 95 percent of the time.

So how do you set up for ScriptSync?

The manual set-up involves importing the text of the script – an ASCII text file that can be exported by Final Draft. You drag the clips out of a bin and stretch them out over the script in a way that visually mimics how it appears when you get a conventional line script from a script supervisor. Except it’s better. In the line script, if you have a certain scene or take that stops and restarts repeatedly, in any given take you might have line readings going back and forth three, four, or even 10 times. But on the written script page, you’ll only have a single line showing that this take covers this range of dialogue. With ScriptSync, you can lay all those lines out and see precisely how many times a line may have been uttered in a given take. I look at the script supervisor’s version for notes and comments, but I don’t use it at all when I’m editing.
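Avid doesn’t publish ScriptSync’s internals, but conceptually the automated step resembles matching a speech-recognition transcript of each take against the script’s dialogue lines, so every line ends up linked to the time ranges where it was spoken. Purely as an illustration – the function names, data, and threshold below are invented and have nothing to do with Avid’s or Nexidia’s actual implementation – a minimal Python sketch of that idea might look like this:

```python
# Hypothetical illustration only: Avid's ScriptSync/Nexidia internals are not public.
# This sketch fuzzily matches speech-recognition phrases from one take against script
# lines, building a map of which lines the take covers and at what times.
from difflib import SequenceMatcher

script_lines = [
    "Nancy, we have to get out of here.",
    "I'm not leaving without the money.",
    "Then we're both going to get caught.",
]

# Pretend output of a speech recognizer for one take: (start_sec, end_sec, text).
take_transcript = [
    (3.2, 5.0, "nancy we have to get outta here"),
    (7.8, 9.6, "i'm not leaving without the money"),
    (12.1, 13.9, "i am not leaving without the money"),  # a restarted line reading
]

def similarity(a: str, b: str) -> float:
    """Rough text similarity in the range [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Map each recognized phrase to its best-matching script line, if the match is strong.
coverage = {line: [] for line in script_lines}
for start, end, phrase in take_transcript:
    best_line = max(script_lines, key=lambda line: similarity(phrase, line))
    if similarity(phrase, best_line) > 0.6:  # arbitrary threshold for this toy example
        coverage[best_line].append((start, end))

for line, spans in coverage.items():
    print(f"{line!r} covered {len(spans)} time(s): {spans}")
```

In this toy version, a line with multiple time spans is the analogue of the restarts Helfand describes seeing laid out, pass by pass, in the Avid script view.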

And previously, an assistant had to go through manually and mark each line of dialogue?

Modifications still have to be made, but now it’s closer to 80 or 85 percent automatic, depending on the quality of the audio and the consistency of the dialogue.

How granular can you get in terms of labeling and organizing those takes?

You can colorize some of the lines in the script to make it a very precise, color-coded map that can give you a lot of information that you’re not going to get from the physical line script. Maybe on the third pass they did an alternate, improv joke or a different line. Maybe on the fourth pass they skipped a line, so I might use a red color to indicate that a line wasn’t covered on that particular pass. Maybe there was an off-camera joke, or an ad lib that wasn’t recorded on a certain character’s close-up, although it was done in a wide shot. I can recognize those different variations immediately as I’m looking at the script. If the director says, “I remember a really good line reading on the third or fourth pass of that take,” I can access all 10 passes without physically scanning through them at high speed. If you don’t have slates to cue you, you have to scroll through the footage while the director or producer is sitting behind you, waiting for you to find the right one.

What’s the ScriptSync workflow like?

I spend time during the first-cut stage working in conjunction with my assistant in a back-and-forth, ping-pong workflow. He prepares the script for the available scenes and I start working. I’ll modify it, fine-tune it, and make annotations to the Avid script that help me understand the footage. He’ll provide the next day of dailies and give me a script that’s very close to ready, and we go back and forth and perfect it as we go along. As a result, I have a very accurate and useful tool for going in and mining the footage for everything we might need.

What might that mean, specifically, for the show?

I’ve used ScriptSync numerous times to fabricate new lines of dialogue, moments or scenes by being able to access a script from two or three seasons ago. I might remember a certain line, word, or line reading. I can do a text search for certain words and, in minutes, tell my assistant not only what episode that word is in, but which original dailies – what tape, what reel, what tape number – and he can pull that out and digitize it. Within 15 to 30 minutes, he can reconstitute a scene from several years ago and give me access to dialogue I can use again to create a new moment in the current episode. That footage is no longer buried where you don’t think about it.
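The retrieval step Helfand describes – a text search that points the assistant to the exact episode, tape, and reel to redigitize – is, in spirit, a full-text search over archived per-episode script data that carries source metadata. There is no public API for this; the sketch below invents a JSON layout and field names purely to make the idea concrete:

```python
# Hypothetical sketch: searching archived, per-episode transcript metadata for a word,
# so an assistant knows which dailies (episode, tape, reel) to pull and redigitize.
# The JSON layout and field names are invented for illustration; this is not an Avid format.
import json
from pathlib import Path

def find_word(archive_dir: str, word: str):
    """Yield (episode, tape, reel, line_text) for every archived line containing `word`."""
    for path in Path(archive_dir).glob("*.json"):
        episode = json.loads(path.read_text())
        for clip in episode["clips"]:
            for line in clip["lines"]:
                if word.lower() in line["text"].lower():
                    yield (episode["episode"], clip["tape"], clip["reel"], line["text"])

if __name__ == "__main__":
    # Assumed directory of one JSON file per archived episode.
    for hit in find_word("script_archive", "Pilar"):
        print(hit)
```

The point of the sketch is only that once several seasons of dialogue are indexed against their source media, finding a single word or line reading becomes a lookup rather than a scrub through hours of dailies.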

Can you give an example of a change you were able to make on Weeds?

In the season premiere, Shane [Alexander Gould] has just killed Pilar [Kate Del Castillo], this Mexican businesswoman, a power mogul. They had to structure a storyline showing how Nancy [Mary-Louise Parker] escapes from the scene of the killing without her husband, Esteban [Demián Bichir], knowing what’s going on. They weren’t able to shoot those scenes because the actor playing Esteban was not available for this first episode. When we started editing, we realized how much his presence was missed. Having worked on the season finale last year, I remembered some footage from various scenes that would allow us to create a whole new scene. I dug into the dailies from the last episode and, repurposing footage from two entirely different scenes, created an interaction between two different characters that suggested the big Mexican crime boss was informed of this killing. I was able to do it quickly by going through the script and identifying takes I was thinking about.

Does that usually work for you? Do you have frustrating cases where you just can’t quite find the right material?

It works to the extent that anything might work in dialogue editing. In editing, you’re always trying to cheat things. Often you can put a word in someone’s mouth, or on the back of their head, and sneak it in. Maybe it’s a similar word. You’re doing that all the time. Or I can easily find variations of line readings and cut in certain inflections of that word. It comes down to basic dialogue-editing techniques. I can lasso a line of dialogue with the mouse and hear all of the variant line readings immediately, picking out certain syllables or inflections I can incorporate into the performance. I can access that same word from another episode, and that’s something the dialogue editors aren’t able to do.

Maybe it takes 10 percent or 20 percent more time from the assistant’s point of view, as opposed to conventionally preparing bins with the footage, but the payoff comes so often and in so many ways. You don’t need it all the time, but when you do need to go back, it’s great to have the ability to access six seasons of dailies as opposed to one episode. And it’s fun to see the first time a producer or director recognizes the power of it. It’s been frustrating to me that it hasn’t gotten farther along [in terms of industry adoption] than it has. I’m committed to it and I love it. Every editor who will allow me to bend their ear, I’ll talk to about it. To the extent that I’m kind of anal and I want to get every ounce out of the material that I can, the ability to reveal that stuff is critical.

Weeds Season Three Finale

In what other ways is your editorial skill set expanding?

At the end of season three, there’s a storyline where the whole town where Nancy lives burns to the ground. The director had a series of shots we were intending to treat with fire, showing how the town had burned down. In the course of editing, it occurred to me that the song in that sequence was similar to the music from our main titles, which were well-known for showing various shots of suburbia, where the characters live in little houses. My idea was to do a reverse main title: I took all those basic shots – we had fundamental back plates, with and without people – and used Boris FX fire plug-ins and the like to set the main title on fire, played against a version of our theme music, so that we not only burned the town down but symbolically brought a close to that part of the story. After that point we weren’t using the main title anymore. I was able to enhance what the director had shot with additional shots from the main title by using visual effects, and I presented this to the producers. They had never thought of it, but they loved the idea and it stayed in the episode.

When you’re going to have visual effects that interact with a scene, you’ll want to construct mock-ups just so you can get a sense of the timing. Some scenes this season take place at a fair. We were going to use set extensions with digital backgrounds and a CG roller coaster, but I thought, let’s see how this could work. I figured out ways to comp in stock footage of roller coasters in various scenes, just to see how it would interact, timing-wise. It showed that the visual-effects team didn’t necessarily need to composite those shots with CGI, and since they were overwhelmed anyway, it lightened their load.

Weeds is always a challenge because we’re bouncing back and forth, shifting the tone between dark drama and broad comedy, and we need to lighten moments and find ways to insert jokes into the show. There’s a very dark moment this season when a character is put into a trunk. It’s very scary and violent, but we started thinking of ways we could lighten it to help pull us out and into the next scene. Because I could composite something on the Avid, I was able to convert a very dark scene into a joke moment. That’s the kind of thing that helps you strike the fine balance between comedy and drama.