How Lit Made Stormy Weather for Frozen

When Tyler Hawes started his first independent DI company in 2004, the digital intermediate was still a relatively rare beast in Hollywood. “DI was just out of the experimental stages,” he recalls. “You would, perhaps, spend a million dollars on a DI. And yet I saw that independent films needed it even more, because they can’t control light on a set like you can with a major motion picture.” In 2006, Hawes built on experience he gained working on early DI projects, including Superman Returns, and co-founded Lit, with a focus on DI work for independent productions.
Lit started with a FinalTouch system from Silicon Color, then graduated in 2008 to a Nucoda Film Master system from Digital Vision, which he says has given Lit a lot more flexibility with VFX work as well as color grading. On a recent project, the theatrical feature Frozen, Hawes found himself making creative use of the Film Master in order to help the filmmakers “direct” the performance of a composited snowstorm that they couldn’t quite capture in camera.

Even though time spent with a DI system is much more expensive for clients than hours billed on traditional desktop compositing systems, Hawes says advances in speed and power are increasingly making the DI suite the more efficient place to work on certain kinds of VFX. F&V interviewed Hawes and colorist Eliot Milbourne to find out how the work was divided between the DI suite and the desktop.

Film & Video: When you say you work on “visual effects,” do you mean only visual effects that can be executed within the DI toolkit?

Tyler Hawes: No. Every year I do one or two films where I also act as the VFX supervisor, like in this case, Frozen. We have some top artists who do nothing but traditional VFX work on the desktop with software like Maya, Fusion, Nuke, and Shake. Our focus is mostly on photographic-type compositing. We don’t do a lot of CGI modeling. That’s not our specialty. We enhance gore, fix a location by changing the words on a sign, get rid of gaffes like wires and rigging and boom mics, paint out reflections and shadows. All that stuff is par for the course, but it needs to be done well, and if you do it right nobody knows you did it. That has become a natural extension of the DI process. Frozen is the best example of this, and maybe one of the very first of a new wave of workflows for film.

What’s changed now is the DI tools. A lot of visual effects that you used to have to do in a Flame and a Shake, we can take care of with our DI toolkit. As long as your colorist is not intimidated by that, it comes down to a simple ROI. A given shot might take me 15 to 20 minutes in a DI system, which is fast and powerful. Or I might have a VFX artist spend all day on that. If it’s that kind of equation, it’s a no-brainer to do it in the DI system. Even though the colorist talent is a lot more expensive – one hour of DI time is close to the price you pay for a full day, or half a day, of a VFX artist’s time – if you’re at least breaking even in terms of your cost, it’s a better way to work. You can get the results instantaneously. You can approve it creatively and get the shot done and move on to another one and go to sleep knowing the shot is finished.

Eliot Milbourne: We’re sitting here in a Truelight-calibrated theater with a client, doing those shots, and everybody can see them and sign off on them and feel good about how they look on a big screen. Traditionally, you have a VFX artist do it and bring it back [to the client] and hope it’s OK.

In Frozen, a lot of time was spent on adding snow effects. We’re adding up to six different plates of snow – different types of snow falling at different intensities. The snow becomes a performance. And when we layer all these things up, you may decide to adjust the skin of the main actor – add a window or a shape, or change his exposure so he stays visible. These are all interactive elements, so it seems natural to do it all in one place.

TH: When we look at shots that seem to be a visual effect, we have to make a judgment call. This shot we should do in the DI suite because it will be faster, better, and cheaper. Something about this other shot really needs a hardcore VFX-compositing package to do it just right. And then there’s still a third type of shot that we’ll start in the desktop tools and bring into the DI suite when it gets to a certain milestone. As far as the director is concerned, he’s still getting most of that interactive experience.

F&V: So as the VFX supervisor on Frozen, how did you work with the director to arrive at this kind of workflow?

TH: The director’s name is Adam Green, and the DP is Will Barratt. I’d worked with these guys several times before, on Spiral, Hatchet, and some shorts. The premise of Frozen is that three friends get stuck on a ski lift and have to figure out how to survive with a blizzard coming. There was some talk that I should go along to the shoot in Utah and do some on-set supervision, but they thought, “There really isn’t any VFX work in this movie – maybe a few wires to paint out when people are climbing on cables.” Now we’ve got about 270 VFX shots in the film. That’s a lot for a film where they thought they’d only have a handful.

The script called for some severe weather, and they had real severe weather out there. The problem is, severe weather comes, hits in a flurry, and goes away. And since most of the movie took place at night, it just doesn’t show up. They’re shooting actors 50 feet up in the air on a ski lift, so they’ve gone to all these lengths to make it look real, and the irony is that it’s really blizzarding and you can’t see it. It looks like they’re on a black stage. I was getting phone calls from the set saying, “This is going to be a problem.” Someone suggested we build a particle system for CG snow. But this isn’t The Day After Tomorrow. We didn’t have $80 million to spend to make it snow throughout the movie. We had to think of something. The new workflow with the Film Master made it possible.

F&V: Did you know you’d be able to pull it off?

TH: In my head, I knew it made sense. But, honestly, I was a little bit afraid it might not turn out at a Hollywood level of cinematic quality – I was worried we’d make something that was an improvement to the film, but maybe only TV quality instead of film quality. That was my biggest concern.

We counted the shots. There turned out to be 130 shots where we needed to make it snow, make it hail, make it blizzard, make it windy, or some combination of the above. That represented close to 20 minutes out of the film.

So our DI system allows us to create multiple layers of footage and composite them together. We already use those tools, and we can isolate a person’s face with a shape that tracks them. We decided, let’s use the same tools to control where the snow can and cannot go, and to adjust the transparency of the snow, speedups and slowdowns, and de-focusing. I realized we could take multiple layers of snow and build them up in the system, adjusting transparency, color, density, focus, and speed. We could layer it up to create something convincing.
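As an aside for readers who want to picture the mechanics, the rough sketch below shows one way to think about layering snow elements photographed against black (as Hawes describes shooting them later in the interview), with per-layer opacity, defocus, and a tracked matte. It is only an illustration of the idea: every function name and parameter here is invented for the example, and the Film Master does this interactively on dedicated hardware.

```python
# Hypothetical sketch: building up a snow "performance" from element
# plates shot on black. Names and parameters are invented for
# illustration; this is not how the Film Master is actually driven.
import numpy as np
from scipy.ndimage import gaussian_filter

def screen(base, element):
    """Screen-composite an element shot on black over the base plate."""
    return 1.0 - (1.0 - base) * (1.0 - element)

def add_snow_layer(plate, snow, opacity=1.0, defocus=0.0, matte=None):
    """Add one snow element: optionally defocus it, scale its density,
    and limit it with a matte (e.g. a shape tracked to an actor)."""
    elem = snow.copy()
    if defocus > 0:
        elem = gaussian_filter(elem, sigma=(defocus, defocus, 0))
    elem *= opacity
    if matte is not None:            # 0..1 matte, 1.0 = snow allowed here
        elem *= matte[..., None]
    return screen(plate, elem)

# Example usage (frame and the snow plates are float RGB arrays in 0..1):
# out = add_snow_layer(frame, light_snow, opacity=0.4, defocus=2.0)
# out = add_snow_layer(out,   heavy_snow, opacity=0.8)
# out = add_snow_layer(out,   hail,       opacity=0.6, matte=actor_holdout)
```

Screening an element shot against black is convenient because the black background contributes nothing to the composite, so the snow itself needs no key; the “performance” then lives in the per-layer opacity, speed, and defocus choices.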

F&V: Did you use any live-action plates?

TH: We shot some snow-machine snow in front of black backgrounds. And, because the DI system is so powerful, my hope was we could actually sit here with the director and, in a few minutes, go through a shot and build up exactly the “performance” he wanted. We did 130 shots in three and a half days. That’s an order of magnitude more than you could do with a VFX artist. That was either me or Eliot working at different times on one system. If you loaded up one compositing artist with that, it would be more like three and a half weeks. Irrespective of budget, this was the right approach. They could have had a lot more money, and this still would have been the best thing they could do.

On other shots, we started with traditional VFX. Some people in the film get eaten by wolves, and they needed more gore. In another scene, somebody’s hands get cut up and we needed to add more gore. [VFX artist] Christopher Grandel did all the hard work on the desktop, getting the gore in place and making the blood run. Then he gave it to us to cut into the system with alpha channels and things. The director would look at the effect and think it wasn’t quite there. And we knew what wasn’t quite there was the color and texture of things. We could continue Christopher’s work where he left off. We would adjust the intensity, color and texture of the effect. You could do it in a few moments and the director would be sold on it.
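The hand-off Hawes describes, a desktop-rendered element delivered with an alpha channel and then re-balanced in the suite, boils down to two simple operations. The sketch below is a rough illustration under that assumption; the function names and values are hypothetical, not the Film Master’s actual controls.

```python
# Hypothetical sketch: cutting in a pre-built VFX element via its alpha
# channel, then trimming its color interactively. Illustration only.
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Straight-alpha 'over' composite of an element onto the plate."""
    a = fg_alpha[..., None]
    return fg_rgb * a + bg_rgb * (1.0 - a)

def rebalance(rgb, gain=(1.0, 1.0, 1.0), saturation=1.0):
    """Per-channel gain plus a saturation trim, standing in for the
    'intensity, color and texture' touch-ups made with the director."""
    out = rgb * np.asarray(gain)
    luma = out @ np.array([0.2126, 0.7152, 0.0722])
    return luma[..., None] + (out - luma[..., None]) * saturation

# comp = over(rebalance(gore_rgb, gain=(1.1, 0.95, 0.95)), gore_alpha, plate)
```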

F&V: How does the decision to move that shot into the DI get made, budget-wise?

TH: On VFX, you’re flat-rating a lot of these things. There’s a certain budget for a shot and it becomes really difficult to go back to the producers and say, “The director wants this to look better and we need more money.” But we don’t ever want a shot to go out without looking perfect. We can make everyone happy with it in a few minutes here [in the DI suite] rather than going back to the artist to spend a day doing another iteration.

EM: A lot of people doing dirt removal and grain reduction use several machines working on a tape-to-tape basis. We did all that work in a controlled way on the DI master. We could completely dust-bust and get rid of scratches and things using the Film Master’s advanced toolset. When we change exposure radically, the grain of the film becomes a lot more apparent. We were able to use advanced motion-estimation algorithms to calm down that film grain and also add sharpness on certain key shots. And the last thing we had on this system was Gen Arts Sapphire plug-ins. They wanted to make one actor’s POV shot seem a bit more strange, so we added camera shake and vignetting.
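For the curious, the sketch below gives a toy picture of the finishing passes Milbourne lists: motion-compensated grain reduction, a sharpening pass, and the camera-shake and vignette effects. It assumes the neighbouring frames have already been motion-warped onto the current frame; that step, and every name and value below, are placeholders rather than Digital Vision’s or Gen Arts’ actual algorithms.

```python
# Toy illustrations of the finishing passes mentioned above. All names,
# defaults, and the assumption of pre-warped neighbour frames are
# hypothetical; they only gesture at the underlying techniques.
import numpy as np
from scipy.ndimage import gaussian_filter, shift

def temporal_denoise(prev_warped, cur, next_warped, strength=0.5):
    """Calm film grain by blending the frame toward the average of its
    motion-aligned neighbours (the warping itself is assumed done)."""
    neighbour_avg = 0.5 * (prev_warped + next_warped)
    return cur * (1.0 - strength) + neighbour_avg * strength

def unsharp(frame, amount=0.3, sigma=1.5):
    """Simple unsharp mask to restore apparent detail on key shots."""
    soft = gaussian_filter(frame, sigma=(sigma, sigma, 0))
    return frame + (frame - soft) * amount

def camera_shake(frame, frame_index, amplitude=6.0, seed=0):
    """Jitter the frame by a small pseudo-random offset per frame."""
    rng = np.random.default_rng(seed + frame_index)
    dy, dx = rng.normal(0.0, amplitude, size=2)
    return shift(frame, (dy, dx, 0), mode="nearest")

def vignette(frame, strength=0.4):
    """Darken toward the corners to push a POV shot a little stranger."""
    h, w = frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.sqrt(((xx - w / 2) / (w / 2)) ** 2 + ((yy - h / 2) / (h / 2)) ** 2)
    falloff = 1.0 - strength * np.clip(r, 0.0, 1.0) ** 2
    return frame * falloff[..., None]
```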

TH: In one scene, the chair lift falls off the cable at the top. That was a combination of practical VFX work and then DI work. We went and got the real cable they used to hold up that lift, stretched and snapped it, and recorded that on a Red camera. Christopher Grandel took that footage, composited it in, and had the cable snapping on screen. Then it was a matter of coloring it in the DI to get it to blend into the environment. But then the scene didn’t feel as tense as everyone had imagined. It didn’t have a feeling of panic. It was really simple for us to use the Gen Arts Sapphire plug-ins to create a camera shake – a closeup of an actress’ eyes, a reverse on the cable stretching and snapping, and then the chair lift is falling. Camera shake itself is nothing new, but it’s one of those things where being able to do it in the DI suite meant someone could make the suggestion.

If you’re in a DI suite, looking at things on the big screen, you’re discovering all kinds of new problems and opportunities. You’re not going to take advantage of them all. You’re making judgment calls. But the higher the barrier to entry, the more likely you’ll pass on the opportunity, and then the film won’t be as good as it could have been. In the DI, you’re more free to experiment and explore and do a little bit of trial and error. That’s another way it’s making movies better – giving directors the ability to explore. If you’re able to work faster and get more done, those moments of inspiration get magnified.