"Let’s see what it looks like if we make this scene a little warmer,"
John Schwartzman, ASC, says to senior digital colorist Stephen
Nakamura. If Nakamura were a timer at a film lab, Schwartzman would
probably be specifying printer lights, an objective language that leaves
no room for interpretation. This process is currently more ethereal. The
colorist pushes a few buttons and the digital images projected on the
wide screen become slightly warmer. "A little more yellow," Schwartzman
says. A few more buttons, and then, "That’s it."
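For readers who have never sat with a lab timer: printer lights are red, green and blue values on a numbered scale, and each point is a fixed step in exposure. The sketch below shows the arithmetic, assuming the common convention of 0.025 log exposure per point (roughly 1/12 of a stop); the exact increment is a lab calibration, so treat the constant as an assumption.

```python
# Hedged sketch of printer-point arithmetic. Assumes the common lab
# convention of 0.025 log10 exposure per point (~1/12 stop); the exact
# step varies from lab to lab.
LOG_EXPOSURE_PER_POINT = 0.025

def exposure_factor(points):
    """Linear exposure change produced by a printer-light offset."""
    return 10 ** (points * LOG_EXPOSURE_PER_POINT)

print(exposure_factor(12))  # 12 points ~ one full stop (about 2.0x)
```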
The cinematographer was recently timing Seabiscuit in a digital suite
at Technique, in Burbank, CA. The digital intermediate method is
advanced technology, but there’s still plenty of subjectivity.
Schwartzman explains that the digital images projected on the screen
aren’t an absolute match for the film that will be recorded out at the
end of the process.
"The film will look better," he says, explaining that there are nuances
in colors, textures and tones that currently can’t be replicated with
digital projectors. He and Nakamura make that translation in their
minds. Schwartzman also occasionally asks for a scene to be
recorded out onto film, often followed by a little more tweaking.
Digital intermediate technology, in a nutshell, is the process that
allows feature films to be scanned and converted to digital files that
are manipulated in ways that aren’t practical or even possible with
traditional optical techniques.
Cinematographers can create unique looks to accentuate the emotional
content of shots, scenes or entire films. It’s not unlike using ENR or
other proprietary bleach-bypass processes offered by different film
labs.
Cinematographers can also isolate anything in any frame, including
faces, places and objects. Maybe they want the colorist to isolate the
sky and make it a little darker to visually punctuate a mood or to
ensure continuity with elements of the same scene photographed on
cloudier days. Perhaps they’ll ask the colorist to isolate a face and
make someone’s skin tones subtly softer or to put a gleam in someone’s
eyes. It’s a little like watching them paint by remote control.
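Mechanically, that kind of isolation is matte-and-blend arithmetic: build a soft-edged mask, grade a copy of the frame, and mix the two. The numpy sketch below shows the principle only; it is not any facility's software, and the crude blueness key stands in for the matte a colorist would actually draw by hand.

```python
import numpy as np

def blend_correction(image, corrected, mask):
    """Mix a corrected frame back in only where the matte is near 1.0.

    image, corrected: float RGB arrays in [0, 1], shape (H, W, 3)
    mask: float array in [0, 1], shape (H, W); soft edges hide the seam
    """
    m = mask[..., None]                      # broadcast the matte over RGB
    return image * (1.0 - m) + corrected * m

def crude_sky_mask(image):
    """Illustrative key: pixels much bluer than they are red read as sky."""
    r, b = image[..., 0], image[..., 2]
    return np.clip((b - r) * 4.0, 0.0, 1.0)

# Darken the sky a touch without touching anything else:
# frame = blend_correction(frame, frame * 0.85, crude_sky_mask(frame))
```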
The truth is that this is still the dawn of the age of digital
intermediate technology. The concept has its roots in the invention of
the Rank-Cintel telecine, which first showed up in North American
post-production facilities during the late 1970s. The telecine combined
advanced flying-spot scanner technology with a gentle film transport
that made
producers feel more comfortable with the notion of using the original
negative rather than a print or interpositive, which was the common
practice with optical film chains.
Those two breakthroughs resulted in a quantum leap forward in the
quality of film images displayed on TV screens. Suddenly, nuances on
the negative were discernible to audiences. It didn’t take long for
cinematographers to experiment with the possibilities. In the early
1980s, Daniel Pearl was the go-to guy for horror films in the wake of
his innovative work on The Texas Chainsaw Massacre. He was contacted by
Russell Mulcahy, an Australian director who asked Pearl if he was
interested in working on a music video.
Mulcahy told Pearl he needed 65 setups of Supertramp performances. The
video was called "It’s Raining Again." He told Pearl to light as
artfully as possible in the available time, but not to worry, because he
could fix anything in telecine. That was a radical concept. Within a
few years, Pearl was routinely photographing the unphotographable. His
experiments included using DayGlo paint makeup combined with
low-intensity black light to record images that could be enhanced in
telecine.
ONGOING SERIES
Look for The Intermediators, a series focusing on how working in HD and
high-res data is changing the job descriptions and working
relationships among DPs, editors, compositors, directors, colorists,
post supervisors and others, in upcoming issues. Find Pt. 1 in 6/03
Film & Video.
By the mid-1990s, almost all dramatic film production, commercials and
music videos produced for television were being timed and post-produced
in telecine suites. The first breakthrough in the feature film realm
occurred in 1998, when writer-producer-director Gary Ross explored new
territory during the production of Pleasantville. Ross conceived and
wrote the script before he figured out how to produce the film. In the
opening scene, two contemporary teenagers watching a 1950s vintage
black-and-white TV program are magically zapped into that world with
splashes of color occasionally used to make story points. Ross
considered shooting in black and white, but the negative was too grainy
for his taste.
Visual effects supervisor Chris Watts and color effects designer
Michael Southard suggested an alternative: Shoot color film, scan it
into digital format and desaturate the images. They explained that the
digital files could be recorded directly onto 35mm color intermediate
film used as a master for making release prints. Kodak had introduced
Lightning film scanners and recorders in 1991, but Watts and Southard
calculated that it would have been prohibitively costly to scan and
record the original film at its full 4K resolution. As an alternative,
they scanned the film at 2K using a Thomson Spirit DataCine coupled
with special look-up tables developed by Kodak scientists.
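Kodak's Pleasantville tables were proprietary, but the job a look-up table does is easy to demonstrate. The sketch below substitutes the published Cineon 10-bit log convention (white at code value 685, black at 95, 0.002 density per code, 0.6 negative gamma); once the table is precomputed, converting a frame is a single indexing operation.

```python
import numpy as np

# Stand-in for a scanner LUT: the published Cineon 10-bit log-to-linear
# convention. Kodak's actual Pleasantville tables were proprietary.
BLACK_CV, WHITE_CV, DENSITY_PER_CV, NEG_GAMMA = 95, 685, 0.002, 0.6

def log_to_linear(code_values):
    cv = np.asarray(code_values, dtype=np.float64)
    lin = 10 ** ((cv - WHITE_CV) * DENSITY_PER_CV / NEG_GAMMA)
    black = 10 ** ((BLACK_CV - WHITE_CV) * DENSITY_PER_CV / NEG_GAMMA)
    return np.clip((lin - black) / (1.0 - black), 0.0, None)

LUT = log_to_linear(np.arange(1024))   # one entry per 10-bit code value

def apply_lut(frame_10bit):
    """Convert a 10-bit log frame to linear light by table lookup."""
    return LUT[frame_10bit]            # numpy fancy indexing does the rest
```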
Cinematographer John Lindley, ASC, orchestrated lighting, camera
movement and composition that established the basic visual grammar. He
also advised Southard on how much to desaturate the images without, he
said, "taking all of the oxygen out of the film." Lindley made a
prescient observation: "There is a new player in town who can affect
contrast, brightness and all the things that the cinematographer
normally controls. I was blessed that Chris has a creative aesthetic
and he respected my work."
" Ethan and Joel favored a dry, dusty Delta look with golden sunsets.
They wanted it to look like an old, hand-tinted picture with the
intensity of colors dictated by the story, and natural skin tones."
It was a road map to the future. The next milestone came two years
later, when Roger Deakins, ASC, BSC, took a painterly approach to
finishing O Brother, Where Art Thou? in a digital suite at Cinesite in
Hollywood. It was his fifth collaboration with the
writer/producer/director team of Joel and Ethan Coen. The story was set
in Mississippi during the 1930s. Deakins described it as a fable set
against a real-world background.
They were shooting in Mississippi during the summer with more than half
of the story slated to be daylight exteriors. Deakins knew from
experience that the foliage would be lush green. He tested a
bleach-bypass process, but it didn’t provide the flexibility needed for
him to selectively desaturate greens while creating the hand-tinted
postcard look that the Coen brothers wanted. They decided to time the
film digitally.
"Ethan and Joel favored a dry, dusty Delta look with golden sunsets.
They wanted it to look like an old, hand-tinted picture with the
intensity of colors dictated by the story, and natural skin tones."
It wasn’t a totally smooth ride. Deakins spent some ten weeks at
Cinesite fine-tuning the film’s look after the negative was locked down
and converted to digital files. It was a learning process. Deakins
discovered that if he tried to do too much image manipulation, it
created noise and electronic artifacts that would have been
distracting. In the end, he settled for mainly toning down the greens
and playing with the overall saturation.
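Mechanically, "toning down the greens" means desaturating a band of hues and leaving the rest of the spectrum alone. Here is a plain-Python sketch of the idea; the hue window and strength are invented for illustration and are not the settings used on O Brother.

```python
import colorsys  # stdlib; clear for a sketch, far too slow for real frames

def desaturate_greens(pixels, amount=0.6, center=1/3, width=0.12):
    """Pull saturation out of green hues only.

    pixels: iterable of (r, g, b) floats in [0, 1]. On a 0-1 hue wheel,
    pure green sits at 1/3. The window and falloff are illustrative guesses.
    """
    out = []
    for r, g, b in pixels:
        h, l, s = colorsys.rgb_to_hls(r, g, b)
        weight = max(0.0, 1.0 - abs(h - center) / width)  # 1.0 at pure green
        out.append(colorsys.hls_to_rgb(h, l, s * (1.0 - amount * weight)))
    return out

print(desaturate_greens([(0.2, 0.8, 0.2), (0.8, 0.2, 0.2)]))  # green fades, red holds
```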
Despite the technical limitations, O Brother, Where Art Thou? was a hit
at the box office and an aesthetic success. Some 20 to 25 U.S. theatrical
films have subsequently been ushered through digital intermediate
processes, mainly at EFilm (a joint venture of Panavision and Deluxe
Labs), Technique, and Cinesite, though other labs and post facilities
are also entering the field.
There have been some bumps in the road. Traditionally, cinematographers
move on to the next project after completing original photography.
Final color timing at film labs is typically done over a weekend or
two, mainly for the purpose of ensuring shot-to-shot and scene-to-scene
continuity. They work with the same color timers, picture after
picture, and have a very specific common language expressed in printer
points. Digital intermediate technology is currently more
time-consuming, and it is also more of an interactive, collaborative
process involving the colorist and cinematographer. That scenario
raises questions. What happens if the cinematographer is off shooting
his or her next film and isn’t available? Studios haven’t traditionally
paid cinematographers for timing films at labs. Is the more
time-consuming digital intermediate process going to change that
dynamic?
Those and other questions were fodder for discussion during a seminar,
filled to capacity, at the recent Cine Gear Expo conference on the
Universal Studios lot in Los Angeles. Panelists included Russell
Carpenter, ASC (Charlie’s Angels), Steven Poster, ASC (Stuart Little 2),
Denis Lenoir, ASC, AFC (Demonlover), Alar Kivilo, ASC, CSC (Hart’s War),
and Deakins (Intolerable Cruelty), with Richard Crudo,
president of the American Society of Cinematographers, moderating.
Poster reported that he recently tested new color management software
that could provide a more objective way for cinematographers and
colorists to communicate.
"That has to be the way of the future," Crudo observed. "It is
important for cinematographers to be in control of image manipulation
in the digital suite because they are the authors of the look. It has
to be seen as an extension of our role."
" Kevin wanted a realistic look, and this story takes places at a time
and place where the only artificial light came from things like
campfires and lanterns. In a way, it was like using Photoshop to fine
tune still images."
Deakins said he was impressed by the progress that has been made since
he digitally timed O Brother. Intolerable Cruelty was another Coen
brothers film. This time, Deakins worked with colorist Steve Scott at
EFilm.
"We were looking at the images projected on a decent-sized screen," he
says, "and that makes a difference. It was a pretty good representation
of what was filmed out, although it’s not quite there yet. After I saw
the film-out, I did some re-timing and put a bit more contrast into the
highlights and increased the saturation a bit. Those are subtleties,
but I believe audiences instinctively interpret and react to those
visual clues."
Deakins estimates that he spent approximately two weeks in the digital
suite, but notes that it was a much less complex endeavor than O
Brother. Deakins would prefer to see images scanned at 4K resolution
and recorded back out onto negative film with an Estar base rather than
color intermediate film. He believes that would foster a startling
improvement in the quality of images projected on cinema screens.
Open Range was James Muro’s first feature film credit as a
cinematographer. It was also his first experience with timing a film in
a digital mastering suite. Up until now, Muro has been a top-tier
camera and Steadicam operator. Open Range is a Western directed by and
starring Kevin Costner. It is a mainly exterior movie composed in
widescreen Super 35 format. It was a relatively low-budget film, but
Muro says it was an obvious candidate for the digital intermediate
process.
They shot tests during preproduction and settled on Cinesite, where Muro collaborated with colorist Marc Weilage.
"You can’t take badly exposed film and make it look good," Muro
cautions, "but digital timing gives you incredible flexibility. In one
big exterior, we used Power Windows in four different places in one
shot. We darkened the sky a bit for continuity, and took a little
sunlight off a building on the left side of the frame. We could have
done that with some flags and grip work while we were shooting, but it
would have taken too long and we would have lost the sun. We also
brightened the highlights on characters on the right side of the frame,
because Kevin wanted to direct the audience’s attention to them."
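A Power Window is da Vinci's term for exactly this: a soft-edged geometric matte that confines a correction to one part of the frame. The numpy sketch below shows the principle with a feathered rectangle; the window positions and gains are invented for illustration, not taken from Open Range.

```python
import numpy as np

def soft_window(shape, x0, x1, y0, y1, feather=0.05):
    """Rectangular matte: 1.0 inside, 0.0 outside, feathered edges.
    Coordinates are fractions of frame width/height."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    xx, yy = xx / w, yy / h
    def ramp(t, lo, hi):
        return np.clip(np.minimum(t - lo, hi - t) / feather + 0.5, 0.0, 1.0)
    return ramp(xx, x0, x1) * ramp(yy, y0, y1)

# Two of the four corrections Muro describes, with made-up numbers:
# sky = soft_window(frame.shape[:2], 0.0, 1.0, 0.0, 0.35)
# frame *= 1.0 - 0.15 * sky[..., None]              # darken the sky a bit
# right = soft_window(frame.shape[:2], 0.6, 0.95, 0.3, 0.8)
# frame = np.clip(frame * (1.0 + 0.1 * right[..., None]), 0, 1)  # lift highlights
```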
Muro notes that there are about 110 visual effects shots in the film,
mainly simple things like wire and rig removal, so much of the negative
was going to be converted to digital format in any case. He was
budgeted to spend approximately 100 hours in the digital suite with
Weilage. Muro says it was an easy collaboration.
" Kevin wanted a realistic look, and this story takes places at a time
and place where the only artificial light came from things like
campfires and lanterns," Muro says. "In a way, it was like using
Photoshop to fine tune still images. I’d tell Marc what I wanted, he’d
make some suggestions, and we’d look at it together on the screen."
John Schwartzman was in Texas shooting The Rookie when he spent an
entire Saturday reading a best-selling non-fiction book written by
Laura Hillenbrand about a racehorse named Seabiscuit. "I began reading
it at 10 a.m. and couldn’t put it down until I finished at 3 a.m. on
Sunday," he says. "I told John Lee Hancock (who was directing The
Rookie) about it the next day and suggested that he get the movie
rights. It was too late. Gary Ross had bought them three years earlier,
when the story appeared in a magazine."
About three weeks after The Rookie opened, Ross called Schwartzman and
asked if he was interested in collaborating with him on Seabiscuit. It
was pure serendipity. Ross had seen The Rookie, and he liked the
naturalistic look. They clicked right away.
"Gary is a fantastic screenwriter," Schwartzman says. "His script was
a beautiful adaptation of the book. It is very emotional without being
overly sentimental. We prepared by spending twelve weeks together,
eight hours a day, talking about the script and making shot lists that
filled a 250-page notebook describing what every scene felt like."
They made a quick and easy decision that Seabiscuit had to be composed
in wide-screen format, partially because that perfectly matched the
dimensions of horses, and also because the story cried out for
cinematic scope images. The main question was whether they should shoot
in anamorphic or Super 35 format with spherical lenses. There wasn’t an
obvious answer. Schwartzman wanted to use spherical lenses, partially
because they are faster and would allow him to compose with wider
angles. The problem was that the Super 35 format requires an extra
optical step at the lab for "squeezing" the images on the negative into
a 2.4:1 aspect ratio. In a digital intermediate, the images are scanned
to digital files and squeezed into the widescreen format in the computer
rather than optically.
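In code, the digital version of that step is just a crop and a resample: extract the 2.39:1 "scope" area from the full Super 35 scan, then squeeze it 2:1 horizontally for the anamorphic film-out. A minimal sketch, assuming a vertically centered extraction and an even frame width:

```python
import numpy as np

def extract_scope(frame, aspect=2.39):
    """Crop the widescreen extraction out of a full Super 35 scan.
    A vertically centered extraction is assumed; the real framing line
    is a choice made by the production."""
    h, w, _ = frame.shape
    crop_h = int(round(w / aspect))
    top = (h - crop_h) // 2
    return frame[top:top + crop_h]

def squeeze_2to1(scope):
    """2:1 horizontal squeeze for the anamorphic film-out. Averaging pixel
    pairs stands in for a proper resampling filter; assumes even width."""
    return (scope[:, ::2] + scope[:, 1::2]) / 2.0
```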
Schwartzman and Ross decided to shoot a series of tests with five
racehorses. They ran the Super 35 test footage through digital
intermediate processes at Cinesite, EFilm and Technique, and also made
an optical "squeeze" at Technicolor. They preferred the image quality
of all three digital processes to the optical test.
"All three facilities did a wonderful job with impressive picture
quality," Schwartzman says. "I wasn’t comparing them. We just wanted a
sense of the possibilities and what it was like working with different
colorists. Based on that test, we decided to produce Seabiscuit in
Super 35 format, and to create the digital master at Technique. Release
printing was done by Technicolor."
It was Schwartzman’s first experience with timing a motion picture in a
digital suite. However, to put that in perspective, he has spent
literally hundreds of hours in telecine suites fine-tuning looks for
some 300 music videos and many commercials.
Nakamura used the printer lights on Schwartzman’s timed film dailies as
a visual reference. Schwartzman says he felt that he was "98 percent
there" during his first sessions with the colorist. "I felt like I was
more of a participant than I am while timing at a film lab because it
is so much more of an interactive process," he says. "You can tell the
colorist to make small adjustments and see them right away and compare
slightly different takes."
Schwartzman estimates that it took approximately five days to time the
film digitally. "There has been a lot of progress in the look-up tables
and other software since Gary did Pleasantville," he says. "I could do
things that aren’t possible photochemically. For example, there are
shots where I made changes in both the highlights and shadows. I
consider Stephen a film timer who happens to work with computers."
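That is the practical difference from printer lights, which move the whole frame's exposure at once: digitally, shadows and highlights get separate handles. Below is one common formulation of the standard lift/gamma/gain controls, a generic sketch rather than the software running at Technique.

```python
import numpy as np

def lift_gamma_gain(image, lift=0.0, gamma=1.0, gain=1.0):
    """One common form of the standard grading controls on a [0, 1] image:
    lift moves the shadows most, gain scales the highlights most,
    and gamma bends the midtones between them."""
    x = np.clip(image, 0.0, 1.0)
    x = x * gain + lift * (1.0 - x)    # lift's influence fades toward white
    return np.clip(x, 0.0, 1.0) ** (1.0 / gamma)

# e.g. deepen the shadows while also lifting the highlights, something a
# single printer-light change cannot do (values invented):
# graded = lift_gamma_gain(frame, lift=-0.03, gain=1.05)
```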
"This is just the beginning," Schwartzman declares. "I didn’t have any
problems working at 2K on this project, though I am sure that 4K is in
our future as scanners and recorders become faster and the pipelines
become more robust. I think this makes the idea of shooting in Super 35
format a more viable option. Now it comes down to a question of which
lenses, anamorphic or spherical, are right for your story."