Star Wars' John Knoll on Using the Force of Next-Gen High-Def

It’s no secret that director George Lucas is the most vocal – and most successful – advocate of fully digital feature production. Working for him is one of the most challenging jobs a visual effects supervisor or an engineer can take on. For the two most recent Star Wars films, that task has fallen to Industrial Light & Magic’s John Knoll, a digital guru whose resume includes a stint working on The Abyss at ILM at the same time he and his brother Tom were creating a digital image manipulation program for the masses known simply as Photoshop. While the air is “thin” up there at the Ranch, Knoll clearly has an eye out for the working man. He sat down this spring to talk about what anyone planning an HD VFX job needs to know before they start shooting.
The Star Wars prequels have been an adventure in color space from the
start. While Episode I was shot on film, Lucas snuck in a very short
scene shot digitally. The second of the series put the early 24p HDCAM
CineAlta cameras from Sony through their paces. With Revenge of the
Sith, the crew took a leap into a much richer color space with the new
generation of Sony RGB recording.
That digital-imaging experience served Knoll well as he navigated the
technological thicket surrounding post on Star Wars Episode III:
Revenge of the Sith. "On Episode II, we used the first-generation
[Sony] CineAlta cameras, which worked well, but we had to be careful of
an overexposure characteristic," says Knoll. He explains that because
the camera had a quick fall-off at the top of the exposure, shooting
brightly colored objects could result in color banding rather than a
smooth transition from color to white.
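A rough way to picture that fall-off (a hedged sketch with made-up values, not the CineAlta's actual transfer curve): when a camera clips hard at the top of its range, the channels of a bright saturated object saturate at different exposures, so the transition to white happens in steps of shifting hue rather than a smooth roll-off.

```python
import numpy as np

# Hedged sketch: a hard clip at the top of the range, standing in for the
# quick highlight fall-off Knoll describes. A bright saturated object's
# channels hit the ceiling at different exposures, so the transition to
# white happens in visible bands of shifting hue, not a smooth roll-off.
red_object = np.array([1.0, 0.35, 0.25])      # saturated red, linear RGB

for exposure in (1.0, 1.5, 2.0, 3.0, 4.0):
    clipped = np.clip(red_object * exposure, 0.0, 1.0)
    print(f"exposure x{exposure:.1f}: RGB {clipped.round(2)}")
# Red clips first, then green, then blue: the object bands through
# orange and yellow on its way to white.
```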
"David Tattersall, our DP, had worked with the cameras before we got
into principal photography and tailored his shooting style a bit," he
adds. "We got good images, but it was because we had a good DP shooting
them. When we went to III, almost every aspect of the HD experience
improved considerably."
That was particularly true in post, where getting high-quality digital
images out of the camera tells only half the story. For the
pixel-pushers on the visual effects crews, the format used in the tape
deck tells the
rest. Episode III was shot using the latest generation of HD equipment:
Sony HDC-F950 cameras and Sony SRW-1 and SRW-5000 VTRs running 4:4:4
RGB using the SQ recording rate of 440 Mb/sec (with additional hard
disk recorders built by ILM). Compared to the earlier 4:2:2 format, the
SR 4:4:4 format made a significant difference for the ILM crew.
"We could push images further to increase contrast and brighten up a
shot," says Knoll, who supervised 1700 of the 2500 shots for Episode
III
. "If George wanted to blow a shot up, we had better images to begin
with." But, especially important to ILM, the move from 4:2:2 YUV to
4:4:4 RGB also translated directly into higher-quality blue-screen
extractions with less effort.
Green Screen Blues
"When so much of the movie is shot against blue screen or green screen,
we rely on color-difference matting techniques," says Knoll. That means
the more color information, the better.
With the earlier equipment, RGB color from the camera was converted
into 4:2:2 YUV format when it was recorded. This format effectively
slices the color bandwidth in half because one color value represents
more than one pixel. The result is fewer chroma (color) samples than
luma (luminance). This chroma sub-sampling, combined with spatial
sub-sampling, effectively reduced HD’s 1920-pixel horizontal resolution
to 1440 samples for luma and 960 for chroma, according to ILM HD
Supervisor Fred Meyers.
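In code, that halving looks something like the following minimal sketch: generic 4:2:2 on a single 1920-sample scan line, ignoring the additional spatial downsample Meyers mentions.

```python
import numpy as np

# Minimal 4:2:2 sketch: one luma sample per pixel, one Cb/Cr pair per
# two pixels, on a single 1920-sample scan line.
width = 1920
y  = np.random.rand(width)      # full-resolution luma (one per pixel)
cb = np.random.rand(width)      # chroma as the sensor saw it (4:4:4)
cr = np.random.rand(width)

cb_422 = cb[::2]                # 4:2:2 keeps every other chroma sample
cr_422 = cr[::2]                # 960 samples each across the line

# On decode, the missing chroma must be rebuilt, so adjacent pixels
# end up sharing one color value.
cb_rebuilt = np.repeat(cb_422, 2)

print(y.size, cb_422.size, cb_rebuilt.size)   # 1920 960 1920
```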
"It’s based on science that says your eye isn’t as sensitive to color
transitions as to luminance," explains Meyers. "That’s valid, but it’s
not optimum for images recorded on tape that are further manipulated,
whether they’re used for compositing and visual effects, digital
intermediates and color-corrections, or for blowing an image up."
In blue-screen extractions, it’s the fine lines that matter. "Say an
actor with a light-colored flesh tone is in front of a blue screen,"
Knoll explains. "The flesh tone is mostly red and green with very
little blue in it. It has extremely high luminance and relatively low
saturation color. It’s immediately adjacent to a low-luminance
high-saturation color that’s on the far end of the color space. In
4:2:2, the luminance makes that transition in one pixel, but because
the chroma has been subsampled, the color needs two pixels. So trying
to get fine extractions for hair and thin, wispy objects without
getting a bit of a line was tricky. We got good results, but it was
more work than with a film scan."
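The core of a color-difference matte is simple enough to sketch. The version below is a bare-bones stand-in (the gain knob and the one-line key are illustrative, not ILM's production keyer), but it shows why a flesh-against-blue edge keys on the chroma difference:

```python
import numpy as np

def blue_screen_alpha(rgb, gain=1.25):
    """Bare-bones color-difference matte for a blue backing.

    The backing is low-luminance, high-saturation blue, so B - max(R, G)
    is strongly positive over the screen and near zero over flesh tones.
    Production keyers are far more elaborate; this is only the core idea.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    backing = np.clip(gain * (b - np.maximum(r, g)), 0.0, 1.0)
    return 1.0 - backing        # 1 = keep foreground, 0 = drop backing

# The one-pixel edge Knoll describes: flesh tone directly against blue.
# At full chroma resolution the key resolves it cleanly; with 4:2:2 the
# two pixels share one chroma value, smearing the matte across the edge.
edge = np.array([[0.9, 0.7, 0.6],     # flesh tone: mostly red and green
                 [0.1, 0.1, 0.9]])    # backing: low-luma, saturated blue
print(blue_screen_alpha(edge))        # -> [1.0, 0.0]
```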
The problem was exacerbated when the 4:2:2 YUV was converted back into
RGB. "When the color information which is at half resolution gets
reconstructed as RGB, you have to interpolate those values," says
Knoll. "There’s always a little round-off error." Furthermore, the
previous 4:2:2 recording formats used only 8 bits for color (and some
used 8 bits for luminance as well).
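That round-off is easy to demonstrate. The following hedged sketch uses standard Rec. 709 coefficients (the article doesn't specify ILM's exact conversion math) to push one RGB value through a quantize-and-reconstruct round trip:

```python
import numpy as np

# Sketch of the round-off Knoll describes, using standard Rec. 709
# luma coefficients. Chroma subsampling would add further error; this
# isolates the quantize-and-interpolate round trip at a single pixel.

def rgb_to_ycbcr(rgb):
    r, g, b = rgb
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return np.array([y, (b - y) / 1.8556, (r - y) / 1.5748])

def ycbcr_to_rgb(ycc):
    y, cb, cr = ycc
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return np.array([r, g, b])

offset = np.array([0.0, 0.5, 0.5])    # shift chroma into [0, 1] for storage

rgb = np.array([0.9, 0.7, 0.6])       # a light flesh tone
for bits in (8, 10):
    levels = 2 ** bits - 1
    stored = np.round((rgb_to_ycbcr(rgb) + offset) * levels) / levels
    error = np.abs(ycbcr_to_rgb(stored - offset) - rgb).max()
    print(f"{bits}-bit YCbCr round trip, worst channel error: {error:.5f}")
```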
With the new HDCAM SR 4:4:4 RGB, however, color information is kept for
each pixel, all 1920 pixels across the image. The color stays RGB all
the way. And, the format stores color using 10 bits per channel,
allowing 1024 shades per color, not 8-bit’s paltry 256. That provides
more dynamic range for shadows and highlights. It makes blue-screen
extractions easier. And it means bandwidth-saving gamma encoding can
now compete with log in the quality race.
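A quick sketch of what those extra bits buy when a shot gets pushed in post (illustrative numbers only, not Episode III footage):

```python
import numpy as np

# Quantize a dark, subtle gradient at 8 and 10 bits, then stretch it 4x
# in "post". The 8-bit version has only a quarter as many distinct steps
# to work with, which is where banding comes from.
ramp = np.linspace(0.0, 0.25, 1920)             # dark quarter of the range

for bits in (8, 10):
    levels = 2 ** bits - 1
    stored = np.round(ramp * levels) / levels   # what the tape kept
    pushed = np.clip(stored * 4.0, 0.0, 1.0)    # contrast/brightness push
    print(f"{bits}-bit source: {np.unique(pushed).size} levels after push")
```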
Gamma Raise
To be stored digitally, color must be encoded. CG uses linear
intensity, film uses log encoding, HD video uses gamma. "If someone
says they’re recording in video linear space, it’s a misuse of the
term," says Meyers. "What they mean is gamma."
Meyers explains that with CG, to make images convenient for use as
texture maps, color is stored using linear intensity. "It takes 16 bits
or more to represent what the eye might see in a scene – the brightness
off a car bumper, the darkness off a tree," he says. "Most people say
it takes more."
Thus, to represent information recorded on a film negative in less than
16 bits, studios use log encoding for film scans and to exchange
recorded files. 10-bit log, for example, is a widely used file
interchange format. "With log encoding, you can characterize a negative
from minimum to maximum density in a way that makes it possible to
match it throughout the film recording and printing process," says
Meyers. "But, with log encoding, a greater spread of bits is allocated
to shadows than to highlights. It’s film-centric, and it’s about
densities."
As might be expected, the earlier HD format with 8-bit gamma encoding
doesn’t always measure up to 10-bit log or 16-bit linear intensity. But
10-bit gamma does, according to Meyers. "Now that you can encode
material in gamma in 10 bits, you can record as much in the highlights
as in the shadows, which means you can manipulate either," he says.
Meyers believes that once people begin working with 10-bit gamma
encoding, they will see no reason to be limited to log encoding, which
is based on film recording.
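To put rough numbers on the three encodings, here is a hedged sketch using a plain power curve for gamma and the common Cineon-style 10-bit log convention as stand-ins; neither is claimed to be ILM's exact pipeline:

```python
import math

def encode_linear16(x):
    """Linear intensity, as CG texture pipelines store it: 16 bits."""
    return round(x * 65535)

def encode_gamma10(x, gamma=2.4):
    """Generic 10-bit gamma: a power curve, as HD video uses."""
    return round((x ** (1.0 / gamma)) * 1023)

def encode_log10(x):
    """Cineon-style 10-bit log: 90% white near code 685, 300 codes per
    decade of exposure. A common film-scan convention, used here as a
    stand-in for the log encoding Meyers describes."""
    return round(685 + 300 * math.log10(max(x, 1e-6)))

for x in (0.01, 0.18, 0.5, 1.0):   # deep shadow, mid-gray, highlight, white
    print(f"linear {x:4.2f} -> 16-bit linear {encode_linear16(x):5d}, "
          f"10-bit gamma {encode_gamma10(x):4d}, "
          f"10-bit log {encode_log10(x):4d}")

# With these curves, gamma splits its 1024 codes roughly evenly around
# 18% mid-gray (501 below, 522 above), while the log curve spends most
# of its range at or below 90% white (code 685), piling precision into
# the shadows, as Meyers notes.
```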
"Film is now only one of the output formats," says Meyers. "HD, whether
digital cinema, broadcast, DVD or other digital media, no longer
benefits from film-centric log encoding."
And the advantages extend beyond the blue screen: "You have more
bandwidth and latitude in the overall image," says Meyers. "People are
taking a lot of liberties these days in color-correction, manipulating
the contrast, the saturation, and even the colors. Having the
additional resolution and bandwidth is an advantage any time you need
latitude to adjust the look of the image."