SMPTE Conference's Hit List of Digital Cinema Technology

At this year's SMPTE Technical Conference and Exhibition in Hollywood, a technical session on “Image Acquisition,” chaired by Panasonic Broadcast product engineering manager Bill Hogan, looked at six very different topics related to digital camera technology. Hogan noted that digital image capture for big-budget feature films is doubling every year, and that the testing of digital cameras is a hot topic. “As digital capture of moving images has increased and matured,” he said, “interest has also increased in issues of quality and necessary specifications.” Here's a round-up of the technology covered in that afternoon session.

Pixels and Photosites: The Apples and Oranges of Camera Specifications?
Thomson Grass Valley Manager of Advanced Technology David Bancroft examined the difficulties of comparing artifact-free resolution between three-sensor and single-sensor cameras. “This presentation was inspired by a real-world case of a cinematographer wanting to do a project and having some challenges with descriptions of pixels in cameras, and translating them into the pixels he was contracted to deliver,” said Bancroft, who noted he was focusing on resolution. “We still have plenty of really good cameras with multi-sensor and splitter arrangements, but we also have single-sensor cameras with a lot of complexities. We still want a simple, common unit for comparison, but we find that the results we get don’t always meet our expectations. More Ks in the cameras don’t always equal the Ks delivered.”

Bancroft proposed a way to compare “true resolution” among widely different camera designs.  “The word pixel has an agreed-upon meaning in file and stream delivery formats,” he said. “But, with increasing variety in [camera] architecture, continued use of the word can cause misunderstandings. Using the word photosite in specifications would help end users.

“Complexity in today’s sensors means remembering the underlying principles of sampling, aliasing and filtering,” he continued. “You may have to use these to determine how many pixels you’re really getting from your camera. All camera architectures employ trade-offs between resolution, aliasing and sensitivity, and none is immune to this rule.”
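
To make that trade-off concrete, consider a rough back-of-the-envelope sketch in Python (the 0.7 Bayer factor below is a commonly cited rule of thumb for demosaiced luma resolution, an illustrative assumption rather than a figure from Bancroft’s presentation):

    # Rough sketch: photosites vs. delivered pixels for two architectures.
    def delivered_h_resolution(photosites_h: int, architecture: str) -> float:
        """Estimate usable horizontal resolution from the photosite count."""
        factors = {
            "three_sensor": 1.0,  # R, G and B each sampled at every photosite
            "bayer_single": 0.7,  # demosaic interpolation costs resolution
        }
        return photosites_h * factors[architecture]

    for arch in ("three_sensor", "bayer_single"):
        print(f"4096 photosites, {arch}: "
              f"~{delivered_h_resolution(4096, arch):.0f} delivered pixels")

On those assumptions, a “4K” Bayer sensor delivers roughly 2,900 pixels of true horizontal resolution, which is exactly the kind of gap between camera Ks and delivered Ks that Bancroft described.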

The Esmeralda Stage: An Analytical Test Laboratory
Jonathan Erland, founder of Composite Components Company, described the history of the Esmeralda Stage, a lab with a highly controlled, repeatable means to test the components of the cinematographic imaging process, including photochemical and digital camera systems, film stocks, lighting apparatus and filters.

The Esmeralda Stage, recounted Erland, is based on the Laboratory Aim Density frame created by the late John Pytlak of Eastman Kodak, and was further modeled on a multi-plane matte-painting stage. That provided the ability to image large-scale flat test targets such as Macbeth, DSC and ISO 12233 charts, as well as 3D targets, including color-difference traveling-matte backings and motion-controlled targets for imaging motion streak and blur. “When Eastman Kodak introduced 5294, a high-speed color negative, it was quite disastrous for blue-screen photography,” he said. “We had to test the problem and present it to Eastman Kodak to convince them to produce a film stock that could read the blue screen.”

As a result of the early work on the Esmeralda Stage, Kodak introduced 5295, a blue-screen-compatible stock. Kodak later introduced “T” grain with the 5296 stock, which was quickly adopted by the visual-effects market and cinematographers. But this stock created the problem of high-speed emulsion stress syndrome. At that time, Erland took over responsibility for the Esmeralda Stage, and he and his wife Kay founded Composite Components. With a project team led by Bill Taylor and including Jim Danforth, Ray Feeney, LeRoy DeMarsh, Phil Feiner and Bill Hogan, the Esmeralda Stage was hosted at various visual-effects facilities.

In 2004, Esmeralda moved to the Pickford Center, where it has been redesigned as a free-standing structure built from speed-rail pipe. Erland is slated to receive a Scientific and Technical Award of Commendation for “his leadership and efforts toward identifying and solving the problem of High-Speed Emulsion Stress Syndrome.”

Arri's Mscope: Anamorphic Capture in HD
Milan Krsljanin, business development manager at Arri, introduced Arri’s new Mscope anamorphic digital acquisition mode for the D-21 camera, which takes advantage of the real-time stereo ingest of the Quantel Pablo to reassemble two Mscope media streams. “Mscope is a new and unique feature of our Arriflex D-21, which combines for the first time the use of anamorphic lenses with the economy of HD acquisition,” he said.

“Cinematographers love to capture with anamorphic lenses, not just because of the ratio but how the anamorphic lenses deal with space, sharpness, backgrounds and flaring,” he added. “But up until now, that was a difficult task for the simple reason that the majority of the digital cameras have a 16×9 sensor. If you wish to shoot ‘scope, you have to shoot … and then crop it [to the desired aspect ratio]. In some cases, people do a kind of half-squeeze, but it doesn’t allow them to use the lenses they want. It’s a half-baked solution.”

The Arri D-21 uses a 4×3 sensor shaped like a frame of 35mm film. The question for digital cinematography has always been how to deal with an anamorphic image squeezed into that space without losing horizontal and/or vertical resolution. The best solution, Krsljanin averred, was to sub-sample all the even lines into one HD stream and all the odd lines into another. “When recombined in post, it gives a perfect anamorphic image,” he said. “The benefit is that each of these two streams gives a completely viable and recordable image. So you start with a 1920×1440 image from the 4×3 sensor and divide it into two streams, each carried as 1920×1080 HD with 720 active lines. Recombining them in post restores the full 1920×1440, and you end up with 3840×1440 after de-squeezing the image.”
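
The line splitting itself is simple to picture. Here is a minimal sketch in Python with NumPy, following the shapes Krsljanin quoted (the nearest-neighbor de-squeeze is a stand-in for the proper resampling a real pipeline would use):

    import numpy as np

    # Simulate a 1920x1440 frame from the D-21's 4x3 sensor
    # (rows x columns, single channel for simplicity).
    frame = np.random.rand(1440, 1920)

    # Split into two HD-sized streams: even rows in one, odd rows in the
    # other, each with 720 active lines inside a 1920x1080 HD container.
    even_stream = frame[0::2, :]  # 720 x 1920
    odd_stream = frame[1::2, :]   # 720 x 1920

    # Recombine in post by interleaving the rows back to 1440 lines.
    recombined = np.empty_like(frame)
    recombined[0::2, :] = even_stream
    recombined[1::2, :] = odd_stream
    assert np.array_equal(recombined, frame)  # the round trip is lossless

    # De-squeeze the 2x anamorphic image by doubling the width
    # (nearest-neighbor here; a real pipeline would resample properly).
    desqueezed = np.repeat(recombined, 2, axis=1)
    print(desqueezed.shape)  # (1440, 3840): the 3840x1440 result quoted above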

In post-production, he continued, the Quantel 3D stereoscopic system is able to combine the two streams. “Post-production is exactly the same as if it were captured on film,” he said. “It works at 24, 25, 30 fps. You record it live, straight from the camera, combine it in the DI and you have an anamorphic image. It’s not rocket science.” The benefit is that the process preserves 1,440 active lines of vertical resolution, versus the roughly 800 active lines of an equivalent 2.40:1 ‘scope image derived from ordinary 16×9 HD. “Plus, with Mscope, the production is shooting with anamorphic lenses, which gives very different aesthetics, particularly in terms of depth of field, the look of out-of-focus backgrounds and incidental light flares, while benefiting from HD’s cost effectiveness and flexibility.”
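
The line-count arithmetic behind that comparison is easy to verify:

    # Active lines in a 2.40:1 extraction from 16x9 HD vs. Mscope.
    hd_scope_lines = round(1920 / 2.40)  # ~800 lines cropped from a 1080 frame
    mscope_lines = 1440                  # full height of the 4x3 sensor retained
    print(hd_scope_lines, mscope_lines)  # 800 1440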

For more information:  www.arri.de/prod/cam/mscope/details.html

Compatibility of 48 and 24 Hz Content: A Problem and a Solution
Moving Image Technologies VP of Engineering David Richards addressed the 48 Hz frame rate that was included as a “supported option” in the digital cinema specifications. The 48 Hz frame rate would be a way of providing an “enhanced” theater experience, but the problem remains that footage acquired at 48 Hz cannot easily be converted to 24 Hz. “Demonstrations have shown strobing artifacts become objectionable because of the effective reduction in the shutter angle of the camera during capture,” said Richards, who pointed out that a 180-degree shutter at 48 fps is equivalent to a 90-degree shutter opening at 24 fps. The prospect that a production would have to produce unique 48 Hz and 24 Hz masters has proven to be an obstacle to anyone adopting 48 Hz production. “We need a 24 fps version of anything shot in 48 fps,” he said.

The solution is to widen the shutter of digital cameras to a full 360 degrees. “Digital cameras are now available that can capture images up to and including a 360-degree capture, or approximately 21 ms for a 48 Hz capture rate,” he said. “The 21 ms capture time represents exactly the same period as traditionally captured with a 24 fps camera with a 180-degree shutter, so when alternate frames of the 48 Hz content are extracted and played at 24 Hz, the captured images show the same motion blur as they would at 24 fps.”
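
Richards’ shutter arithmetic reduces to a single formula relating frame rate and shutter angle to exposure time. A quick sketch (the function is illustrative, not from his paper):

    def exposure_ms(fps: float, shutter_deg: float) -> float:
        """Per-frame exposure time: the fraction of the frame interval
        during which the shutter is open."""
        return (shutter_deg / 360.0) / fps * 1000.0

    print(exposure_ms(48, 180))  # 10.4 ms, same as a 90-degree shutter at 24 fps
    print(exposure_ms(24, 90))   # 10.4 ms, hence the strobing Richards described
    print(exposure_ms(48, 360))  # 20.8 ms, the "approximately 21 ms" figure
    print(exposure_ms(24, 180))  # 20.8 ms, so motion blur matches at 24 fps

    # Deriving the 24 fps master is then simple frame decimation:
    frames_48hz = list(range(96))    # two seconds' worth of 48 Hz frames
    frames_24hz = frames_48hz[::2]   # keep alternate frames for 24 Hz playback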

To demonstrate, he showed some test material shot by cinematographer Robert Primes, ASC, at Panavision with the support of Panavision’s Nolan Murdock.

Digital Dailies in the 21st Century
Technicolor Creative Bridge’s Brian Gaffney described the history of dailies and the consistency of the term’s definition over the last 100 years, and how the role of dailies is now changing dramatically. “The definition has changed because production is happening across the globe,” he said. “In the past, you would watch dailies in controlled environments, such as a lab. Now, you might be in a remote area with no secure place to watch dailies.” Gaffney pointed out that cost has also changed dailies, since many studios don’t want to pay for film prints. The number of people and departments viewing dailies has also dramatically expanded. He also discussed the new demand to look at raw images from such cameras as the Red and the Viper on set, with on-set color correction giving the filmmakers a leg up on the DI process.

Also new are interactive dailies, which are beginning to happen in the commercial production world. “You send them to clients who draw circles or add comments and send them back and forth,” said Gaffney, who also noted the move to Blu-ray dailies and, at the other end of the spectrum, desktop dailies.

Capturing Lunar Footage and Utilization of Image Data: The HDTV Camera System Onboard the Lunar Explorer Kaguya
The SMPTE audience was treated to an amazing display of HD video of the lunar surface, and of the Earth rising and setting, captured by an HDTV camera onboard the Japanese lunar explorer Kaguya, as presented by NHK’s Seiji Mitsuhashi. The HD camera experiment allowed researchers to capture images at steep angles for analysis of the walls of large craters. Also studied was the damage to the HDTV camera caused by space radiation, as well as the relation between the amount of damage and the location and position of the lunar explorer.

For images, visit www.jaxa.jp/video/index_e.html