Workflow and Display Solutions Multiply as SMPTE Discusses Standards

It would be exaggerating to say that 2009 was the year of 3D at NAB. Sure, there was plenty of technology on display, but a good portion of it was just prototypes, partly because the all-important standards for home delivery of 3D content are still under review. But there’s no denying that visual stereophiles had a lot to look at.
On the eve of the show, the SMPTE 3D Home Entertainment Task Force announced its recommendations for mastering in 3D. SMPTE favors a 1920×1080 pixel resolution running at 60 fps per eye. That last bit is important – there have been proposals for including stereo elements in a single 1920×1080 HD frame, using interlacing or stretching techniques to cram two pictures in at the expense of total resolution. A group within the SMPTE standards committees will take up the charge now, with work on defining a spec for the SMPTE 3D Home Master expected to begin in June and take less than a year to complete.

Once that spec is finished, it will be time to figure out how to transfer 3D content using technology like HDMI. “This will be the first step in what will become the long-awaited realization of good-quality 3D content viewing in the home,” said Warner Bros. Studios Senior VP of Technology Wendy Aylsworth, who also serves as SMPTE’s VP of engineering, in a prepared statement. (www.smpte.org)

That means 2010 might not quite be the year of 3D at NAB, either – but that will also depend on how James Cameron’s Avatar performs on its holiday release this year. That mega-budgeted potential blockbuster is widely considered a bellwether for the public’s response to high-value stereo content.

Jumping the Gun?

Not everyone is waiting for SMPTE to finish work before wrangling 3D playback in the home. A company called Next3D used last month’s Game Developers Conference as a platform for announcing its own software for enabling 3D movie playback on Xbox 360 and PlayStation 3 game consoles connected to 3D-ready TVs from Samsung and Mitsubishi with LCD shutter glasses from companies including Real D, NuVision, eDimensional and Samsung. Next3D is encoding stereo images using Multiview Video Coding, an amendment to the H.264 spec approved last July. “We didn’t invent the spec,” Next3D says in a statement posted at its Web site. “We’ve simply decided that the 3D format wars are over and H.264 won.” The company also plans to make a player available for Windows and Mac OS X, but it seems more likely that consumers would have game systems connected to a large, 3D-ready TV. As with any company that promises it will provide all things to all people, it’s probably best to take Next3D’s claims with a grain of salt. (www.next3d.com)

Panasonic’s HD Theater

Panasonic wowed NAB attendees with a special HD theater presentation that included a look at the trailer for Pixar’s Up in 3D as well as an even more astonishing glimpse of stereoscopic footage from the opening ceremonies of last year’s Beijing Olympics. Under glass, Panasonic showed a conceptual model of a duck-billed HD 3D camera with a double-lens unit as part of its under-development “3D Full HD production system” announced at the show. A 3D Drive System enables large-screen stereo viewing at full HD resolution. And, in February, Panasonic opened the Advanced Authoring Center at the Panasonic Hollywood Laboratory to work on developing 3D Blu-ray Disc titles with Hollywood studios. The company’s main message on 3D seemed to be that it requires two full-resolution 1920×1080 images, not a methodology that fits two images into one HD stream using image-squeezing (encoding both the left eye and right eye side-by-side in one HD frame) or interlacing (interleaving the left eye and right eye in alternating horizontal lines) techniques. (www.panasonic.com)
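To make the resolution trade-off concrete, here is a minimal Python sketch of the two frame-compatible packing schemes Panasonic was arguing against. This is an illustration of the principle only (grayscale pixel lists, hypothetical function names), not any vendor’s actual encoder; in both schemes, each eye surrenders half of its pixels to fit in a single HD frame:

```python
H, W = 1080, 1920

# Hypothetical full-resolution frames, one per eye (one luma value per pixel).
left  = [[100] * W for _ in range(H)]
right = [[200] * W for _ in range(H)]

def side_by_side(l, r):
    """Image-squeezing: decimate each eye to half width, pack both into one frame."""
    return [row_l[::2] + row_r[::2] for row_l, row_r in zip(l, r)]

def line_interleave(l, r):
    """Interlacing: alternate rows, even lines from the left eye, odd from the right."""
    return [l[y] if y % 2 == 0 else r[y] for y in range(H)]

sbs = side_by_side(left, right)    # each eye keeps 960 of its 1920 columns
ilv = line_interleave(left, right) # each eye keeps 540 of its 1080 rows
```

Either output is a legal 1920×1080 HD frame that legacy gear can carry – which is exactly the appeal, and exactly why Panasonic (and the SMPTE mastering recommendation) insist on two full-resolution images instead.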

JVC Does 3D in Real Time

JVC was showing the 46-inch GD-463D10 LCD HD monitor, which supports both side-by-side and line-by-line 3D schemes and sells for around $7,000. There was also a tech demonstration of a system for converting 2D images to 3D in real time. As you’d expect, there were many glitches, but it was still impressive. As Samuel Johnson once explained in a somewhat different context, “It is not done well, but you are surprised to find it done at all.” JVC’s Dave Walton says the technology is currently being licensed to Sensio. JVC and Sensio are reportedly co-developing a high-end projection system for consumer applications. (www.jvc.com)

Avid Puts 3D in the Cutting Room

On the workflow side, Avid added 3D capabilities to Media Composer, describing 3D editing as similar to the old “workprint” paradigm. You edit in 2D space until you have something you feel is worth taking a good look at, and then you preview it in 3D in the cutting room. Avid secured a money quote from Aaron Brock, assistant editor on Jonas Brothers: The 3D Concert Experience: “If we could use this new feature, it would save the studio an unbelievable amount of money,” he said in a prepared statement. “Instead of having the online system running for five months straight, we could just conform complete reels for full screenings like on a normal feature. It would be much more efficient than doing conforms on a daily basis.”

Avid’s Michael Phillips estimated that some 40-100 3D projects are in production right now. Phillips speculated that 3D may find a position in the marketplace that’s sort of like the 2.40 widescreen aspect ratio – not exactly a dominant format, but a viable and valuable creative choice for filmmakers who see it making the most sense for their own stories. (www.avid.com)

More Manipulation

CineForm was touting an impressive technology that the company’s co-founder and CEO David Taylor called “active metadata” – basically an engine that allows the equivalent of Photoshop adjustment layers in the context of video. The metadata is sent through the workflow process along with the video footage. By making savvy use of that metadata, CineForm has enabled powerful adjustment tools for 3D images in its First Light 3D software. Both eyes can be color-corrected at the same time, or individual eyes can be tweaked to ensure a proper match.
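The adjustment-layer analogy can be sketched in a few lines of Python. The class and function names here are hypothetical and stand in only for the general idea – nondestructive adjustments carried as ordered metadata and applied at render time – not for CineForm’s actual format:

```python
def apply_gain(pixels, gain):
    """A sample adjustment: scale every pixel value, clamped to 8 bits."""
    return [[min(255, int(p * gain)) for p in row] for row in pixels]

class Clip:
    """Footage plus an ordered list of nondestructive adjustments."""
    def __init__(self, pixels):
        self.pixels = pixels    # the untouched source footage
        self.metadata = []      # adjustment "layers" travel with the clip

    def add_adjustment(self, fn, **params):
        self.metadata.append((fn, params))  # source pixels never change

    def render(self):
        """Apply the adjustment stack at display time, like Photoshop layers."""
        out = self.pixels
        for fn, params in self.metadata:
            out = fn(out, **params)
        return out
```

Because the source pixels are never rewritten, an adjustment can be changed or removed at any point in the workflow without a regrade – the property that makes a metadata-driven 3D conform cheap.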

Most interestingly, users can enter “onion skin” mode, where they see both eyes at a 50 percent opacity, allowing the relative positioning of the two images to be easily manipulated – which has the effect of tweaking convergence horizontally, vertically, or rotationally in the 3D image. The booth demo seemed to prove that those adjustments can be made quickly and painlessly in the context of a CineForm-aware workflow, which could significantly reduce the costs of a 3D conform.
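In principle, an onion-skin preview is just a 50 percent blend of the two eyes, with one eye repositioned until the ghosting lines up. A simplified Python illustration (grayscale pixel lists, hypothetical function names; horizontal shift only, though the article notes vertical and rotational adjustment as well):

```python
def onion_skin(left, right):
    """Blend both eyes at 50 percent opacity so misalignment shows as ghosting."""
    return [[(lp + rp) // 2 for lp, rp in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]

def shift_horizontal(img, dx):
    """Shift one eye by dx pixels to adjust convergence.

    Positive dx moves the image right; vacated pixels are padded with 0.
    """
    w = len(img[0])
    if dx >= 0:
        return [[0] * dx + row[:w - dx] for row in img]
    return [row[-dx:] + [0] * (-dx) for row in img]

# Preview the left eye against a right eye nudged 4 pixels to the right.
preview = onion_skin([[10, 20, 30]], shift_horizontal([[30, 40, 50]], 1))
```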

Taylor demonstrated a Final Cut Pro workflow where the video file contains the left eye at full HD quality. But the right eye is stored as a second, synchronized HD track in the active metadata associated with that file. That means software that’s not aware of CineForm’s workflow will just see and/or show the left eye, without choking on the extra information. But Final Cut Pro – and other DirectShow and QuickTime apps from Adobe, Autodesk, Sony and others – will interpret the stereo file properly. The software supports playback in a variety of 3D modes, including side-by-side, interlaced, over-under, and old-fashioned anaglyphic. (Yes, that means red-and-blue glasses. That’s not the best way to check your work, but it will give you a quick and dirty sense of how much and what kind of depth is in your image.) CineForm recommends using an AJA Kona card for real-time playback and monitoring with the Neo3D Final Cut Pro workflow. (www.cineform.com)
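The anaglyphic check mode mentioned above is simple enough to sketch: take the red channel from the left eye and the green and blue channels from the right, so red/cyan glasses route each eye its own image. A minimal illustration using RGB tuples (the function name is an assumption, not CineForm’s API):

```python
def anaglyph(left, right):
    """Red/cyan anaglyph: red from the left eye, green and blue from the right."""
    return [[(lp[0], rp[1], rp[2]) for lp, rp in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]

# One-pixel frames: a red-heavy left eye and a cyan-heavy right eye.
frame = anaglyph([[(255, 0, 0)]], [[(0, 128, 64)]])
```

As the article notes, this is a quick-and-dirty depth check rather than a color-accurate way to review stereo work.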