Put Your Footage on the Road, in the Cloud, and on Your iPad
The 2011 NAB show floor was more eclectic than ever, hosting demos of single-body stereo 3D cameras and bleeding-edge DI tools alongside sexy glimpses of new Thunderbolt storage devices, iPad dailies and color-correction, and, well, that makeshift tattoo booth outside the big Red tent. But if there was any overarching theme this year, it might be mobility. From processing technology that makes it possible to do more inside the GPU to new digital interfaces that make it easy to access massive storage and bandwidth from a svelte notebook computer, NAB was full of solutions that greatly expanded the convenience and flexibility of production and post workflows.
Outfitting the Digital Lab
Certainly the pressure is on post houses to become more agile, with productions demanding access to a “digital lab” that can be packed up and deployed on location to enable super-fast dailies turnaround and let editors get a head start on assembling a cut while cinematographers set their initial looks. Remote Control Dailies, a new system from MTI Films that supports essentially all camera formats as well as DNxHD, ProRes, and H.264 deliverables, is a model of this approach. Unpacked, it fits in a corner of a hotel room, bringing 24 TB of storage and LTO-5 archiving capabilities on the road. MTI’s recently promoted VP David McClure told us more.
Light Iron Digital, which helped engineer the post workflow for The Social Network last year, was getting a lot of attention for its LiVE PLAY system, which streams dailies from the free LiVE PLAY server to an iPad for the price of a $20 app. The app includes an upsell for a collaborative version, allowing users to share metadata as they review dailies, and an in-development version of the technology would allow dailies to be securely downloaded to an iPad that could then be used, for example, to drive projection in a screening room. Like Image Control, a new color-correction app for iOS from Gamma & Density and Synthetic Aperture, LiVE PLAY is awaiting approval before it shows up in the iOS App Store.
Red’s Ted Schilowitz demoed Epic’s low-light performance for an audience at the Adobe booth.
Living in Cloud City
A lot of the action at NAB this year was in the clouds – by leveraging cloud-based storage and computing, we were told, we could edit, archive, transcode, and deliver just about any kind of digital content we could imagine. Microsoft was at the show, wrangling technology partners to help leverage its Windows Azure cloud application platform – they include Digital Rapids, Harris Broadcast Communications, Polycom Video Content Management, and Signiant, to name a few.
These cowboys were among the camera models in the Grass Valley booth.
Other companies had their own cloud-based initiatives. One of the most promising is Quantel’s QTube system, which boasts frame-accurate editing and collaboration over IP. Users can browse cloud-stored content through Silverlight-enabled browsers and manipulate it through Quantel’s standard interface for broadcast editing. Avid showed its new Interplay Central, including a mobile app for BlackBerrys that lets reporters and producers work up stories on the go. And Grass Valley’s STRATUS collaborative-workflow platform has a web-access component enabling remote capabilities that are at least partly cloudy.
Thunderbolt and Enlightening
Mobile workstations became a little more powerful when Apple made the unilateral move to build the new Thunderbolt interface into its latest refresh of the MacBook Pro line. (In terms of speed and efficiency, think of it as a PCI-Express jack on the side of your laptop.) Thunderbolt peripherals aren’t shipping yet, but here’s a rundown of what was on display.
Cameras in Depth
Even stereo 3D is becoming manageable in a more mobile form factor. JVC showed the cutest all-in-one stereo camcorder yet, the $2500 GY-HMZ1, and Sony countered with the $3400 NXCAM HXR-NX3D1U. Here’s a quick look at new stereo camcorders from JVC, Sony, and Panasonic.
Panasonic said its AK-HC1800 box camera, which shoots at 24p, is being used on stereo rigs.
Stereo 3D Grows Up
It doesn’t exactly fit with the mobility theme, but stereo 3D technology in post may be evolving even more rapidly than it is in acquisition. I had back-to-back meetings with SGO and Autodesk that were fairly mind-blowing. SGO’s Mistika DI/grading system has a hugely impressive stereo feature set that includes the ability to automatically correct images for vertical disparity and color balance, and even to retroactively change the interaxial distance. (It does this by using complicated algorithms that figure out how to manipulate different areas of the picture according to which part of the depth field they occupy.) It even offers options for manipulating depth in an image, allocating more Z-space to the most important area of a scene while compressing the depth of elements in the foreground and background.
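Mistika's actual algorithms are proprietary, but the underlying idea of retroactive interaxial adjustment can be sketched: scale each pixel's horizontal disparity and warp one eye's image to match. The function below is a toy illustration of that principle, not SGO's implementation; the names and the naive forward warp are my own.

```python
import numpy as np

def rescale_interaxial(right_eye, disparity, scale):
    """Toy interaxial adjustment: shift each pixel of the right-eye
    image horizontally by a scaled fraction of its measured disparity.

    right_eye : (H, W) grayscale image
    disparity : (H, W) per-pixel horizontal disparity, in pixels
    scale     : 1.0 leaves the stereo pair unchanged; <1 narrows the
                apparent interaxial distance, >1 widens it.
    """
    h, w = right_eye.shape
    out = np.zeros_like(right_eye)
    xs = np.arange(w)
    for y in range(h):
        # Move each pixel by the *extra* disparity introduced by scaling.
        new_x = np.clip(
            np.round(xs + (scale - 1.0) * disparity[y]).astype(int), 0, w - 1
        )
        out[y, new_x] = right_eye[y]  # naive forward warp; real tools fill holes
    return out
```

A production system would also inpaint the occlusion holes this warp leaves behind, which is where most of the engineering effort goes.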
Autodesk’s impressively blue NAB booth.
Meanwhile, Autodesk was talking up two different techniques that involve the use of 3D compositing in the finishing process: relighting and redimensionalization. Everyone knows about “relighting” in the DI, which refers generally to the process of using 2D masks and shapes to define areas where brightness, contrast, etc., can be manipulated to create the impression of more or less light. Autodesk is suggesting instead that interactive lighting effects be created inside a 3D composite (using a tool like the recently launched Flame Premium 2012, which combines real-time grading, editing, and finishing alongside VFX features), creating more realistic stylized looks than you can get working with flat images. And if you’re working on a stereo project, you can actually embed those lighting effects in the 3D scene.
Redimensionalization takes the idea one step further, involving the mapping of 2D images onto real 3D geometry, creating a full-fledged 3D expansion of a 2D scene. Adjust your lighting accordingly in that 3D space, render back out to a 2D image, and you find yourself with the godlike ability to dramatically reconfigure a live-action lighting situation long after the scene has been shot, but before the DI is complete. (Don’t tell the cinematographer.) This process is, naturally, quite demanding in terms of person-hours. And it points up the dilemma facing Hollywood in the age of stereo 3D.
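The payoff of having real geometry is that lighting becomes a computation rather than a paint job. As a loose illustration of the core step, not Autodesk's pipeline, here is a crude relight from a per-pixel depth map: derive surface normals from depth gradients, then re-shade with Lambert's cosine law under a new light direction. All names here are hypothetical.

```python
import numpy as np

def relight_from_depth(albedo, depth, light_dir):
    """Crude relighting sketch: recover surface normals from a depth
    map, then re-shade the image with a Lambertian n-dot-l term.

    albedo    : (H, W) base reflectance of the scene
    depth     : (H, W) per-pixel depth (larger = farther)
    light_dir : 3-vector pointing toward the new light
    """
    # Approximate the surface slope with finite differences.
    dz_dy, dz_dx = np.gradient(depth)
    # Normal of the surface z = depth(x, y) is (-dz/dx, -dz/dy, 1), normalized.
    normals = np.dstack([-dz_dx, -dz_dy, np.ones_like(depth)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    light = np.asarray(light_dir, dtype=float)
    light /= np.linalg.norm(light)
    # Lambert's law: brightness proportional to the cosine of the
    # angle between surface normal and light direction, clipped at 0.
    shading = np.clip(normals @ light, 0.0, 1.0)
    return albedo * shading
```

Swapping `light_dir` here is the "godlike" move the article describes: the scene's illumination changes without a reshoot, because the light interacts with recovered geometry instead of flat pixels.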
The tools exist to create great 3D, and even great 2D-to-3D conversions. The will exists among artists, who are continually striving to create something new and beautiful. But what about the money? Well, that’s how we ended up with largely derided 3D conversion jobs like last year’s Clash of the Titans. Judging from the technology on display at NAB this year, when it comes to getting 3D right, the budget might be the only thing that doesn’t exist yet.