How Metadata Is Key for OTT Video, When Disk Will Start Replacing Tape Archives, and Why We (Still) Can't Screen Silent Films Properly

SMPTE is always looking forward and backward: forward to new technology, and backward to restoring and archiving existing, ever-more-valuable assets. SMPTE 2015 was no exception, and its Wild West was over-the-top TV delivery, or OTT.

“Metadata is the goal of OTT,” said Siemens Convergence Creators’ Steve Wong, who chaired the sessions on “OTT: New Frontier or Wild West.” In these sessions, several industry players offered case studies of how they are navigating the new waters.

Integrating Metadata with OTT Video
Prime Focus Technologies VP Amer Saleem described his company’s primary product, Clear, a cloud MAM that integrates metadata with live video, surrounded by tools for broadcast operations and distribution.

First, he reeled off some of the numbers that make OTT — especially for sports — so compelling. U.S. content consumption is 23 percent traditional TV, he reported, with a whopping 54 percent online. “The only time there’s competition [from] traditional TV is in the early morning and prime time,” he said. “And the number of video minutes consumed on tablets and mobile devices has increased 954 percent between 2011 and 2014.”

Live sporting events are the logical focus for OTT services. “Consumers really want live sports, well beyond the TV set,” Saleem said. “Sports consumption is 75 percent of all live viewing on the Internet, and 63 percent of U.S. adults use the Internet to consume sports.” The trick is getting sports to OTT platforms in as little time as possible. The current workflow to find an important moment, transcode it and publish it to OTT platforms takes an average of 55 minutes.

PFT’s Clear offers a time savings of 90 percent, claimed Saleem, cutting publish time to four minutes. PFT does that by finding the important footage more quickly and automating simultaneous delivery to all OTT platforms. The platform doesn’t yet do ad insertion, but PFT is in the process of adding that option, said Saleem, noting that all Clear installations require some level of customization. “Interacting with metadata is the most important step, and that takes work,” he said.

Comcast USA’s Yassar Syed talked about decoupling content generation from delivery techniques to facilitate distribution of programming to multiple destinations, including OTT. “It involves dealing with scalability and reliability of services,” he said. “We’re looking at adaptive-bit-rate streaming technology, which we’re using in our IP transformations, and we’re seeing the benefits from that.” USC engineering graduate student Arnav Mendiratta, who said he consumes all of his media over the Internet, spoke about using big-data analysis to help monetize OTT and described a project he’s working on to do this via customer-targeted advertisements. He also noted the challenges ahead, including a lack of standards for delivery platforms, the continuing adoption of ad-blocking software, and limited bandwidth.
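Syed didn’t go into implementation detail, but the heart of the adaptive-bit-rate approach he mentioned is simple: the player continuously picks the highest-quality rendition that the measured network throughput can sustain. Here is a minimal sketch of that selection logic, assuming a hypothetical bitrate ladder and safety margin rather than anything Comcast actually deploys:

```python
# Minimal sketch of throughput-based adaptive-bit-rate (ABR) rendition selection.
# The bitrate ladder and safety margin are hypothetical values, not Comcast's.

RENDITIONS_KBPS = [400, 1200, 2500, 5000, 8000]  # one entry per encoded rendition

def pick_rendition(measured_throughput_kbps: float, safety_margin: float = 0.8) -> int:
    """Return the highest rendition bitrate that fits within a fraction of the
    recently measured throughput; fall back to the lowest rendition otherwise."""
    budget = measured_throughput_kbps * safety_margin
    viable = [r for r in RENDITIONS_KBPS if r <= budget]
    return max(viable) if viable else min(RENDITIONS_KBPS)

# A player measuring ~3.5 Mbps of throughput would request the 2500 kbps rendition.
print(pick_rendition(3500))   # -> 2500
print(pick_rendition(300))    # -> 400 (below the ladder, so take the lowest)
```

The safety margin is what keeps the player from oscillating between renditions every time throughput dips; real players layer buffer-occupancy rules on top of this.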

Case Studies in Archiving
The proliferation of digital formats and platforms is also stressing archivists. SMPTE 2015 highlighted two case studies: the Library of Congress National Audio-Visual Conservation Center (NAVCC) and the Montreux Jazz Festival in Switzerland. The Library of Congress is the world’s largest research institution, said NAVCC senior systems administrator James Snyder, handling more than 15 million web searches a year, and its archive is the world’s largest, with more than 160 million items. It also houses one of the world’s largest media collections: approximately 7 million items, some of them 120 years old, ranging from Edison wax cylinders to the latest digital files.

To accommodate constantly changing formats, NAVCC migrates everything every three to seven years. “Copying physical media to physical media was always the norm for preservation,” said Snyder. “There are still physical objects on shelves, but they’re decaying and sometimes quickly. Even motion-picture film doesn’t last forever.” Metadata, he said, has traditionally been yellow sheets of paper stuffed into tape boxes or film cans.

NAVCC has been archiving material as digital files for some time, but he noted that “we’re preserving the past as files, so we need to think about what goes into preserving past file formats that are not the norm today.” There’s no easy answer, he said, for handling old file-based formats. “It’s a case-by-case answer,” he said. “Some items won’t survive.” Just as NAVCC keeps half a dozen playback machines for every videotape format that has existed, it has also kept old computers that are often the only way to play back older material. His advice? “Plan for change and obsolescence,” he said. “Get comfortable with migrating digital assets every three to seven years. Data tape has the best cost-benefit for most medium to large collections.”

Noting that only an estimated eight percent of films made prior to 1929 have survived, he imparted his last lesson learned. “If there’s one thing we’ve learned from digital storage, it’s that it is depressingly ephemeral,” he said. “Digital recordings don’t last anywhere near as long as the old analog ones. So play back and capture as uncompressed as possible.”

Alain Dufaux, project manager at Swiss technical university EPFL’s MetaMedia Center, and Walter Hinton, HGST director of field marketing, presented their work creating a digital archive from 10,000 tapes from every Montreux Jazz Festival since its 1967 founding. “Our friends at the Jazz Festival went from VHS tapes in 1970 to two-inch tapes and then U-matic,” said Hinton. “They have every tape format known to man. How do you read it all back?”

The question, he said, is whether it’s possible to replace tape with magnetic disk arrays as a more media-friendly format for recording and playing back rich-media content. “Disk has come to a price point where the benefit of its performance and accessibility are greater than its slightly higher price,” said Hinton, who expects the industry to migrate more toward disk-based storage over the next several years. “It’ll grow to 10 percent of all use cases by 2020,” he predicted.

There are, however, challenges to disk technology. “As drives get bigger, the propensity for drive failure gets bigger,” said Hinton. “And disk densities force you to write in a smaller way. You face the challenges of physics.”

But, he added, there are methods to overcome those problems. According to Dufaux, the new digital reference archive consists of uncompressed audio and video formats, a 2.5 PB hard drive system to store and stream the archives, a central database, and two copies on LTO tapes. For the 2014 Montreux Jazz Festival, they also debuted a live capture of uncompressed HD from three festival halls, which was sent to three EPFL workstations and transferred via 10 Gig Ethernet to EPFL archive storage.
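Those figures invite a quick sanity check. Assuming the streams ran at roughly HD-SDI rates (1.485 Gbit/s per hall is my assumption, not a figure from the presentation), three simultaneous halls fit comfortably on a single 10 Gigabit Ethernet link, while the 2.5 PB disk system would take on the order of a thousand hours of such capture to fill:

```python
# Back-of-envelope check (assumptions, not EPFL's published figures): can one
# 10 GigE link carry uncompressed HD from three halls at once, and how fast
# does that fill a 2.5 PB disk system? Assumes 1.485 Gbit/s (HD-SDI) per stream.

HD_SDI_GBPS = 1.485        # assumed uncompressed HD payload rate, Gbit/s
HALLS = 3
TEN_GIG_E_GBPS = 10.0
ARCHIVE_PB = 2.5

aggregate_gbps = HD_SDI_GBPS * HALLS                # ~4.5 Gbit/s for three halls
fits_on_one_link = aggregate_gbps < TEN_GIG_E_GBPS  # True: one 10 GigE link suffices

tb_per_hour = aggregate_gbps * 1e9 / 8 * 3600 / 1e12        # ~2 TB of video per hour
hours_in_archive = ARCHIVE_PB * 1e15 / (aggregate_gbps * 1e9 / 8) / 3600

print(f"aggregate rate: {aggregate_gbps:.2f} Gbit/s (fits on 10 GigE: {fits_on_one_link})")
print(f"~{tb_per_hour:.1f} TB per hour across three halls, "
      f"or roughly {hours_in_archive:,.0f} hours before 2.5 PB is full")
```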

But When Will Technology Catch Up to Silent Films?
For those with an interest in film history, Jonathan Erland’s presentation on how to digitally project silent-era films was a real treat. Erland’s Pickfair Institute for Cinematic Studies is working with FotoKem, the Academy of Motion Picture Arts and Sciences, Texas Instruments, and Qube, among others. “Using appropriate projectors, we’ve always been able to show silent films at their original frame rate,” said Erland. “But that won’t be true forever. Our goal is to protect these films digitally.”

He convincingly showed that, in the silent era, projection itself was an art. Directors would leave instructions for the projectionist on when to speed up or slow down, but the projectionist was ultimately in charge of the frame rate, which ranged all over the map and was most commonly 16, 18, or 20 fps.

The introduction of sound on film standardized 24 fps, and the conversion of those slower, variable frame rates to 24 fps that projection now requires introduces artifacts. Erland noted that the current interest in high frame rates, sparked by directors Peter Jackson and James Cameron, among others, has made this the perfect time to address a way to project lower, and variable, frame rates.
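Erland’s point about artifacts is easy to see with a little arithmetic: pulling 16, 18, or 20 fps up to 24 fps forces an uneven repeat cadence, so some frames are held on screen longer than others and motion judders. Here is a small illustrative sketch of the simplest repeat-frame conversion (it assumes nothing about Erland’s actual method):

```python
# Illustration of the uneven repeat cadence created when slower silent-era
# frame rates are converted to 24 fps by repeating frames.

def repeat_cadence(source_fps: int, target_fps: int = 24):
    """For one second of source frames, how many times each frame is shown at target_fps."""
    counts, emitted = [], 0
    for i in range(source_fps):
        due = (i + 1) * target_fps // source_fps   # output frames owed so far
        counts.append(due - emitted)               # repeats for source frame i
        emitted = due
    return counts

for fps in (16, 18, 20):
    print(f"{fps} fps -> 24 fps cadence: {repeat_cadence(fps)}")
# 16 fps: 1,2,1,2,...      every other frame is held twice as long
# 18 fps: 1,1,2,1,1,2,...  every third frame is doubled
# 20 fps: 1,1,1,1,2,...    every fifth frame is doubled
```

The doubled frames arrive at a regular but perceptible rhythm, which is exactly the kind of cadence judder that a projection system capable of native low and variable frame rates would avoid.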