The HPA Tech Retreat this year was bigger than ever and just as fascinating, as it touched on topics as disparate as high dynamic range, high frame rates, ACES, artificial intelligence and computer vision. Here’s a high-level summary.

High Dynamic Range (HDR)

Panelists described the ongoing integration of HDR into production and post pipelines, as well as into consumer devices. Dolby VP of Technology Pat Griffis clarified the basic definition of HDR, which, he said, is more accurately characterized as color volume. “HDR has taken the lid off Pandora’s box and made many more colors available, not just contrast but brightness,” he noted, defining color volume as “the 3D palette of all the colors that can be reproduced at all allowable intensities.”
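Griffis’s definition lends itself to a simple mental model: a color is inside a display’s color volume if its chromaticity falls within the panel’s gamut and its luminance within the panel’s range. Below is a minimal Python sketch of that test; the primaries and nit limits are illustrative, and a real display’s volume is not a simple prism (the usable gamut narrows near peak brightness).

```python
# Minimal sketch of "color volume" membership: a color is inside a
# display's volume if its CIE (x, y) chromaticity falls within the
# gamut triangle and its luminance Y within the panel's range.
# Primaries and nit limits are illustrative only.

def _orient(p, a, b):
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_gamut(p, red, green, blue):
    """Point-in-triangle test on the chromaticity plane."""
    d1 = _orient(p, red, green)
    d2 = _orient(p, green, blue)
    d3 = _orient(p, blue, red)
    return not (min(d1, d2, d3) < 0 and max(d1, d2, d3) > 0)

def in_color_volume(x, y, nits, primaries, min_nits, max_nits):
    return in_gamut((x, y), *primaries) and min_nits <= nits <= max_nits

# Rec. 2020 primaries on a hypothetical 0.005-to-1,000-nit HDR panel:
REC2020 = ((0.708, 0.292), (0.170, 0.797), (0.131, 0.046))
print(in_color_volume(0.3127, 0.3290, 200, REC2020, 0.005, 1000))  # D65 white at 200 nits -> True
```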

Sony Pictures Entertainment CTO Don Eklund said adoption has been slow in part because “we haven’t yet gotten [the consumer] to understand that it’s a new TV system.” The HDR pipeline is also not entirely complete. BBC Worldwide Production Standard Liaison Andy Quested reported that HDR reference displays and closed captions remain unresolved issues, and Samsung Research America VP of Industry Relations Bill Mandel noted that calibration of HDR screens is another challenge.

Pixelogic SVP and HPA President Seth Hallen asked panelists to predict the future for HDR; Eklund pointed out that the shortage of HDR content stems partly from the fact that not all filmmakers make the creative choice to use HDR. “But live content is a no-brainer,” he said. “You really missed something if you didn’t see the Olympics in HDR.”

ACES Update

It’s no longer just feature films and episodic TV using the Academy’s ACES color-management and image-interchange system, but also games and corporate clients like Ikea, reported Annie Chang, ACES project chair and VP, creative technologies, Universal Pictures. In the last year, the ACES group also established a library remastering pipeline by digitally remastering a short film, “The Troop.” The team spearheading ACESnext, the system’s next incarnation, is currently on a listening tour with various working groups, a process that will be completed by the end of March 2018.

ACES images are now being combined with the IMF international standard for file-based interchange of multiversion finished audio-visual works. “With increasing adoption of ACES master files, IMF provides a framework for the file-based exchange and archiving of content as compositions,” said Dr. Wolfgang Ruppel, a professor at RheinMain University of Applied Sciences. “It solves the problem of a data structure for delivery and archiving of final ACES master file sets, along with audio sound fields and timed text.”
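As a rough illustration of Ruppel’s “content as compositions,” the sketch below models an IMF-style composition as a playlist that references immutable track files rather than bundling the media itself. The field names are invented for illustration; the real data structure is defined by the SMPTE ST 2067 family of standards.

```python
# Illustrative model of IMF's "composition" idea: a playlist that
# references immutable track files (ACES image sequences, audio,
# timed text) instead of baking the media into one monolithic file.
# Field names are invented; see SMPTE ST 2067 for the real schema.
from dataclasses import dataclass, field

@dataclass
class TrackFile:
    uuid: str          # stable ID, so new versions can re-reference it
    kind: str          # "aces_image", "audio", "timed_text"
    path: str          # e.g. "troop_picture.mxf"

@dataclass
class Composition:
    title: str
    version: str
    tracks: list = field(default_factory=list)

# A new cut or language version is just a new playlist; unchanged
# track files (the bulk of the data) are shared, not duplicated.
master = Composition("The Troop", "original", [
    TrackFile("urn:uuid:0001", "aces_image", "troop_picture.mxf"),
    TrackFile("urn:uuid:0002", "audio", "troop_5_1_en.mxf"),
    TrackFile("urn:uuid:0003", "timed_text", "troop_subs_en.mxf"),
])
```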

Metadata Guidelines for Home TV Displays

Although ACES maintains image integrity from the creative process through the final master, once the content is aired (or streamed) to the home TV, it all falls apart. International Cinematographer Guild (ICG) Advanced Production Technology Specialist Michael Chambliss reported that only 2 to 10% of consumers actively manage their TV settings. ICG President Steven Poster, ASC, said that 20 years ago he and Rob Hummel, now president of Group 47, described a solution whereby “the material being broadcast could control the TV with what was beginning to be called metadata.” Now, Chambliss, Poster, Hummel, Wendy Aylsworth and others propose a working group under SMPTE 2094 “to explore metadata guidelines for downstream image presentation management.”

“By using metadata to maintain information, we’ll preserve for all time how it was intended to be seen,” said Hummel. The TV set might not be able to adjust for ambient light or the wall color in the viewer’s home, but at least the metadata would adjust the TV set itself to respect the material’s creative intent. “We have the first real step in that direction with ACES,” said Poster. “We have to complete the circle and at least indicate the intent of the artist.”
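SMPTE ST 2094 defines dynamic, per-scene metadata for color volume transforms. As a conceptual sketch of the idea Hummel describes, the content carries its mastering levels and the set adapts them to its own capabilities rather than guessing; the knee roll-off below is a placeholder, not any standardized transform.

```python
# Conceptual sketch of metadata-guided display adaptation: content
# carries per-scene mastering levels (the role of SMPTE ST 2094
# dynamic metadata) and the TV maps them to its own peak luminance
# instead of applying a one-size-fits-all picture mode. The knee
# roll-off below is a placeholder, not a standardized curve.

def tone_map_nits(pixel, scene_max, display_peak, knee=0.75):
    """Pass through below the knee; compress highlights above it."""
    threshold = display_peak * knee
    if scene_max <= display_peak or pixel <= threshold:
        return pixel                     # within the panel's abilities
    # Map [threshold, scene_max] onto [threshold, display_peak].
    t = (pixel - threshold) / (scene_max - threshold)
    return threshold + t * (display_peak - threshold)

# A 1,000-nit mastered highlight shown on a 400-nit consumer set:
print(tone_map_nits(1000, scene_max=1000, display_peak=400))  # 400.0
print(tone_map_nits(250, scene_max=1000, display_peak=400))   # 250, midtone unchanged
```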

Samsung’s Bill Mandel said now is the time to meet with TV-set manufacturers. Dan Rosen, manager of image processing for Karl Storz Imaging, suggested TV manufacturers could market the capability as a differentiating factor.

ATSC 3.0 and Broadcasters

Ericsson SVP and SMPTE President Matthew Goldman moderated the annual broadcasters panel, looking at how ATSC 3.0 is progressing in the U.S. 21st Century Fox executive vice president Rich Friedel, who is also ATSC chair, noted that the new standard has been released with “implementations in progress.”

“Our friends at Sinclair have stations in the Baltimore and Washington area, with Dallas on tap,” he said. “The Pearl Group, in conjunction with Fox, has a model station on the air in Phoenix. ATSC 3.0 is real and you’re going to start seeing it on the air.”

Cox Media Group VP Dave Siegler, Sinclair Broadcast Group SVP and CTO Del Parks and NAB Senior Director, New Media Technologies Skip Pizzi predicted broadcasters will gravitate to ATSC 3.0 for its efficiencies and robustness. “What excites me is the marriage of broadcast and OTT,” said Friedel. “You can have an internet and broadcast experience mixed together.”

Although TV stations could, in principle, become ISPs, it’s unlikely that they’ll duplicate an existing product. The vision, they say, is to broadcast a main channel of video and audio over the air to the broad audience, with additional services such as graphics or alternate audio tracks available to savvy consumers. The end user would be able to personalize the content via menu-driven operations, seamlessly mixing broadcast and OTT content.
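A toy model of that hybrid presentation might look like the sketch below: broadcast supplies the main program, broadband supplies the extras, and the receiver assembles the viewer’s menu choices. All component names are invented for illustration.

```python
# Toy model of the ATSC 3.0 hybrid presentation described above:
# the main program arrives over the air, optional components over
# broadband, and the receiver assembles the viewer's selections.

BROADCAST = {"video": "main_1080p", "audio": {"en": "en_5.1"}}
OTT_EXTRAS = {"audio": {"es": "es_stereo", "de": "de_stereo"},
              "overlays": ["sports_stats", "alt_angle"]}

def build_presentation(language, overlays):
    """Prefer broadcast components; fetch extras over broadband."""
    audio = BROADCAST["audio"].get(language) or OTT_EXTRAS["audio"].get(language)
    return {"video": BROADCAST["video"],
            "audio": audio,
            "overlays": [o for o in overlays if o in OTT_EXTRAS["overlays"]]}

# A viewer picks German audio and a stats overlay from the menu:
print(build_presentation("de", ["sports_stats"]))
```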

With regard to ATSC 3.0 bandwidth, Pizzi noted that 4K is the bandwidth hog, not HDR. “Most of the TVs do a pretty good job upconverting,” he said. “That’s probably the easiest thing for the system to do.” He also noted that the Ultra HD Forum considers 1080p/60 with HDR to be a (streaming) Ultra HD format, “although the UHD Alliance doesn’t necessarily agree.”
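Rough arithmetic backs Pizzi up: moving from 1080p to 2160p quadruples the pixel count, while HDR’s main pre-compression cost is the step from 8-bit to 10-bit samples, roughly 25 percent. A back-of-the-envelope comparison of uncompressed rates (encoded bitrates will differ, but the ratios hold):

```python
# Back-of-the-envelope, uncompressed 4:2:0 data rates. Real encoded
# bitrates differ, but the ratios show why 4K, not HDR, is the hog.

def raw_mbps(width, height, fps, bits_per_sample, chroma_factor=1.5):
    # 4:2:0 carries 1.5 samples per pixel (1 luma + 0.5 chroma).
    return width * height * fps * bits_per_sample * chroma_factor / 1e6

sdr_1080p = raw_mbps(1920, 1080, 60, 8)    # 8-bit 1080p/60
hdr_1080p = raw_mbps(1920, 1080, 60, 10)   # 10-bit adds ~25%
hdr_2160p = raw_mbps(3840, 2160, 60, 10)   # 4K quadruples pixels

print(f"1080p/60 SDR: {sdr_1080p:,.0f} Mbps")
print(f"1080p/60 HDR: {hdr_1080p:,.0f} Mbps ({hdr_1080p / sdr_1080p:.2f}x)")
print(f"2160p/60 HDR: {hdr_2160p:,.0f} Mbps ({hdr_2160p / hdr_1080p:.2f}x over 1080p HDR)")
```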

High Frame Rate (HFR)

RealD Senior Scientist Tony Davis and cinematographer Bill Bennett, ASC, showed the results of their work shooting HFR footage and then finessing the apparent frame rate in post, using RealD’s TrueMotion software. “One of the challenges as we move towards HDR on displays is we’ve discovered high-contrast images tend to judder as they move across the screen,” said Bennett, who added that, having seen “The Hobbit” at 48fps and “Billy Lynn’s Long Halftime Walk” at 120fps, he’s not a fan. “In a movie, you want to be told a story,” he said. “The closer the images are to real, the harder it is to imagine yourself inside the story. But I’d pay money to see the World Cup or Super Bowl in those formats.”

Being able to capture footage at high frame rates and then go down to a cinematic frame rate in post, he said, was the best of both worlds. “You can also solve flicker problems in post that way,” he said. “The key thing is the creative aspect is done in post. I don’t have to make these decisions in photography.” The two showed several pieces of footage they’d worked on together, including a close-up of Mikaela Shiffrin riding a bike in the forest (she was actually on an exercise bike in the back of a truck). “We could render this in many frame rates and shutter angles in post to produce something the director liked,” said Bennett.
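The underlying idea can be sketched simply (this is the basic frame-averaging concept only, not RealD’s TrueMotion, which involves far more sophisticated motion processing): when 120fps capture is retimed to 24fps, each output frame is built from a run of source frames, and the length of that run sets the effective shutter angle.

```python
# Minimal sketch of post-production shutter synthesis: retime 120fps
# capture to 24fps, choosing the effective shutter angle by how many
# source frames are averaged into each output frame. Basic concept
# only, not RealD's TrueMotion implementation.
import numpy as np

def retime(frames, capture_fps, target_fps, shutter_deg):
    """frames: (n, h, w) array. Returns the retimed frame stack."""
    step = capture_fps // target_fps                 # 120/24 -> 5 sources per output
    span = max(1, round(step * shutter_deg / 360))   # frames the "shutter" is open
    outputs = [frames[i:i + span].mean(axis=0)
               for i in range(0, len(frames) - span + 1, step)]
    return np.stack(outputs)

clip = np.random.rand(120, 4, 4)                     # one second of 120fps "footage"
sharp = retime(clip, 120, 24, shutter_deg=72)        # averages 1 frame: crisp motion
classic = retime(clip, 120, 24, shutter_deg=180)     # averages 2: more film-like blur
print(sharp.shape, classic.shape)                    # (24, 4, 4) (24, 4, 4)
```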

They also showed demonstration footage, shot by Bennett with dual ARRI cameras, where he was able to control motion blur of the water spraying out of a hose and, in another scene, get rid of the wagon-wheel effect as it manifested in a rotating airplane propeller. “You may want the wagon-wheel effect, but this is a mathematical test to show we are doing what we say we can do,” said Davis. “Shooting at 120 and slowing down, we can keep the shutter constant. It doesn’t look like a slowdown, which is what we want.” In post, windows could be used to render different parts of the image at different frame rates. “It’s a pretty powerful creative tool to have two motion sets in a scene,” said Bennett.

New Technologies: LEDs, Computer Vision, AI, VR

It wouldn’t be the HPA Tech Retreat without some tantalizing technology pointing to the future. Mission Rock Digital CTO Pete Ludé predicted that the digital projector will soon be replaced by direct-view cinema displays using LED technologies. “It offers dramatically improved HDR, blacker blacks, very high peak luminance,” said Ludé, who reported that Barco has experimented with a very-wide-field-of-view display that can be implemented with an LED screen. LEDs also hold up under a “reasonable amount of ambient light,” which could be a boon for cinema/restaurant combinations.

LEDs are becoming viable as a display choice thanks to Haitz’s law, the observation that each decade an LED package’s light output rises roughly twentyfold while its cost per lumen falls roughly tenfold. With some manufacturing tweaks, prices should become still more competitive. Also in development is MicroLED technology, which Sony pioneered with its Crystal LED product and which numerous manufacturers are now exploring. “It’s potentially not just for watches and phones, but cinema-size screens,” said Ludé.
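A toy projection under those assumed per-decade rates (the starting values are arbitrary; only the trend is the point):

```python
# Toy projection of Haitz's law: per decade, cost per lumen falls
# ~10x while light output per LED package rises ~20x. Starting
# values are arbitrary; only the trend is meaningful.

def haitz(cost_per_lumen, lumens_per_package, decades):
    for d in range(decades + 1):
        print(f"decade {d}: ${cost_per_lumen:.4f}/lm, "
              f"{lumens_per_package:,.0f} lm/package")
        cost_per_lumen /= 10
        lumens_per_package *= 20

haitz(cost_per_lumen=1.0, lumens_per_package=10, decades=2)
```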

USC Entertainment Technology Center Program Director Yves Bergquist reported on his work in artificial intelligence and storytelling. Looking closely at the semantic structure of narrative yields interesting results, he argued. “There is very consistent structure in the kind of stories that are popular,” he said. “They tend to have a consistent ratio of aspects that are traditional in that genre and very new attributes on top.” Bergquist’s project looks at a wide variety of parameters such as music, emotional tonality, color, editing pace, white balance, composition and more, and is creating a classification scheme for these attributes.
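As a rough sketch of what such a classification scheme might look like, each title could be reduced to a vector of attributes and scored for how many match the genre’s conventions; the attributes and genre profile below are invented for illustration.

```python
# Illustrative sketch of scoring a title's mix of genre-conventional
# and novel attributes, echoing Bergquist's observation that popular
# stories blend a consistent ratio of both. All attribute names and
# the genre profile are invented for illustration.

GENRE_PROFILE = {  # hypothetical "typical thriller" conventions
    "editing_pace": "fast", "tonality": "tense",
    "palette": "desaturated", "score": "percussive",
}

def familiarity_ratio(title_attrs):
    """Fraction of a title's attributes matching genre conventions."""
    matches = sum(1 for k, v in title_attrs.items()
                  if GENRE_PROFILE.get(k) == v)
    return matches / len(title_attrs)

film = {"editing_pace": "fast", "tonality": "tense",
        "palette": "neon", "score": "synth"}
print(familiarity_ratio(film))  # 0.5: half traditional, half novel
```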

RealNetworks CTO Reza Rassool showed a facial recognition system that can look at any face and determine gender, age and emotion, as well as pull from a database of 50,000 public figures to find similar faces. The facial recognition engine is just one example of how object recognition can power T-commerce, the ability for the consumer to buy items from the TV set, which he said is already big in China. Rassool envisioned mainly up-and-coming OTT sites adopting the system, which can be monetized by auctioning off the products shown, much as advertising slots are sold. Rassool, who will present a paper on the topic at NAB, said one of the system’s earliest customers is a big U.S. hamburger chain.
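Rassool didn’t detail the engine’s internals, but similarity search of this kind is commonly built on face embeddings compared by cosine similarity. A generic sketch, not RealNetworks’ implementation:

```python
# Generic sketch of embedding-based face similarity search, the
# common technique behind "find similar faces in a database": embed
# each face as a vector, then rank by cosine similarity. Purely
# illustrative, not RealNetworks' actual engine.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar(query, gallery, top_k=3):
    scores = [(name, cosine(query, emb)) for name, emb in gallery.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]

rng = np.random.default_rng(0)
gallery = {f"figure_{i}": rng.normal(size=128) for i in range(50_000)}
query = rng.normal(size=128)
print(most_similar(query, gallery))  # three closest of 50,000 entries
```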

360 Designs founder and CEO Alx Klive said Volvo’s sponsorship of CNN’s coverage made it possible for his company’s live VR solution to bring the recent solar eclipse to national and international audiences. The two-hour-long broadcast used multiple 360-degree cameras, switched in a 4K mobile truck and integrated with 360-degree CG elements from the broadcaster’s in-house graphics division. The production was live from seven locations, using 360 Designs’ Mini Eye 3 cameras. The output went to a satellite downlink facility in New Jersey, where the images were live-stitched. The result, he claimed, was “the most watched live VR experience in history,” evidence that sponsorship can be a sound basis for live VR productions.