It wasn’t really a razzle-dazzle show at NAB this year. Production trends that seemed like they could be the future of media in years past — stereo 3D post-production, 360 video acquisition and VR-headset experiences — have faded into the background, becoming just another bullet point on a marketing sell sheet. Formerly eye-catching gadgets like handheld camera stabilizers have become a more familiar sight, and drones were nowhere near as prevalent on the show floor as in previous years. (Drone giant DJI didn’t even exhibit on the show floor this year.) And futuristic technology like light-field cinematography, which blew so many minds in 2017, was largely absent from the discussion this year. What took its place? Practical, short-term concerns. Customers were interested in sending video over IP networks, taking advantage of powerful GPU acceleration, and getting media into the cloud to leverage AI-based tools, such as Amazon Rekognition, that can help better identify, tag and monetize content.
Here’s a look at the top trends we saw on the show floor.
The subject was broadcast production. Some years, NAB news is dominated by a new cinema camera, developments in high-end post-production, or enthusiasm around potentially game-changing paradigms like 3D, VR or light-field cinema. Other years, it’s a meat-and-potatoes show. That was certainly the case in 2019, when companies were tripping over themselves to make sure you knew their products would be at home in a broadcast studio or ENG environment, whether or not that’s where they were originally intended to be used. To name some examples, Sony debuted an ENG build-up rig for its FS7 cameras, and Panasonic touted a live production workflow built around its AU-EVA1 cinema camera.
IP Video: Still on the horizon. The migration to IP infrastructure for video has been moving slowly, but it is moving, and some of the pieces came into clearer focus at NAB. For one thing, there was the acquisition, right before the show, of live streaming pioneer NewTek by broadcast graphics specialist Vizrt. The news came as a big surprise, but maybe it shouldn’t have. As NewTek Product Marketing Manager Matthew Allard points out, “both companies are about software, networking and IP.” True, NewTek developed the NDI protocol while Vizrt has championed SMPTE 2110, but that means the merger should be good news for interoperability between the two standards. Companies like Sony and Grass Valley were demonstrating advanced systems that allow production resources to be shared across multiple locations, studios and control rooms, which promises to reduce the amount of redundant equipment purchased by a broadcast operation. And the IP revolution is infiltrating every corner of the business. Primestream has a new Media I/O desktop application with a multiviewer allowing monitoring of IP streams alongside SDI signals. CueScript showed the CSMV2, a prompting monitor due in August that supports SMPTE 2110 (with patent pending). And Phabrix was showing off test equipment supporting SMPTE 2110-compliant IP workflow over 10GigE.
Unreal Engine is not playing games. No, we’re not talking about esports, though that segment of the media landscape certainly qualifies as a phenomenon worth watching. We’re referring to the increasing movement of videogame techniques into real-time workflow for live broadcast. For example, The Future Group’s Pixotope is a real-time renderer built on the Unreal Engine from Epic Games, which it uses for both foreground and background rendering. If you’ve seen some of The Weather Channel’s latest forays into real-time weather simulation, like its storm surge visualizations, you’ve seen Pixotope at work. Unreal Engine, which received its Technology and Engineering Emmy at the show, is also being used to render virtual sets for NFL Network’s RedZone, Fox Sports, and more. One sample Unreal workflow on the show floor involved an HP Z8 G4 workstation loaded with an Nvidia Quadro RTX 6000 graphics card supporting hardware-accelerated ray tracing of images for output via Bluefish444’s Epoch 4K Neutron — and the potential of those RTX cores is just starting to get unlocked.
8K is the new 4K. It was pretty much official this year — 4K is largely a solved problem for both production and post (if not for broadcast delivery), and 8K is the new cutting-edge delivery spec that turned heads on the show floor. Sharp’s consumer-oriented 8K micro-four-thirds camera generated a lot of buzz considering its reported $4,000 price point, but devices like Panasonic’s AK-SHB800GJ, an 8K box camera capable of generating four different cropped region-of-interest images at HD resolution, looked like more practical investments for broadcast. Meanwhile, Blackmagic Design jumped into the deep end of the pool and made a big splash that got everyone wet — Blackmagic showed a new Teranex Mini SDI to HDMI 8K for monitoring 8K HDR on a large-screen TV or video projector via HDMI; a HyperDeck Extreme 8K HDR H.265 broadcast deck with a shuttle knob and light-up push-buttons; and the ATEM Constellation 8K 4 M/E live production switcher, with 40 12G-SDI inputs and 24 12G-SDI outputs. And AJA’s 8K strategy leaned heavily on its Kona 5 video card, coupled with a revision to its Desktop software that enables 8K ProRes on Windows and Linux as well as MacOS. AJA also had a new Corvid 44 12G-SDI card supporting 8K at up to 60p.
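The multi-output idea behind a box camera like the AK-SHB800GJ is easy to picture as array slicing: an 8K raster holds sixteen times the pixels of an HD frame, so several independent HD windows can be cut from a single sensor readout. A minimal sketch in Python/NumPy (the frame and HD dimensions are the real pixel counts; the ROI coordinates and function names are our own illustration, not Panasonic's implementation):

```python
import numpy as np

# 8K UHD raster vs. an HD window, in pixels.
FRAME_H, FRAME_W = 4320, 7680
HD_H, HD_W = 1080, 1920

def crop_roi(frame, top, left):
    """Return a 1920x1080 HD window from an 8K frame, given its top-left corner."""
    if top + HD_H > FRAME_H or left + HD_W > FRAME_W:
        raise ValueError("ROI falls outside the 8K frame")
    return frame[top:top + HD_H, left:left + HD_W]

# A stand-in 3-channel 8K frame (16-bit per channel).
frame = np.zeros((FRAME_H, FRAME_W, 3), dtype=np.uint16)

# Four example regions of interest -- e.g. independently framed "virtual cameras".
rois = [(0, 0), (0, 5760), (3240, 0), (1620, 2880)]
feeds = [crop_roi(frame, t, l) for t, l in rois]
print([f.shape for f in feeds])  # each feed is (1080, 1920, 3)
```

Since the crops are NumPy views rather than copies, pulling four HD feeds from one frame costs almost nothing; the expensive part in a real camera is reading out and processing the full 8K sensor in the first place.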
Lenses with character. It’s counter-intuitive, but along with the move toward higher-resolution cameras, we’ve seen a trend toward vintage optics and contemporary glass with “character.” As DP Matthew Libatique, ASC, asked the audience during his NAB Q&A, “Have you seen the price of a Canon K35 these days? It’s astounding how much these lenses are.” Why is that? For one thing, crisp 4K images can be unforgiving where human subjects are concerned, especially when it comes to skin tone and texture. Vintage lensing can give those high-res digital images a warmer, less clinical look. And, increasingly, digital cinematographers are looking at lenses the way 35mm filmmakers used to look at film stocks — one more way to influence the look of the finished image. That’s why Cooke Optics continues to embrace the “Cooke look,” a hard-to-describe quality that combines sharpness and high contrast with an exceptional smoothness. And that’s why Canon explicitly emphasized the “character” of its new Sumire cinema prime lens set, introducing them as “more for the creatives and not as much for the engineers.” We expect this trend toward specially tuned glass to accelerate as DPs look for more ways to stand out from the crowd.
The forecast is partly cloudy. This was the first NAB where the usefulness of the cloud seemed to be taken as a given. With the economics of cloud storage becoming more transparent and predictable — and new, cloud-based machine-learning algorithms for metadata enrichment making it more attractive than ever to put media in the cloud — storage vendors were promoting hybrid cloud workflow options that seemed more credible than ever before. “Small content producers are going to be a larger part of our business,” predicted Masstech CTO Mike Palmer, pointing to the company’s new Clover product, a budget-friendly ingest, storage and transcoding system that works with tape or cloud storage. And Facilis Technology offered Facilis Object Cloud, a disk-caching system for its Hub and TerraBlock servers that allows access to 100 TB of cloud storage for an annual fee. Developed in partnership with XenData, “it gives users a way to get to cloud and LTO archives without an overarching MAM system,” Facilis VP of Sales and Marketing Jim McKenna told us. Speaking of LTO, this was also the first time it seemed that the end might really be in sight for LTO tape’s reign as king of the long-term archive, as low-cost cloud storage starts to look more cost-effective — and may offer new opportunities to monetize archived media. “Most AI services will be in the cloud,” said Arvato Systems portfolio manager Ben Davenport. “To take advantage, it will be necessary to be hybrid cloud, so that will be a driver.”
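To see why the cloud-versus-LTO calculus is worth running, it helps to sketch the raw arithmetic. All prices below are hypothetical placeholders, not quotes from any vendor mentioned here, and a real comparison must also account for drives, libraries, egress and retrieval fees:

```python
import math

# Hypothetical, illustrative prices only -- not vendor quotes.
CLOUD_PER_GB_MONTH = 0.004   # assumed cold-tier rate, USD per GB per month
LTO_CART_TB = 12             # LTO-8 native capacity per cartridge, TB
LTO_CART_PRICE = 80.0        # assumed cartridge street price, USD

def cloud_cost(archive_tb, months):
    """Total cold-storage spend over the period (capacity charges only)."""
    return archive_tb * 1000 * CLOUD_PER_GB_MONTH * months

def lto_media_cost(archive_tb):
    """Cartridge cost alone; drives, library hardware and labor are ignored."""
    return math.ceil(archive_tb / LTO_CART_TB) * LTO_CART_PRICE

# A 100 TB archive: tape media is a one-time buy, cloud is a recurring fee.
for years in (1, 3, 5):
    print(f"{years}y  cloud ${cloud_cost(100, years * 12):,.0f}  "
          f"LTO media ${lto_media_cost(100):,.0f}")
```

With these placeholder numbers, bare cartridge cost still favors tape; the calculus shifts once drive and library amortization, periodic migration across LTO generations, and access to the cloud-only AI services Davenport describes are priced in.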
Whither HDR? There didn’t seem to be a lot of movement on HDR specifically at the show. In addition to some enduring confusion over different standards for HDR (including Dolby Vision, HLG and HDR10+), the sticking point seems to be proper monitoring of HDR pictures. Sony’s BVM-HX310 monitor is being marketed as a lower-cost alternative to the BVM-X300, which has to help, but capable HDR monitors are still a relative rarity. Some vendors are trying to close the gap with half-measures — as Oliver Peters points out on his Digitalfilms blog, TVLogic has introduced “HDR” displays that reach a respectable 350 nits of brightness, but nowhere near the 1,000 nits generally expected of an HDR screen — and some colorists have been using consumer HDR OLEDs as client monitors. At the same time, products like AJA’s (and Colorfront’s) HDR Image Analyzer will be critical to colorists and post-production supervisors struggling to keep tabs on their HDR images with limited experience under their belts. “It kind of helps us do HDR triage, understanding where the limitations are going to be in a bigger color volume,” said Rory Gordon, senior colorist at ArsenalFX Color, at an AJA press conference. Colorfront had its Transkoder 2019 software at the show, with a new HDR GUI and support for UHD 8K workflow (with the help of those Nvidia RTX graphics cards and AJA’s Kona 5), and the company announced a partnership with Pomfort to integrate Colorfront’s technology with Pomfort LiveGrade Pro for on-set color workflow and dailies. For broadcast trucks combining HDR and SDR technology, Cobalt Digital showed the 9904-UDX-4K, a cross-converter and image processor with HDR-to-SDR and SDR-to-HDR conversion capabilities.
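At its core, the HDR-to-SDR down-conversion a box like the 9904-UDX-4K performs is a luminance remapping problem: fit a 1,000-nit signal into a roughly 100-nit container without crushing shadows or clipping highlights. As a generic illustration only (an extended Reinhard-style curve of our own choosing, not any vendor's actual algorithm), the core math looks like this:

```python
import numpy as np

HDR_PEAK_NITS = 1000.0   # common HDR mastering peak
SDR_PEAK_NITS = 100.0    # nominal SDR reference white

def hdr_to_sdr(luminance_nits):
    """Map luminance (in nits) to a 0..1 SDR signal with soft highlight rolloff.

    Extended Reinhard curve: shadows pass through nearly linearly, and the
    HDR peak lands exactly at 1.0 instead of hard-clipping.
    """
    x = np.asarray(luminance_nits, dtype=np.float64) / SDR_PEAK_NITS
    x_white = HDR_PEAK_NITS / SDR_PEAK_NITS
    return x * (1.0 + x / x_white**2) / (1.0 + x)

# 10-nit shadows are barely touched; 1,000-nit highlights roll off to 1.0.
print(hdr_to_sdr([10.0, 100.0, 1000.0]))
```

Real converters work in a specific transfer function (PQ or HLG) and handle color gamut as well as luminance, which is exactly why tools like the HDR Image Analyzer earn their keep.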
A new Avid. OK, a new software revision from one company may not qualify as a “trend.” But when that company essentially invented the NLE — one that remains the de facto standard for feature film and scripted television editing — a serious redesign can feel like a seismic event. That’s the case with Avid Media Composer. Media Composer has remained Media Composer through years of upheaval in editing technology, despite losing some competitive ground over the years to Apple’s Final Cut Pro and Adobe’s Premiere Pro. If you talk to many longtime Avid editors, you’ll discover that they really love the Avid, and that familiarity helps explain why Avid has been in no hurry to overhaul the basic Media Composer experience. But that changed this year, as the company rolled out what it called an “all-new” Media Composer with a modernized user interface, new visual approaches to media bins, and 32-bit full float finishing and delivery capabilities. The big question is whether Avid has managed to thread the needle, upgrading Media Composer editors to a better, more efficient experience without altering the basics of the program that made them Avid loyalists in the first place. “It’s on the minds of many,” admitted Avid’s director of business development, Rob D’Amico, at the show. But he also said Avid had spent the last 18 months in conversations with existing customers and non-customers alike in order to ensure that it was going in the right direction to satisfy prospective new users without alienating existing customers. It’s a make-or-break moment for Avid that could reverberate through post-production. The new Media Composer is due later this spring.
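The practical benefit of 32-bit full float finishing is headroom: integer pixel formats irreversibly clip anything pushed beyond legal range during a grade, while float values survive intact for later passes. A tiny generic illustration in NumPy (our own sketch, not Media Composer's internal pipeline):

```python
import numpy as np

# A grade pushes a highlight to 140% of reference white (1.4 in 0..1 terms).
graded = np.array([1.4, 0.8, 0.2], dtype=np.float32)

# 8-bit integer intermediate: the out-of-range value clips to 255 for good.
as_uint8 = np.clip(graded * 255.0, 0, 255).astype(np.uint8)

# A later pass pulls exposure back down by half.
from_float = graded * 0.5                                  # highlight detail survives (0.7)
from_uint8 = (as_uint8.astype(np.float32) / 255.0) * 0.5   # highlight stuck at 0.5

print(float(from_float[0]), float(from_uint8[0]))
```

The float pipeline recovers the superwhite highlight; the integer pipeline cannot, because the information was thrown away at the clip.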