What was new at NAB 2018? Not a whole lot.

I don’t mean to be glib about it. Sure, NAB was a showcase for refinements and expansions of existing technology, and some of those will have a significant impact on production and post in the coming year. But at least as far as major product launches go, this year’s show came up a little short compared to past outings. Some key product introductions took place well in advance of the show — the latest cameras from ARRI and Panavision were actually announced at BSC Expo in February — and the pace of innovation has been slowed somewhat by the reluctance of North American broadcasters, at least, to commit to new equipment and infrastructure on the heels of investments in the initial digital transition that they’re still trying to recoup. So instead of entirely new products or product categories, this year saw vendors rallying around common standards to bring workflow under control.

As a result, interoperability was a theme. Notably, vendors of products with IP capabilities, some of them developed years ago using proprietary technology, sought to demonstrate compliance with the latest ST 2110 standards from SMPTE for media over IP networks. Similarly, vendors were working to integrate ST 2067, SMPTE’s draft specification for the Interoperable Master Format (IMF) for broadcast and online delivery, into broadcast workflow, which should bring some new clarity to the wild range of deliverables that broadcast, web and OTT channels demand on a global basis. (See Deluxe’s Deluxe One initiative for another ambitious approach to centralizing content distribution for the era of peak TV.)

Here’s another example of the value of interoperability. Apple’s pre-NAB announcement of the ProRes Raw format was neither new (ProRes has been around for many years) nor especially innovative (raw workflow was pioneered years ago by the likes of Red and Cineform). But it’s significant because it promises a unified raw workflow and an easy post deliverable across a number of cameras, including the Canon C300 Mark II and C500, the Panasonic AU-EVA1 and VariCam LT, and the Sony FS5 and FS7. Some observers scoffed at the idea of a new pro codec that only works in Final Cut Pro X — but ProRes was originally introduced as an FCP-only codec, too, and it’s now part of the cross-platform editorial ecosystem. If ProRes Raw works as expected in a variety of shooting scenarios, it could bring raw workflow to a broader swath of the post market than ever before.

Well, that does sound pretty interesting. So let’s try again. What was new at NAB 2018?

Ikegami 4K and 8K signage at NAB 2018

A giant light-up sign reading “4K” just wasn’t enough this year — you needed one that said “8K” to make an impression.
Bryant Frazer

8K. With 4K having reached global acceptance as an acquisition and production format, thanks in large part to the insistence of the pixel-counting regime at Netflix, the next frontier really is 8K. The big question remains: where can you see 8K content? North American broadcasters have made no move in that direction, though NHK in Japan has been broadcasting 8K content to public viewing stations on a trial basis. To prove the point that 8K is ready for prime time, NHK brought a broadcast camera capable of shooting 8K images at 240fps, crucial for getting those slow-motion replay shots on the sports field. Meanwhile, many NAB attendees were surprised to see Sharp on the show floor, where it exhibited for the very first time. Sharp dove into the deep end of the 8K pool, bringing out an 8K camcorder, a 70-inch 8K video monitor, and an 8K multidisplay with a 104-inch wide screen and a superwide 22:9 aspect ratio. (For more on that 8K camera, see Marc Franklin’s coverage of his Day One on the show floor.)
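To put that 240fps spec in perspective, here’s a back-of-the-envelope sketch of the uncompressed data rate an 8K high-frame-rate signal implies. The bit depth here is my own assumption for illustration, not NHK’s published specification:

```python
# Rough arithmetic (assumed bit depth, not NHK's spec) for why
# 8K at 240fps is hard: the raw data rate before any compression.
width, height = 7680, 4320       # 8K UHD resolution
fps = 240                        # NHK's high-frame-rate camera
bits_per_pixel = 10 * 3          # assume 10-bit RGB; real cameras vary

bits_per_second = width * height * fps * bits_per_pixel
gbps = bits_per_second / 1e9
print(f"~{gbps:.0f} Gb/s uncompressed")
```

Even allowing for heavy compression downstream, a signal on that order of magnitude strains any broadcast plant, which helps explain why 8K remains a demo-floor phenomenon outside NHK’s trials.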

One thing’s for sure — 8K content is best viewed on a large screen. In the photo above, you can see a showgoer in the lower-left-hand corner of the image scrutinizing Ikegami’s smallish 8K display from a distance of about 12 to 18 inches; odds are he’s wondering if he, or anyone else, can really tell the difference between 4K and 8K on a screen of that size. NHK came to the show with 70-inch, 85-inch and 98-inch displays, but Sony was the likely winner in this department. At its booth, Sony was screening 8K x 4K Rio Carnival imagery captured at 120fps and in HDR with its three-chip 8K UHC-8300 broadcast camera on a massive 32-by-18-foot Crystal LED screen. The results were genuinely dazzling — nothing else I saw at NAB 2018 came close.
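For the curious, here’s a rough sketch of the arithmetic behind that showgoer’s skepticism. Normal (20/20) vision resolves roughly 60 pixels per degree of visual angle; the screen width and viewing distance below are my own assumptions for illustration, not Ikegami’s numbers:

```python
# Can the eye resolve 8K over 4K at close range? Compare pixels per
# degree of visual angle against the ~60 ppd acuity limit of 20/20 vision.
import math

def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    """Pixels packed into one degree of visual angle at screen center."""
    pixels_per_inch = h_pixels / screen_width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return pixels_per_inch * inches_per_degree

# Assume a 30-inch-wide panel viewed from 15 inches away.
for label, h in (("4K", 3840), ("8K", 7680)):
    print(f"{label}: {pixels_per_degree(h, 30.0, 15.0):.0f} pixels/degree")
```

On those assumptions, 4K lands around 34 pixels per degree, comfortably below the acuity limit, while 8K lands around 67, just past it. So at that distance, on that screen, the difference plausibly is visible; step back a few feet and it vanishes.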



E-sports. This was the year the e-sports market — competitive videogaming — showed up on vendors’ radar. That may be thanks in part to widely reported market research suggesting the e-sports industry generated as much as $1.5 billion in revenue last year. E-sports has largely been an online phenomenon (think Twitch and YouTube streamers), but as the audience grows and traditional broadcasters think about ways to cover e-sports events, gadgets and gear that help integrate video streaming from game consoles into traditional broadcast workflow will be in demand. As an example, see AJA’s KONA HDMI, a simple and inexpensive ($895) four-input PCIe card offering single-channel HDMI capture at up to 4K 60p or four-channel HDMI capture at up to 2K 60p. Audio-Technica was positioning its new broadcast stereo headsets for e-sports applications, as well. The lingua franca of console streaming is HDMI, so even if you’re not dabbling in e-sports, if your own special snowflake workflow incorporates HDMI sources, you may want to keep an eye on solutions that are spun toward this market in case they help you solve some of your own problems.

Full-frame. We figured as much, but Cooke Optics Chairman Les Zellan confirmed it for us in a conversation at the Cooke booth — the big trend this year was toward full-frame lenses, with anamorphic glass still coming on strong behind them. Long found inside digital still cameras, sensors in the full-frame form factor (typically 36mm x 24mm or thereabouts) are increasingly making their way into cinema-style cameras, where they generally improve low-light performance and allow cinematographers a wider range of creative choices — not just the sliver-thin depth of field often associated with them — with simpler, less time-consuming lighting set-ups. Last year at this time, sensors from Red were the only show in town when it came to full-frame cinema shooting, but since then Sony (with the Venice system), Canon (with its just-announced C700 FF) and ARRI (with the Alexa LF) have all embraced the larger form factor. That means full-frame lenses are in demand from Cooke and others — though Zellan couldn’t resist noting that no single form factor can rule them all. “Our S4 [35mm/Super 35] primes are still backordered after 20 years,” he said.
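For a concrete sense of what the bigger sensor buys, here is a quick sketch comparing horizontal field of view for the same focal lengths on Super 35 and full frame. The sensor widths are nominal figures of my own choosing; exact dimensions vary by camera:

```python
# Same focal length, wider view: how sensor width changes the
# horizontal field of view (FOV), using the thin-lens angle formula.
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view in degrees for a given sensor and lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

SUPER35 = 24.9    # mm, a typical Super 35 sensor width (varies by camera)
FULLFRAME = 36.0  # mm, full-frame sensor width

for f in (35, 50):
    print(f"{f}mm lens: S35 {horizontal_fov_deg(SUPER35, f):.1f} deg, "
          f"FF {horizontal_fov_deg(FULLFRAME, f):.1f} deg")
```

The ratio of the two widths, about 1.45, is the familiar crop factor: a given lens on full frame sees roughly what a lens 1.45x shorter would see on Super 35, which is part of why a new generation of full-frame glass is suddenly in demand.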

Hitachi SK-UHD4000

HDR. There’s a conundrum at the heart of HDR acquisition — the format provides unparalleled color reproduction, retaining detail into the brightest specular highlights and providing a stunning viewing experience in a dark room. But how do you acquire or master HDR content if you can’t afford a $30,000 reference monitor capable of reproducing HDR’s full dynamic range? Well, more than one vendor was displaying HDR content on a consumer display — an LG OLED screen seems to be the model of choice, especially as a client monitor. Working with ColorFront, AJA previewed its new HDR Image Analyzer, a rack-mounted box with an array of HDR-aware tools including waveforms, histograms, vectorscope, and brightness level monitoring that should make it easier to catch any egregious errors in HDR footage. And, because broadcasters need to produce both SDR and HDR, camera manufacturers were highlighting new workflow options, such as Hitachi’s SK-UHD4000 studio camera, which offers simultaneous HDR and SDR output, and Panasonic’s AK-UC3000 studio camera, which supports UHD and HD/SD output simultaneously via the AK-UCU500 camera control unit.

Immersive content. Enthusiasm for VR was muted this year, with even the term virtual reality being downplayed — the “Virtual & Augmented Reality Pavilion” of years past was replaced this time around by the more modest “Immersive Storytelling Pavilion,” where 360-degree camera maker Insta360 enjoyed the flagship position formerly occupied by Nokia, which got out of the Ozo camera business last fall. Other vendors in the space included Samsung, which exhibited its 360 Round VR camera along with Radius Live live-streaming and Radius Edge post-production systems from partner Next Computing; Kandao, with its Obsidian 3D VR camera; and SGO, with its Mistika VR stitching software. Some vendors were touting 8K acquisition as the natural next step in VR experiences, but others were looking past the headset entirely.

In a suite at the Renaissance Hotel, I met up with Dimenco, which positions itself as a technology lab looking to develop immersive experiences without requiring “wearables” like 3D glasses or VR goggles. Dimenco had me sit in a chair at the sweet spot facing a massive autostereoscopic display and in the middle of a finely tuned multichannel sound system. On a small table in front of me was a development-kit version of a cutting-edge haptic controller that provides tactile feedback in mid-air by projecting ultrasonic vibrations upward. I was advised to hold my hand 10-15 centimeters over it for best results. Dimenco’s demo was a simple first-person videogame set in a cartoonish jungle that I navigated on rails, using my hand to “look” left and right on screen. When an icon appeared on screen, I was to push down with my hand as if pressing an invisible button in mid-air to take pictures of animals in the game. The experience was awkward, and I felt a little foolish, though I did get a bit of a thrill when I realized that I really could feel something pressing back against my hand when I pushed down on that imaginary button. To be honest, I’d find just about any VR-headset-based experience to be more immersive than the big-screen TV, autostereo or no, especially since the 3D view on screen would phase in and out if my head moved even slightly from side to side.

But, as Dimenco’s reps pointed out, their demo was purely exploratory and relied on pre-beta technology to tease out the possibilities for future implementations with more mature systems. As an example, they suggested they’d like to co-opt virtual-assistant technology to add voice interaction to the experience. For my part, I tried to imagine what would make the glasses-free experience more compelling, and I thought a much bigger screen might help. Not that I’m holding my breath, but if someone could supply Dimenco with an autostereoscopic version of that 8K Crystal LED screen in Sony’s booth? Yeah, that would really be something. The worst thing about today’s immersive experiences isn’t just that we have to put on a piece of isolating eyewear to experience them — it’s that we’re stuck with only the technology we can make, not the technology we can imagine.