Avid's name has remained synonymous with high-end editorial for film and television for decades, and the company has retained a leadership role despite fierce competition. At this year's Academy Awards, for instance, every film nominated for Best Picture was cut on Avid Media Composer, along with all five nominees for Best Film Editing. Six out of 10 sound editing and sound mixing nominees worked on Avid Pro Tools, along with all five contenders for Original Song and two of the Original Score nominees. We spoke with Avid's Matt Feury, director of pro video product and segment marketing, about how editors are working at high resolution in the Avid and what he sees driving demand for 4K workflow.

StudioDaily: We hear a lot about 4K these days, though HD and 2K are still the standards, at least for delivery to consumers. How are Avid customers getting their jobs done right now, and what are they thinking about, going into NAB, for the rest of the year?

Matt Feury: The thing that people look for, ultimately, is flexibility. There are so many switches that can be thrown in any workflow. By and large, HD is still king as a deliverable, but the days when you didn't need to protect for 4K are gone. You absolutely need a 4K master, and you need to begin thinking about — and I can't believe I'm saying this — 8K. Media Composer already supports 8K, but the infrastructure just isn't there for hardware or storage in any realistic way. It's nice that the application is ahead of the curve but, at least in the North American market, 8K is something to chuckle at and not really plan for.

What's the typical workflow for 4K? Are people using proxies or editing at full resolution?

Most people are working in HD in some kind of proxy, although our application and our storage — even our Nexis Pro storage, which is fairly new and targeted at smaller facilities — support real-time 4K workflow, and not just our own codec, DNx, but whatever you're working with. But it usually makes sense to do some kind of transcode, so you have more freedom in editorial, and then conform to the original source or a high-resolution version of a proxy. Let's say my source is Red at 5K. I'll use background transcoding and features like Media Composer's dynamic media folders to automate the process of transcoding so I create a low-bandwidth version of DNx as well as a high-bandwidth mastering-quality version of DNx, which is still a little more storage- and processing-friendly than the actual source media.

The thing most people don't realize is you can start editing source media while that transcode is going on in the background. Media Composer is good at media management and being able to relink on the fly between proxies and new transcodes and source media. I can kick off that transcode in the background, start linking to and cutting with my Red material in real time, and then as the DNx version comes online I can flip over to that and get more processing power without eating up the same amount of storage. Another thing that comes up a lot is treating source media as a negative, like a camera negative. If I bring in my source media, those are my masters. I still want a copy of that somewhere. So I'm putting that into slower cold storage. That might be a way we could repurpose those USB and FireWire drives I have kicking around, or other older, lower-bandwidth storage.
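To make the two-transcode idea above concrete, here is a minimal, tool-agnostic sketch of a watch folder that kicks off background transcodes to a low-bandwidth DNxHR proxy and a high-bandwidth DNxHR mastering file. This is not Avid's implementation — Media Composer's dynamic media folders do this natively — and the folder names, DNxHR profile choices, and use of ffmpeg are all illustrative assumptions.

```python
# Illustrative stand-in for the kind of automation dynamic media folders provide.
# Assumes ffmpeg (with its dnxhd/DNxHR encoder) is installed; paths are hypothetical.
import subprocess
import time
from pathlib import Path

WATCH_DIR = Path("incoming")      # drop 4K/5K source clips here
PROXY_DIR = Path("proxy_lb")      # low-bandwidth editorial proxies
MASTER_DIR = Path("master_hqx")   # high-bandwidth mastering files

def transcode(src: Path) -> list:
    """Start both transcodes in the background and return the process handles."""
    PROXY_DIR.mkdir(exist_ok=True)
    MASTER_DIR.mkdir(exist_ok=True)

    # DNxHR LB proxy, scaled to HD, so editorial stays light on storage and CPU.
    proxy = subprocess.Popen([
        "ffmpeg", "-y", "-i", str(src),
        "-c:v", "dnxhd", "-profile:v", "dnxhr_lb",
        "-vf", "scale=1920:1080", "-pix_fmt", "yuv422p",
        "-c:a", "pcm_s16le",
        str(PROXY_DIR / (src.stem + "_lb.mxf")),
    ])

    # DNxHR HQX mastering file at full resolution, 10-bit.
    master = subprocess.Popen([
        "ffmpeg", "-y", "-i", str(src),
        "-c:v", "dnxhd", "-profile:v", "dnxhr_hqx",
        "-pix_fmt", "yuv422p10le",
        "-c:a", "pcm_s24le",
        str(MASTER_DIR / (src.stem + "_hqx.mxf")),
    ])
    return [proxy, master]

def watch() -> None:
    """Poll the watch folder; new clips start transcoding while editing continues."""
    seen = set()
    while True:
        for clip in WATCH_DIR.glob("*.mov"):
            if clip not in seen:
                seen.add(clip)
                transcode(clip)   # non-blocking: you keep cutting with source media
        time.sleep(5)

if __name__ == "__main__":
    watch()
```

In Media Composer itself the equivalent step is simply pointing a dynamic media folder at the source directory and choosing the two DNx targets; the flip from source to proxy to mastering media described above is then handled by the application's relinking and media management rather than by any external script.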

You said the entry-level Nexis system will handle 4K. So I don't have to ask, "Which Nexis system will support my 4K workflow?"

Not at all. All of them support that. As you go up the chain, you get into more enterprise-class features like redundancy and high availability, where you have dual everything. But outside of that, the technology is really the same. If I'm a small, independent shop with five editors and I go out and get a Nexis Pro for under $10,000, I have the same technology, the same interface and the same management system that NBC uses to show the Olympics. That's the nice thing about how that scales. At the end of the day, it all comes down to the software.

There's a lot more in the Avid ecosystem than just Media Composer running off Nexis storage. What else is Avid addressing when it comes to high-resolution workflow?

I think we're getting back to specialization. As much as we celebrate the one-man-band ideal, where it's possible for me to do everything in one box, I think people's talents are still the drivers. You're going to have your rock-star colorists and your audio mixer and your sound designer, and a video person can never approach what they can do. So it's about connecting the right talent, wherever they may be, and that's where the platform takes over. I can work with you just as seamlessly as if you were in the next room, doing the audio sweetening while I was finishing the third act. It's about removing the barriers to collaboration and connecting people together — and that's what Avid's always been about.

What trends do you see driving the industry to 4K?

What's driving 4K is over-the-top (OTT), the YouTubes and the Amazons. They're trying to distinguish themselves from traditional broadcast, not just with 4K but also HDR. You're starting to see that there and on Blu-ray, but obviously not terrestrial broadcast yet. But maybe that changes sooner than we think. The growth in YouTube and Netflix and Amazon has been ridiculous in the last couple of years. FX Networks did a cool study that they published at the end of last year on the increase in original scripted content. It wasn't just online but across the board, and it showed growth in every sector — online was obviously huge, but also cable and traditional broadcast. It was clear that everybody's growing. The need for more content is greater than ever.


[Chart: FX Networks study showing growth in original scripted content across online, cable, and traditional broadcast. Source: Study by FX Networks Research]


Speaking of HDR, how is HDR handled by Media Composer? 

We support it. We are agnostic to it. It flows through editorially. The big thing with HDR is the monitoring, and that's still cost-prohibitive for a lot of people. The Dolby Vision stuff is coming down in price and they're doing some cool things, but you're still looking at tens of thousands of dollars for the right monitor to do it. It's becoming a specialist thing. "We'll do traditional editorial and standard Rec. 709 color-correction, but somebody else is going to have to do the HDR treatment."

You mentioned earlier that 8K isn't something we need to be especially worried about yet, and then clarified that you were talking about North America. But obviously Japan is pushing forward. So how far is the horizon for 8K in North America? Do you see that becoming part of the broadcast landscape in the near future — say two to three years?

No, I don't see it. There are arguments to be made that what's important is not so much the resolution as the compression, how the color is treated, [and] how much contrast you have, which is why HDR is such a big topic. The difference with Japan is there are a lot of manufacturing interests over there selling new technology, with giants like Sony ready to spit out TVs that can do 8K or infrastructure that supports 8K. And then you've got the government pushing NHK to get 8K out there. They will always be leading the way in terms of trying new things. It's getting harder to stand out from the pack, with so many people creating content, so someone will always be leaning into it. I'd say we're definitely more than three years away from 8K in the States.

Any final thoughts on 4K?

I think, rather than seeing changes on the source side, it's the mastering side that becomes more important. Can we find a standard there? There's a lot of talk about IMF and delivering to these entities like Netflix and Amazon who need 4K masters. On the source side, in Media Composer, we'll take whatever you've got. But what's the ultimate deliverable? And how do we make that part easier? Maybe IMF is the path forward. That's where the real discussion is shifting — delivery.