Epic Games released a new version of its Unreal Engine today with new pro video workflow features: compatibility with Blackmagic Design I/O cards and improved support for professional media formats, including 10-bit video I/O, audio I/O, and 4K and UHD output via SDI.

Unreal will also support interlaced and PsF HD video, improving its compatibility with widely used broadcast formats.

Neither Unreal Engine’s new Blackmagic Media Player nor its existing AJA Media Player is installed automatically with Unreal Engine; both are available via the Marketplace tab in the Epic Games Launcher. The source code for both plugins has been published on GitHub so that developers can learn how to create new video I/O plugins for the platform.

Nearly ubiquitous in videogaming circles, Unreal Engine has rapidly been gaining ground in traditional broadcast workflows as well as in production and post situations (especially virtual cinematography and VFX previs) and live events, where the ability to render high-quality 3D environments and other content in real time is crucial.

[Screenshot: Unreal Engine’s new MediaProfile interface. Users can now create MediaProfiles on different computers to handle specific hardware and video format requirements. Image: Epic Games]
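Conceptually, a MediaProfile maps a machine’s local hardware to the video formats it should use. Here is a rough sketch of that idea in Python; the hostnames, device names, and fields are hypothetical, and Unreal’s actual MediaProfiles are assets configured in the editor, not code.

```python
# Hypothetical sketch of what a per-machine media profile captures. Unreal's
# real MediaProfiles are editor-configured assets; this only illustrates the
# mapping from local hardware to video format settings.

import json
import socket

PROFILES = {
    "render-node-01": {"device": "AJA Kona 4", "format": "1080i59.94",
                       "bit_depth": 10, "sdi_outputs": 2},
    "render-node-02": {"device": "Blackmagic DeckLink 8K Pro",
                       "format": "2160p50", "bit_depth": 10, "sdi_outputs": 4},
}

def load_profile() -> dict:
    """Pick the media profile matching this machine's hostname."""
    host = socket.gethostname()
    # Fall back to a safe software-only profile on unknown machines.
    return PROFILES.get(host, {"device": None, "format": "1080p30",
                               "bit_depth": 8, "sdi_outputs": 0})

print(json.dumps(load_profile(), indent=2))
```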

“We’re giving the engine a very robust toolset for video I/O and processing,” said Unreal Enterprise GM Marc Petit during a press briefing highlighting the new features. “A real-time renderer that can understand video streams becomes a real-time compositor, and we’re starting to see people in the event space use the engine to abstract their content from the display.” As an example, he said, a concert tour could dynamically generate video streams depending on the configuration of projectors in use at a given arena, rather than pre-rendering content to fit each venue’s needs.

And Petit stressed the importance of Unreal’s rendering system, which allows massive scenes to be rendered at wildly varying resolutions, by invoking the runaway hit videogame Fortnite. Fortnite is a “battle royale” game where upwards of 100 players participate in a shootout on an isolated island. The island is seen in its entirety at the beginning of the game, from high in the air; players parachute from a “battle bus” and watch the ground get closer and closer until they’re running through the grass. But there are no “cuts” to reload different versions of the island as players get closer; the massive model that’s seen in the distant overview is the same finely detailed world that players interact with during close-up, over-the-shoulder gameplay.

“The rendering system distances you from resolution,” he explained. “A lot of people want to do nice interactive content on stage, and we provide this seamless architecture where we can render the same scene on your phone. We can take it from super-low resolution on a phone to 24K resolution at Madison Square Garden — the same scene. We’re bringing the magic of the moment back to the stage with real-time tools and big displays.”
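To make the resolution-independence idea concrete, here is a minimal sketch (not Unreal’s actual renderer) of how a level-of-detail choice can be driven by camera distance and output resolution, so one scene description serves everything from a phone to an arena wall. The thresholds and `Mesh` type are invented for the demo.

```python
# Toy illustration of resolution-independent level-of-detail (LOD) selection.
# Hypothetical: Unreal's real renderer is far more sophisticated; this only
# shows the idea that one scene can feed displays of wildly different sizes.

from dataclasses import dataclass

@dataclass
class Mesh:
    name: str
    triangle_counts: list[int]  # available LODs, most detailed first

def pick_lod(mesh: Mesh, distance: float, screen_height_px: int) -> int:
    """Choose an LOD index from camera distance and output resolution."""
    # Rough "pixels covered" estimate; the constants are arbitrary demo values.
    pixels = screen_height_px / max(distance, 1.0)
    if pixels > 200:
        return 0                                  # full detail for close-ups
    if pixels > 20:
        return min(1, len(mesh.triangle_counts) - 1)
    return len(mesh.triangle_counts) - 1          # coarsest LOD for overviews

island = Mesh("island", [5_000_000, 500_000, 50_000])

# Same scene, three displays: phone, UHD monitor, arena wall.
for height in (720, 2160, 24_000):
    for dist in (10.0, 1_000.0):
        lod = pick_lod(island, dist, height)
        print(f"{height}px output, distance {dist}: "
              f"LOD {lod} ({island.triangle_counts[lod]:,} triangles)")
```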

To speed up workflow in businesses that repurpose highly detailed CAD models, Unreal Engine 4.21 will have new “auto-defeaturing” tools that ease the process of reducing the size of an object’s geometry by eliminating elements that are included in the original model but that won’t be visible in a given application. “The concept of the digital twin is rising in the industry,” Petit explained. “The truth is the CAD model. We want to fully automate the process of generating a digital twin — a much lighter-weight version of the geometry that looks and behaves the same as the real object.”
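As a rough illustration of what defeaturing does, the following sketch filters out geometry that can’t be seen, either because it is enclosed by other parts or because it falls below a visibility threshold. The `Feature` model and numbers are hypothetical, not Unreal’s actual implementation.

```python
# Toy "defeaturing" pass: discard CAD features that won't be visible in the
# target application. Hypothetical data model; real defeaturing tools operate
# on actual B-rep or mesh geometry.

from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    triangles: int
    extent_mm: float   # largest dimension of the feature
    interior: bool     # fully enclosed by other geometry?

def defeature(features: list[Feature], min_visible_mm: float) -> list[Feature]:
    """Drop features that are fully hidden or below the visibility cutoff."""
    return [f for f in features
            if not f.interior and f.extent_mm >= min_visible_mm]

model = [
    Feature("body shell", 1_200_000, 4500.0, interior=False),
    Feature("engine block", 800_000, 600.0, interior=True),   # never seen
    Feature("badge screw", 2_000, 4.0, interior=False),       # sub-pixel
]

slim = defeature(model, min_visible_mm=10.0)
before = sum(f.triangles for f in model)
after = sum(f.triangles for f in slim)
print(f"{before:,} -> {after:,} triangles ({after / before:.1%} of original)")
```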

V4.20 of Unreal added support for MDL, the Nvidia-created Material Definition Language, and now Unreal Engine is getting improved support for the Universal Scene Description (USD) standard for interchange between computer graphics applications. “We want to integrate into existing pipelines, and USD has become one of our key investments to actually make this happen,” Petit said. “Between USD and MDL, we have two great ways to communicate with traditional offline solutions.”
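For readers unfamiliar with USD, a minimal round trip with Pixar’s official Python bindings (installable as `usd-core`) looks like this; it demonstrates the interchange format itself, not Unreal Engine’s USD importer.

```python
# Minimal USD round trip with Pixar's open-source Python bindings
# (pip install usd-core). This shows the interchange format, not
# Unreal Engine's own USD import/export path.

from pxr import Usd, UsdGeom

# Write a trivial stage: one transform with a cube under it.
stage = Usd.Stage.CreateNew("demo.usda")
UsdGeom.Xform.Define(stage, "/World")
cube = UsdGeom.Cube.Define(stage, "/World/Box")
cube.GetSizeAttr().Set(2.0)
stage.GetRootLayer().Save()

# Read it back, as a downstream DCC application would.
stage = Usd.Stage.Open("demo.usda")
for prim in stage.Traverse():
    print(prim.GetPath(), prim.GetTypeName())
```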

Also important in this version of Unreal Engine are “pixel-streaming” plug-ins that allow geometry to be processed on powerful cloud-based systems, then streamed to lower-powered devices for viewing. “You run your super-sophisticated application on the cloud, and you actually stream the result to a device,” Petit said. “If you have enough bandwidth, you can actually pixel-stream your app in pixel format. If you can do Netflix at 4K, you can stream in 4K, and you just send the mouse information for proper interaction. We think this can considerably ease the distribution of interactive content. You can load a gigantic stadium and have nice performance on your iPad.”
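The basic shape of that loop can be sketched with nothing but the Python standard library: a “server” renders frames and streams the pixels while the “client” sends input events back. This toy protocol is ours for illustration only; Unreal’s actual pixel-streaming plug-ins rely on real video encoding and streaming infrastructure such as WebRTC.

```python
# Toy pixel-streaming loop: the "server" renders frames and streams raw
# pixels; the "client" receives them and sends mouse input back. A made-up
# minimal protocol for illustration, not Unreal's Pixel Streaming plug-in.

import socket
import struct
import threading

WIDTH, HEIGHT = 64, 36  # tiny frames keep the demo instant

def recv_exact(conn: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed early")
        buf += chunk
    return buf

def render(frame_no: int, mouse_x: int) -> bytes:
    """Stand-in renderer: one grayscale byte per pixel."""
    return bytes((x + frame_no + mouse_x) % 256
                 for _ in range(HEIGHT) for x in range(WIDTH))

def server(conn: socket.socket, frames: int) -> None:
    mouse_x = 0
    for n in range(frames):
        pixels = render(n, mouse_x)
        conn.sendall(struct.pack("!I", len(pixels)) + pixels)
        mouse_x, = struct.unpack("!I", recv_exact(conn, 4))  # input comes back
    conn.close()

def client(conn: socket.socket, frames: int) -> None:
    for n in range(frames):
        size, = struct.unpack("!I", recv_exact(conn, 4))
        frame = recv_exact(conn, size)
        print(f"frame {n}: {size} bytes, first pixel {frame[0]}")
        conn.sendall(struct.pack("!I", n * 3))  # pretend the mouse moved
    conn.close()

srv, cli = socket.socketpair()
t = threading.Thread(target=server, args=(srv, 5))
t.start()
client(cli, 5)
t.join()
```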

Petit said big manufacturing companies are already using the technology to stream complex interactive content to iPads for reference on the factory floor. And the next-generation promise is that personalized advertising content could be generated in the cloud and pixel-streamed to individual consumers.

“Everybody’s watching the cost of GPUs in the cloud,” he said. “We are all very confident the cost of graphics in the cloud will come down and interactive content will stream to consumers. Maybe not in 2019 — but certainly people were eager to have that technology integrated into Unreal Engine.”

For a more complete list of new Unreal Engine features, see the official website.

Unreal Engine: www.unrealengine.com