Ncam, the maker of technology for tracking a freely moving camera in real time, plans to demo photorealistic compositing techniques at NAB. By tapping Epic Games' Unreal Engine to help create photorealistic previsualizations using camera-tracking data on set, Ncam says its new technology can help directors and DPs shoot VFX-heavy scenes more quickly and confidently.

"It is clear from discussions with customers, studios and productions, especially around episodic television, that there is demand in augmented reality for set extensions, virtual environments, previsualization and finished visual effects," said Ncam CEO Nic Hatch in a prepared statement. "We are now combining our camera tracking with Ncam's new relighting and depth technology, delivering … photorealistic augmented reality."

With real-time photorealism incorporating views from multiple cameras, the company is aiming to increase post-production efficiency by allowing graphics elements to be added to a shoot in real time.

Another demo will show a relighting technology for modeling the light in a scene in a way that allows CG elements to appear to cast shadows on real objects in the scene. Ncam says the relighting system will respond to environmental light changes in real time. And Ncam's new support for depth data output is meant to allow broadcast presenters to seem to walk around and through virtual graphical environments, rather than just appear in front of them.

Ncam will have its own NAB booth for the first time this year (C10345), and its system will also be on display at Vizrt (SL2417), Orad (Avid booth SU902), and Brainstorm (SL4617).

Projects that have utilized Ncam technology include Avengers: Age of Ultron, Jupiter Ascending, Edge of Tomorrow, and ESPN's Monday Night Football and X Games. The London-based company recently opened a Los Angeles office headed by Vincent Maza, formerly of Aspera, Dolby and Avid.