Faceware Technologies said its new Faceware Live plug-in would allow developers to instantly apply facial motion-capture data to characters in the Unreal Engine from Epic Games.
A performer's facial movements can be captured with a webcam, a dedicated facial-capture camera system, or another camera; the movement data is then streamed through Faceware Live into the Unreal Engine. The plug-in leverages the Unreal Engine's Animation Blueprint visual scripting system to drive facial animation in real time.

Faceware Live was co-developed with Opaque Multimedia of Melbourne, Australia, which also developed Microsoft Kinect integration for Unreal Engine 4. Faceware said the plug-in can be used to generate facial animation for live events, or to quickly produce animation for previs.

"Faceware Technologies has a long history of creating some of the most iconic and realistic faces in games and films, while Epic's Unreal Engine is know for helping create some of the best-selling games ever," said Faceware VP of Business Development Peter Busch in a prepared statement. "Integrating our real-time technology with their premier game engine was just a natural fit."

The first projects using the plug-in are expected to appear later this year, Faceware said.