On That SIGGRAPH Award-Winning Demo of Real-Time Virtual Characters and What It Means for Film and TV

U.K. game developer Ninja Theory is launching Senua Studio, offering services for creating virtual character performances in real time. The technology was demonstrated as "real-time cinematography" at SIGGRAPH's Real-Time Live! presentation, where an actor was performance-captured and rendered on stage in front of a live audience. The demo, a collaboration between Ninja Theory, Epic Games, Cubic Motion, and 3Lateral titled "From Previs to Final in Five Minutes," ended up taking the award for Best Real-Time Graphics and Interactivity. Watch the demo, below, then read our interview with Ninja Theory co-founder Tameem Antoniades.

StudioDaily: What technological developments made the SIGGRAPH award-winning real-time cinematography demonstration possible? Recent advances in software, GPU hardware, or a combination?

Tameem Antoniades: There were several advancements by various partner companies that made it possible. We took our actress, Melina Juergens, to 3Lateral in Serbia to create a digital double of her using hundreds of facial scans. From that, they created one of the most advanced real-time facial rigs in the business. Cubic Motion, based in Manchester, U.K., created the computer vision software that could solve Melina's facial performance in real time and drive the facial rig. Epic, makers of Unreal Engine, pushed forward the complex rendering shaders needed to represent realistic skin, hair, eyes, wrinkle maps and lighting. Their new Sequencer tool was crucial in that it allowed all of the data, including face, voice and body, to be captured in real time. There were also others involved, such as Technoprops, Vicon, House of Moves, IKinema, and Nvidia, at various stages. Our role was to provide direction and art for the piece, using Hellblade as a case study. In the end, it was a very deep collaboration over many weeks and months to get the whole thing working. I think it points to a future where no one company can easily perform all of these specialist functions and expect to get them working together. We now see our own R&D efforts as the creative glue that can help bind the whole.

Are there limits to the number of actors who could be captured performing in one scene at the same time?

The limits are mainly costs, as you would have to create several digital doubles, and each actor would require hardware for running the facial solvers. But in theory you can scale it up within reason.

Why is now the right time to start a division dedicated to real-time virtual characters, and what kind of clients do you see as the most likely early customers?

When you witness a live-driven character in person, it truly is a magical thing. I get goosebumps each time I see Melina driving Senua and responding to me and others. The last time I felt something like this was when I saw Pixar's "Red's Dream" and then Toy Story. As far as I know, no one else has got this working to such a level, and I think it's important technology that we apply well. I think it opens up the idea of doing live CG theater and music performances featuring giant holographic avatars that, unlike Gorillaz or Hatsune Miku, can interact with an audience. I also think it would suit a visionary director of CG movies who wants to see everything live on set without having to "fix it in post." For TV, animated series can be shot in real time in record time instead of rendering things out over weeks and months. And VR is where my head is spinning at the possibilities.

Will Senua Studio offer creative services as well, or just technology and support?

We are a creative studio, and that is our main function. We've been making high-end AAA games for 16 years, so we can bring our art, storytelling, production and technical flair to make beautiful things happen in what is still a technically complex endeavour. We're looking for ambitious partners who want to create the extraordinary and break new ground in entertainment.

Do you anticipate that advances in graphics technology will accelerate creative cross-pollination between the videogame industry and film, TV, and advertising?

The industries are fairly entrenched at this stage and collaboration is likely to be limited, but fruitful where it happens. The one area where everyone will be forced to collaborate will be VR, as real-time technology, not video, is absolutely where VR will succeed, and digital humans are going to be at the heart of it.