The Film's 'Trash Mesa,' Created in Montreal, Features Crowds of Scavengers Animated in London and Streamed via Vicon Shogun and Unreal Engine

A crowd-filled Blade Runner 2049 action scene posed an interesting challenge for Framestore in Montreal. The VFX facility, which was making its way through 300 VFX shots for the mammoth SF project, needed to generate mocap data for figures that would populate wide shots in the sequence known as the “Trash Mesa attack.” (The film’s protagonist, played by Ryan Gosling, crash-lands his spinner on a beach littered with the skeletons of wrecked ships and other detritus; a gang of well-equipped scavengers quickly descends upon him.) Framestore had a mocap stage ready to go, but it was on the other side of the Atlantic Ocean, in London. Undeterred, the team set up a 4×5-meter volume there with two mocap performers and 16 Vicon cameras. The data was captured in real time with Vicon’s flagship Shōgun software and streamed into Unreal Engine 4, and the resulting images were sent to Montreal for review over a dedicated transcontinental hook-up on Framestore’s network. We asked Richard Graham, studio manager for Framestore’s Capture Lab, about completing the job and what’s in store next for global virtual production workflows.


StudioDaily: Tell us about how and why the different teams working on the Trash Mesa attack sequence were geographically dispersed.

Richard Graham: The Framestore capture team and mocap stage are based in our London office, where our set-up has been for the past six years. The VFX team for Blade Runner was all in Montreal; Framestore’s VFX work on the film was done entirely at Framestore Montreal.

What specific challenges did this scene present?

From a mocap point of view, it was very standard crowd work for us, nothing out of the ordinary. Shōgun gives us a great real-time solve that we stream into Unreal Engine 4 [UE4] using a custom plug-in. It was only the second time we had used UE4 for our real-time representation; the first was on Kingsman: The Golden Circle, about a month before. We took assets already in our film pipeline and quickly retextured them (mostly procedurally) in Substance Painter, since UDIMs don’t work in UE4 and the textures wouldn’t carry over. All the mocap was solved for post with more custom pipelining, using Shōgun Post, [Autodesk] MotionBuilder and IKinema Action.
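
Framestore’s streaming plug-in is proprietary and Graham doesn’t describe its internals, so the following is only a rough sketch of the general pattern: package a solved skeleton on the capture machine and push it to a render client once per frame. The host address, port, frame rate, joint list and message format below are all hypothetical stand-ins, and the stub data source takes the place of whatever real-time solve API is actually used.

```python
# Hypothetical sketch only, not Framestore's plug-in. It shows the general
# shape of forwarding a solved skeleton to a render client once per frame.
import json
import socket
import time

RENDER_HOST = "127.0.0.1"   # stand-in for the machine running UE4
RENDER_PORT = 52000         # arbitrary example port
FRAME_RATE = 60.0           # assumed update rate for the review stream


def read_solved_frame():
    """Placeholder for the real-time solve source.

    In practice this would come from the capture software's streaming API;
    here it just returns a fixed joint name -> (position, quaternion) map.
    """
    return {
        "hips": ((0.0, 0.95, 0.0), (0.0, 0.0, 0.0, 1.0)),
        "spine": ((0.0, 1.10, 0.02), (0.0, 0.0, 0.0, 1.0)),
    }


def main():
    # UDP, so a dropped frame is simply skipped rather than stalling the stream.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    frame = 0
    while True:
        joints = read_solved_frame()
        packet = json.dumps({
            "frame": frame,
            "timestamp": time.time(),
            "joints": {name: {"pos": pos, "rot": rot}
                       for name, (pos, rot) in joints.items()},
        }).encode("utf-8")
        sock.sendto(packet, (RENDER_HOST, RENDER_PORT))
        frame += 1
        time.sleep(1.0 / FRAME_RATE)  # crude pacing; a real plug-in syncs to the solver


if __name__ == "__main__":
    main()
```

On the receiving side, a plug-in inside the engine would deserialize each packet and drive the skeletal meshes. In the set-up Graham describes, that whole loop ran locally in London; only the rendered picture travelled to Montreal.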

What gave you the confidence to say it would be possible to do intercontinental motion capture?

We already stream pictures between suites at different sites every day for reviews on multi-site projects, so we simply hooked into that set-up by running a cable into our mocap stage.

So how did the different facilities work together in near real time?

In Montreal they viewed our UE4 scene in real time on a 20-foot presentation-suite screen, and we used Google Hangouts to see and speak with each other. Direction was given verbally to the performers in London from the comfort of a leather armchair in Montreal.

Were there any critical hiccups along the way, or did any specialized solutions need to be created to hook up Montreal with London’s mocap performance data?

We did not stream any mocap data, just the pictures from UE4. There were no hiccups, since we use dedicated bandwidth between sites and a hardware coder-decoder to keep the pictures high quality and low latency. We shot for 3.5 hours without any issues other than the usual dropouts and glitches on the video-conferencing side.
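
To put the “pictures, not data” decision in rough perspective, here is a back-of-the-envelope comparison of the two things that could have crossed the Atlantic. Every figure in it is an assumption chosen for illustration, not a number from the interview.

```python
# Back-of-the-envelope bandwidth comparison. All figures are illustrative
# assumptions, not production numbers from the shoot described above.

# Option A: stream the solved skeleton data itself.
performers = 2            # two performers were in the volume
joints_per_skeleton = 60  # assumed joint count
floats_per_joint = 7      # position xyz + rotation quaternion
bytes_per_float = 4       # 32-bit floats
solve_rate_hz = 120       # assumed real-time solve rate

skeleton_bits_per_sec = (performers * joints_per_skeleton * floats_per_joint
                         * bytes_per_float * 8 * solve_rate_hz)
print(f"skeleton stream : ~{skeleton_bits_per_sec / 1e6:.1f} Mbit/s")

# Option B: stream the encoded picture coming out of the engine.
video_mbit_per_sec = 20   # assumed hardware-codec rate for clean 1080p
print(f"encoded picture : ~{video_mbit_per_sec} Mbit/s")
```

The picture costs more bandwidth than the raw skeleton, but over a dedicated link it is a fixed, predictable load, and the reviewers in Montreal see exactly what the engine renders in London, with nothing left to reassemble or re-render on their end.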

What sorts of changes are in store for production and post as this kind of virtual production continues to expand?

We’ve already done this same process on an upcoming project and we’ll be doing it again soon. Time-zone differences aside, it opens us up to potential work from any location on the globe. It should be possible, in the next six months, for us to do virtual camera, mocap performance and facial performance all at once using this same technique. In terms of making linear entertainment, it would be no different from trying to direct remotely: probably not what you want to do for a whole film, but certainly feasible if geography or location is your constraint.

Blade Runner 2049 is available now on digital streaming platforms and will be released on DVD and Blu-ray January 16.