How the Film's Machine-Learning Techniques Pave the Way for More Subtly Expressive Digital Characters

The entire squad of insanely popular Avengers and their superhero allies sacrifice everything as they try to defeat the ultimate villain Thanos before he can turn half of all life in the universe to dust. Disney distributed the huge Marvel production, which combined wall-to-wall action with Marvel’s trademark humor, ended with three deadly finales, and left audiences waiting for part two.

Directed by Anthony and Joe Russo, who had previously directed Marvel’s Captain America: The Winter Soldier and Captain America: Civil War, the film stars an ensemble cast of familiar Marvel film actors: Robert Downey Jr. (Tony Stark/Iron Man), Chris Hemsworth (Thor), Mark Ruffalo (Bruce Banner/Hulk), Chris Evans (Steve Rogers/Captain America), Scarlett Johansson (Natasha Romanoff/Black Widow), Don Cheadle (James Rhodes/War Machine), Benedict Cumberbatch (Doctor Strange), Tom Holland (Peter Parker/Spider-Man), Chadwick Boseman (T’Challa/Black Panther), and many more. Bradley Cooper again voices Rocket and Vin Diesel voices Groot. Josh Brolin voices the villain Thanos.

IMDb reports that the film had a budget of $321 million and earned more than $2 billion at the worldwide box office. The review aggregator Rotten Tomatoes gives the film an 85 percent approval rating.

Avengers: Infinity War has received Oscar, BAFTA, and VES nominations for best visual effects. In addition, it has an Annie nomination for character animation in a live action production and four additional VES nominations, for model (Nidavellir Forge megastructure), compositing (Titan), animated character (Thanos), and effects simulations (Titan and Wakanda).

A total of 14 visual effects studios contributed to this film. Daniel Sudick (special effects supervisor), Dan DeLeeuw (visual effects supervisor), Kelly Port (visual effects supervisor – Digital Domain), and Russell Earl (visual effects supervisor – ILM) received the Oscar and BAFTA nominations for best visual effects. All four have signed on to the next film, Avengers: Endgame, which the Russo brothers are also directing. DeLeeuw, Port, Sudick, Jen Underdahl (VFX producer), and Matt Aitken (VFX supervisor – Weta Digital) shared the VES Award for outstanding visual effects in a photoreal feature.

We spoke with Dan DeLeeuw, the overall supervisor for Avengers: Infinity War. In addition to his nominations for this film, DeLeeuw received an Oscar nomination for best visual effects in Captain America: Civil War.

We’ve interviewed all five Oscar-nominated VFX supervisors. Click here to read the rest of the Q&As.

Thanos on screen and (inset) Josh Brolin in performance capture
Marvel Studios/Disney

StudioDaily: Why do you think your peers voted to give Oscar nominations for best visual effects to Avengers: Infinity War?

Dan DeLeeuw: I think Thanos was a big part of that. The goal when we started was to create the next great movie villain, and that was something we accomplished. We were able to capture so much of Josh Brolin’s amazing performance and put that in the film. Also, we have three big finales: Thor, Iron Man, and Cap have different stories at the end. So the scope, scale, density, and complexity set this film apart.

Why was Thanos such a successful CG character?

In a strange way, there’s a humanity to him. That’s something Josh brought to him. Thanos does horrible, unspeakable things, but he’s still charming, and the digital version is somewhat handsome. So the audience is complicit in his thinking until he crosses the line by murdering his daughter. In the beginning, you’re having fun with him, thinking the heroes will win; then he robs you of that and, at the very end, of all your favorite heroes.

Chris Pratt as Peter Quill/Star Lord
Disney/Marvel Studios

How did Thanos evolve?

In previous films, Thanos was more alien. We thought of him as something of an archvillain type, broad and boastful. We went through an evolution, taking him from a menacing villain to seeing things from his view: a father and daughter set against a galactic tapestry. That required a character who could convey emotion. We went into it with Digital Domain and Weta Digital working simultaneously. [Head of Visual Development] Ryan Meinerding did look dev so both studios could stay on model. We modified the Thanos model to be more like Josh [Brolin]; you can recognize his eyes and the shape of his mouth in the design. When we retargeted Josh’s performance, the data from the helmet camera transferred more quickly and effectively to the model.
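
To make the retargeting idea concrete: in a typical blendshape-style pipeline, expression weights solved from the actor’s tracked performance drive corresponding shapes on the character’s rig, which is why a Thanos model built around Brolin’s features transfers more cleanly. A minimal sketch in that spirit — it is not Marvel’s or Digital Domain’s actual toolchain, and all names and array shapes are invented:

```python
# Hypothetical blendshape-style retargeting: weights solved on the actor's
# rig are reapplied to the character's corresponding shapes. The closer the
# character's shapes match the actor's (as with Thanos redesigned around
# Brolin's eyes and mouth), the less cleanup the transfer needs.
import numpy as np

def retarget(actor_weights: np.ndarray,
             char_neutral: np.ndarray,
             char_basis: np.ndarray) -> np.ndarray:
    """Drive the character mesh with the actor's solved expression weights.

    actor_weights: (n_shapes,) weights solved from helmet-camera data
    char_neutral:  (n_vertices * 3,) the character's neutral pose
    char_basis:    (n_vertices * 3, n_shapes) per-shape vertex offsets
    """
    return char_neutral + char_basis @ actor_weights
```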

We did a motion-capture test early on with Josh to explore the character. Part of doing that test was seeing how subtle a performance we could get. We wanted Josh to say a particular set of lines, but we let the motion capture keep running and ended up in a subtle, introspective place we hadn’t anticipated. Digital Domain took those lines and did a test that knocked everyone’s socks off.

Thanos on screen (left) and Josh Brolin in performance capture
Disney/Marvel Studios

Was any part of the process of creating Thanos new and inventive?

The key thing about Thanos, the thing that elevated the character, was something Digital Domain had been working on with machine learning. In the facial capture process, Josh performs a scene with dots on his face that track the movement of facial features and regions. The computer recognizes those dots and uses them to figure out what the face is doing. Because there are a limited number of dots, we usually have to interpret the data and apply that interpretation to a character like Thanos. On this film, we went beyond what that number of dots could give us.
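
A rough illustration of the dot-based solve he describes: the tracked markers constrain a face model, and a solver infers the expression. The sketch below is a generic least-squares blendshape fit with placeholder data and invented sizes, not the studios’ proprietary solvers:

```python
# Generic marker solve: given where the dots are on this frame, find the
# expression weights that best explain them. All data here is placeholder.
import numpy as np

rng = np.random.default_rng(0)
N_MARKERS = 150   # helmet cameras track on the order of a hundred dots
N_SHAPES = 60     # hypothetical number of expression shapes in the rig

neutral = rng.random(N_MARKERS * 3)             # marker positions at rest
basis = rng.random((N_MARKERS * 3, N_SHAPES))   # each shape's marker offsets

def solve_expression(observed: np.ndarray) -> np.ndarray:
    """Least-squares fit: weights w such that neutral + basis @ w ~ observed."""
    w, *_ = np.linalg.lstsq(basis, observed - neutral, rcond=None)
    return np.clip(w, 0.0, 1.0)   # blendshape weights usually live in [0, 1]
```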

How did you increase the resolution of the facial capture?

Subtle movements aren’t captured with that lower number of dots, but we can now do high-resolution scans of actors’ faces down to the pore level. So we used machine learning to teach the computer to recognize Josh and his performance and to apply the data from the dots at pore-level resolution. We were able to get amazingly detailed facial movement, with awesome skin compression and expansion that brought the face to life.
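
In spirit, the machine-learning step learns a mapping from the sparse dots to dense, pore-level mesh motion, trained on paired capture data of the actor. Digital Domain’s actual model and training data are proprietary; in the sketch below, plain ridge regression stands in for the learned model:

```python
# "Dots to pores" as a supervised learning problem: train on frames where
# both the sparse markers and a matching pore-level scan of the actor are
# available, then predict dense displacements from the markers alone.
import numpy as np

def train_upsampler(markers: np.ndarray, dense: np.ndarray,
                    lam: float = 1e-3) -> np.ndarray:
    """markers: (n_frames, n_markers * 3) marker data from capture sessions.
    dense:   (n_frames, n_vertices * 3) matching high-res scan displacements.
    Returns W such that markers @ W approximates dense."""
    X, Y = markers, dense
    # closed-form ridge regression: W = (X^T X + lam * I)^(-1) X^T Y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

def upsample(W: np.ndarray, marker_frame: np.ndarray) -> np.ndarray:
    """Predict pore-level mesh displacements for one frame of tracked dots."""
    return marker_frame @ W
```

The real system is far more sophisticated than a linear map, but the input/output contract (a few hundred marker coordinates in, tens of thousands of vertex offsets out) is the essence of the resolution gain.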

Did you test the result on a model of Josh Brolin’s face?

Digital Domain went straight to Thanos. Weta’s process is subtly different; they go to Josh’s face first.

Did Josh’s matching digital facial performance land inside or outside the uncanny valley?

We didn’t push too hard, but I think this is the tool that will allow us to get there in the next couple of years. But again, the character’s performance is always driven from the actor’s performance. Without that performance, no amount of science and technology will get us out of the uncanny valley.

What about Thanos’ CG henchmen?

For his Black Order, we did the typical process of casting actors, motion-capturing, and recording facial movement captured with a helmet camera. Those characters were shared by many vendors.

Spider-Man was among the familiar superhero characters featured in Avengers: Infinity War
Disney/Marvel Studios

Were you the keeper of continuity?

A big part of my day was looking at shots from the visual effects houses. Having characters shared among the houses adds an extra layer of complexity, but it also means we’re working in an environment with a creative group of people. Each studio innovated and created something special. We’d send those shots to the other studios.

How did you share the shots?

If a character was still sculptural, we’d send geometry and the houses would work from that. Once a character is texture- and paint-based, at some point after another studio inherits it, it will diverge in terms of what’s needed; maybe the character is now in a night scene, for example. So early on we share textural quality and shape. Later we send playblasts of animation or actual shots. Then we maintain continuity as best we can. And we’re smart about how we cast the scenes. We might not see the character for a while, so if it’s a few percent off, no one might notice.
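
One hypothetical way to quantify that “few percent off” tolerance when vendors’ versions of a shared, same-topology character drift apart (the function and threshold below are invented for illustration):

```python
# Invented continuity check: measure how far two vendors' versions of the
# same-topology character mesh have drifted, as a fraction of model size.
import numpy as np

def divergence(verts_a: np.ndarray, verts_b: np.ndarray) -> float:
    """Mean per-vertex drift between two (n_vertices, 3) meshes,
    normalized by the first mesh's bounding-box size."""
    scale = np.ptp(verts_a, axis=0).max()                    # bounding-box extent
    drift = np.linalg.norm(verts_a - verts_b, axis=1).mean()
    return drift / scale

# e.g., flag a shared asset for review only past a few percent of drift:
# if divergence(vendor_a_thanos, vendor_b_thanos) > 0.03:
#     flag_for_review()   # hypothetical pipeline hook
```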

Many of the CG characters have appeared in previous Marvel movies — Iron Man, Hulk, Groot, Rocket, and so forth. Can you pull them from earlier films?

The great thing about Marvel is that every character, every set from every film is online. If you want to go back and look at the Avengers compound, you can load the different scene files. Then, you’re on your own to plus it as best you can as the technology changes and advances.

I’ve read that this was the most expensive movie ever made. Was it?

I don’t know, but I know it’s one of the biggest movies ever made. There are 2,623 shots with visual effects work. The movie takes place on alien planets and in Greenwich Village, not in Atlanta where we filmed, so each of those scenes required changing and adding everything in the backgrounds. There are only 80 shots that we didn’t touch.

Thanos (left) and Robert Downey Jr. as Iron Man
Disney/Marvel Studios

What was the hardest thing about supervising the visual effects for this film?

The size and scope. When we went into the film, Thanos was our main concern. We had to solve Thanos. If he didn’t work, the movie wouldn’t work. But the innovations solved that problem pretty early; we were confident and happy, and Thanos got more scenes and lines. After that, it was scale and scope. Any of our three finales could have been the finale of a single film. Managing that, along with 14 houses, and getting it all done on a short schedule was hard. But everyone jumped on board.

How short was the schedule?

Shooting finished in August and the movie came out in April, so about six months. We did start turning over some CG shots while we were shooting.

What was the best thing about working on this movie?

I’m a comic-book fan, so the best thing was getting to do some of the things we did with characters I’d read about as a kid. And seeing this movie with fans, and watching them react the way I react, was very special to me. You go into something and design a scene and hope it works, and then you see it with an audience: when Thor arrives in Wakanda with Rocket and Groot, the audience goes insane. Seeing the audience getting into it and believing the characters is the most rewarding thing.

What will you apply from this film to other films you work on?

I think where we took the digital characters on this film was great, and a lot of what we’ve done here will open up new ways to tell stories with digital characters. But every film prepares you for the next. I started as a visual effects supervisor at Marvel on Captain America: The Winter Soldier, then did Captain America: Civil War. Each film prepared me for the next; I could never have done the next one without that progress. It’s like going from high school to undergrad to a doctorate.

Your next film is Avengers: Endgame. What can you tell us about that?

It’s awesome. Beyond that, we’re keeping everything close to the chest. It’s like when your kids want to know what they’ll get for Christmas, but they’re so excited you don’t want to tell them.