Studying Andy Serkis, Leveraging In-House Tools, and Taking Performance Capture on Location

And the Oscar nominees are … 20 visual effects, special effects and animation supervisors who represent the thousands of artists who worked on five live-action feature films: Captain America: The Winter Soldier, Dawn of the Planet of the Apes, Guardians of the Galaxy, Interstellar, and X-Men: Days of Future Past.

We start our coverage with Apes, which picked up three Visual Effects Society Awards last week in the categories of Photoreal/Live Action Feature: Outstanding Visual Effects, Outstanding Performance of an Animated Character, and Outstanding Compositing. The three wins put Apes in pole position for the Oscar race. Weta Digital created the visual effects.

The film takes place in an apocalyptic future some years after Rise of the Planet of the Apes, which also received a visual-effects Oscar nomination. Caesar now leads a growing nation of genetically evolved apes who, during the film, are threatened by a band of humans who survived the plague. The apes are CG-rendered, writes The Atlantic’s Christopher Orr, “with such nuance and sophistication that it is easy to forget they are constructed out of pixels.” 

“We witness the apes conversing with a combination of sign language, grunts, and occasional English words,” Orr writes. “We’re treated to a hunt with primitive spears, to the birth of a chimplet, and to a near-death by bear mauling. Mostly we just watch in awe as the apes lope majestically among the trees of Marin County. It’s actually a bit of a letdown when the human beings eventually show up.”

Matt Reeves directed the Twentieth Century Fox film. It received a 90% approval rating on Rotten Tomatoes and has earned $702 million worldwide.

Weta Digital senior visual effects supervisor Joe Letteri, visual effects supervisors Dan Lemmon and Erik Winquist, and animation supervisor Daniel Barrett received the Oscar nominations for Dawn of the Planet of the Apes. They led a team of approximately 850 people who created all the apes, the digital set extensions and environments, and the effects for the film. Lemmon was the main unit visual effects supervisor, Winquist handled pre-production and second-unit photography, Keith Miller supervised the climactic battle sequence, and senior visual effects supervisor Letteri oversaw the entire project — as well as all projects at Weta Digital. We talked with Dan Lemmon, who was on the film from January 2013 until it wrapped in June 2014. This is Lemmon’s second Oscar nomination; he received his first for Rise of the Planet of the Apes.

StudioDaily: The 10 films up for an Oscar nomination at the bake-off this year gave you some stiff competition. Why do you think your peers voted for Apes?

Dan Lemmon: The thing about both Apes movies is that they are so character driven, especially this one. The advances we made in technology and artistry are less about explosions and spectacle and more about what we’ve done to allow characters and actors playing characters to connect with the audience at a level we’ve never been able to achieve before. What makes these films unique and special is how much acting our digital characters do and how well they integrate into the rest of the movie.

Tell us about the advances in technology.

We did a lot of things to take the performance-capture tools out of the dedicated capture stage and onto location. We made them more portable, and we streamlined the setup and calibration process. We did a lot of capture in the rainforests near Vancouver, and the trees were all protected, so we had to come up with ways to attach the cameras in difficult places without damaging the national forest. 

We also had to make everything more robust, more bulletproof, and — especially on this film — waterproof. We had rain and rain bars. The equipment had to be mobile. We went up to places on the sides of mountains and through knee-deep mud, places that were hard to get camera gear and even actors to, so we had to condense our big motion-capture setup and make it more portable.

Did you use faux cap [mo-cap using standard video cameras rather than motion-capture cameras]?

In a few cases. We tried to keep the helmets even if we went to faux cap for bodies. But the performance is king, so if the equipment was in the way, we lost the equipment. In cases where the apes make head contact or do stunts that would be dangerous with the boom mikes on helmets, we’d take them off and do faux cap.

Were the cameras new?

They’re all new, higher resolution. And cables are an enemy in certain places, especially on set, so one-third of the motion-capture cameras were wireless. Standard Deviation makes the cameras, the housings, and the sensors. We’ve worked with Babak Beheshti there since Avatar. We spec out our ideal and Standard Deviation puts together the components. We custom-fit the helmets for the actors’ heads. We changed the mounting technology for the helmet cameras, and the new cameras have higher resolution and contrast response and work better in low light.

Babak also makes our facial-recording packs, the Velcro-mounted computers that attach to the helmet cams and record actors, and marker control packs. The markers on the actors’ bodies have to pulse when the motion-capture camera shutter is open. We do the same thing with our witness cameras and keep them in sync with the motion-capture camera as well. He has packs that keep everything in lockstep. 

Are the markers used on the actors’ bodies new? 

As on the first film, the active markers are LEDs that flash bright enough to capture motion in sunlight. On the first film, it was bleeding-edge technology — we made it up as we went. We used speaker wire [and] gaffer tape, and they were fragile. We’d lose [data for] an arm or a leg during stunts. For this film, we developed new rubberized casings and hard plastic lenses so the actors could do what they wanted and the markers would stay on.

How did the new technology affect live-action production?

Having the whole system put together in a way that was relatively unobtrusive and quick to set up meant we could allow the actors playing the apes to be in the set working with the other actors playing the humans. They were in the immediacy of the scene. They could figure it out with the director, make suggestions, and try things out in the context of a live action film set.

Can you give me an example?

There is a scene where Koba confronts the two humans. Toby Kebbell [the actor who plays Koba] improvised that scene on the day. He wanted to play it like a bar fight where someone is friendly and then all of a sudden violence explodes out and we realize he had bad intentions all along. When they tried it on set, the way Toby picked up the gun and [how] the humans reacted happened all in the immediacy of the moment. Traditionally for CG characters on set you have a tennis ball for the eyeline and you don’t get those spontaneously realistic reactions.


Were actors captured on stages as well?

We did some on stages and some pickups. There was a scene where Maurice is teaching letters to the children chimpanzees and Caesar returns from a hunt. [Director] Matt Reeves wanted to have performances from real children. So we took Lidar data from the set and created an apple-box version of it in the performance capture volume. We had Karin Konoval, who plays Maurice, and Terry Notary, who plays Rocket [and other apes] and would also play a child ape. Then we brought in a bunch of kids and put them in little motion-capture suits. We had Matt Reeves' son and Terry’s daughters, Willow and Sky. Willow and Sky had trained with arm extensions and they were really good juvenile chimpanzees. We used their data all over the place.

Did the studio develop new technology to help bring CG characters alive?

We have a great new real-time lighting preview tool called Gazebo for blocking, lighting and reviewing rough shapes. It plugs into Manuka and RenderMan and gives us a pretty good picture of what a final scene will look like. Manuka is our new path-tracing renderer. We used it mostly for the wide shots and big crowd scenes with all those furry apes. It manages complexity really well. 

We also had a number of advances in tissue simulation. In Rise, our tissue simulation was still in its infancy. We had ideas about running skin and muscles on top of animation to allow muscles to wrinkle and unwrinkle and it helped. But we had issues with stability and it took a lot of work to get a solution.

So we overhauled the way our tissue simulations are set up and the way they add to the animation without confusing the animation. And we pushed it further. Maurice has a big wattle of skin below his neck and the sims were better for that.

We also made big advances in our hair-grooming technology. We completely rewrote our software to give us better control over hair styling. The hair can be wet and damp in some scenes, and clumpy. We had about a dozen different hair grooms for Caesar. All those different fur versions were possible because of the speed and ease of Barbershop.


Caesar is portrayed by Andy Serkis (on horseback in performance-capture suit). Photo by David James; courtesy Twentieth Century Fox Film Corporation

Hair, muscle tissue, new camera technology, new renderers — a lot more goes into creating Caesar than capturing dots of data from Andy Serkis. Tell us about the animation artistry that adds to Serkis’s performance. 

There’s a bit of confusion about how much the technology does and how much the animators do. The really important thing is that Andy was the author of the performance. He was on set making the decisions. The hardest part of our job is to study what he did, to figure out what he does with his face and body that creates an emotional reaction in the observer, and how to get that onto the digital character.

The computer gives us great data that tells us when things move. We have all these dots on his face [and] the camera on his helmet, the witness camera that tells us when and what parts of his face move. But dialing in that expression to carry the emotion requires a lot of human attention. 

The faces are so different. If we put Andy’s lips on Caesar, Caesar wouldn’t look like a chimpanzee. Our artists have gotten better over the years studying how to translate Andy’s face onto Caesar’s face. What to keep and what to leave out to change the emotional beat.

And there are other things, as well. We don’t capture fingers or toes. We don’t capture the tongue, which is really important, and details for the eyes.

You don’t capture the eye movement?

One thing that’s easy to track is eyeball movement. The face cameras can see in great detail how the eyes move, how they track the scene, all the little vibrations and adjustments. But capturing the movement of the skin around the eyes and the eyelids is tricky and that’s almost more important in getting the expression right than getting the eyeball movement. It’s hard to track directly. It requires animator attention. The dots give the timing and the way movements overlap and those curves, and that raw data is useful. But getting the right shape requires human attention.

Also, when Andy gets emotional, his eyes start to turn red, he starts to tear up, and sometimes he sheds tears. That’s something we have to observe and add by hand. He can get spitty when he gets excited as well, which we add.

What did working on this film teach you?

I’m always learning new things about faces and performances. Toby was amazing in terms of what he brought to Koba. Every time there is a new scene and emotional situation, you learn things about acting choices that contribute to the audience’s perception of a character. That’s the stuff we find hard to get onto digital characters. So it’s a constant education.

I also spent a lot of time with chimpanzees, gorillas [and] orangutans, learning what makes them so interesting and captivating to watch. There really isn’t a lot that separates us from them. 

Rain was new for us in this movie. So we studied apes in the wild to see how water interacts with their fur to make our apes as realistic as possible.


You mentioned shooting in Vancouver rainforests. What environments are digital?

Half the movie is set in San Francisco, but we filmed only a handful of shots there. And for those, we replaced most of what we shot because the city looked too pristine. We added vines and degradation and rebuilt everything so it looked 15 years into an apocalypse.  

We shot most of the scenes on a blue-screen set in New Orleans where we had two blocks square — an intersection plus a block in each direction. But it is amazing and slightly sad that a lot of the buildings were already abandoned and in various states of degradation. So we added vines and they were ready to go.

We built a giant ape village environment and surrounded it with movable blue screens in an abandoned Six Flags amusement park. It wasn’t rebuilt after Katrina, so it has become a film set. 

And forest environments in British Columbia, which stand in for Northern California and Muir Woods, had their own challenges. Rain is one of the trickiest things, and getting the interaction between the apes and the ground cover was a challenge. We tried to tie down or avoid as much of the bushy and grassy stuff on set as we could so we could add it back in later digitally.

What’s the best thing about receiving an Oscar nomination?

The greatest thing is to work on projects that bring characters and environments to life and take people places they otherwise couldn’t go. Being able to do that as part of my job, and be recognized by my peers, is the coolest thing. I love my job.

What are you working on next?

I’m working on an unannounced project. I’ll be doing more monkeys.