Stargate Studios Founder Sam Nicholson, ASC, on the Film's Rigorous Combination of Real Puppets, Mocap and Motion Control Work

Stargate Studios founder Sam Nicholson, ASC, began his career by spearheading the kinetic lighting effect that brought the starship Enterprise’s warp engines to throbbing life for Star Trek: The Motion Picture. Towards the end of that same troubled shoot, when production was stymied by the lack of an approach to depict the film’s “living machine” antagonist, Nicholson and his associates worked up a multi-tiered lighting plan — involving conventional units plus nearly every one of the then-new HMI lights in Hollywood — on an immense scale that delivered a colorful language of light, creating a practical in-camera basis for the visual crescendo that serves as the feature’s climax.

Since that time, Nicholson’s career in visual effects has included pioneering virtual backlot set extensions, beginning with the little-seen Mafia! and featuring prominently in the realization of ABC’s period drama Pan Am. He founded Stargate Studios in 1989, and the facility has flourished in recent years, expanding from a single Los Angeles base to offices in Atlanta, Cologne, Dubai, London, Malta, Mexico City, Toronto and Vancouver. He has served as VFX supervisor on numerous TV projects, earning Emmy wins for Heroes and for Nightmares and Dreamscapes — the latter for the “Battleground” segment directed by Brian Henson, with whom he again collaborates on the filmmaker’s current feature, The Happytime Murders. This comedy employs the ingenious traditional puppetry seen in various Muppet projects, but augments it with digital work that quite literally gives these tough-as-felt characters a leg (or two) up when they are out on the mean streets of L.A.

Watch a gleefully over-the-top red-band trailer for The Happytime Murders — but don’t show the kids!

StudioDaily: It has been over a decade since you worked with Henson on “Battleground.” How different was it tackling The Happytime Murders?

Nicholson: On “Battleground,” all of our characters were supposed to be three inches tall, whereas these puppets are closer to three feet in height, so the scales and textures are much different and more difficult to pull off. Even understanding the way light falls on and is absorbed by felt, and replicating that, had its own learning curve. And, owing to how the times and technologies have changed, we’re using a lot more in the way of [digital] avatars. Plus, we’ve been associated with this project for a long while now, waiting with Brian for five years, testing and hoping he would get the green light for his unique vision.

Was there a clear delineation up front about what would be achieved in-camera and which elements were destined to be visual effects, or was that situation fluid as production progressed?

We wanted to do as much as possible practically, rather than rely on CG as the go-to solution. Brian is probably better-versed on the subject of puppets than anybody else on the planet: how they move and, perhaps more importantly, how their movement differs from that of humans. But good puppeteering is an enormous challenge in even the most ideal conditions, since there are sometimes as many as four or five puppeteers working on a single puppet. So as you can imagine, when the script reads “Exterior Santa Monica Beach — Day” and Brian tells you he wants puppets everywhere — digging through trash cans, riding scooters, lifting weights, shopping and singing — that really suggests an expansive approach, one going beyond what would be possible with strictly practical solutions, even if you had an army of puppeteers. So it made good sense to shoot those puppet performance elements against green-screen, but match-move them into the production’s moving-camera location plates. It was kind of tricky to reverse-engineer the moves to get our green-screen stuff to match the perspective on a 40-foot Technocrane shot; after all, it isn’t just the move, it is also the lighting. And figuring things out ahead of time fits right in with our philosophy: strive to fix things in prep rather than waiting till post.

So how does motion-control fit into this approach?

Joe Lewis of General Lift supplied our motion-control system, so we were able to use that to recreate and match the location crane move while scaling it so our puppets could appear to be 20 feet from camera or a block away. And we could generate 20 puppet elements per day with this approach, which would have been impossible with CG.
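
For illustration, here is a minimal sketch, not Stargate’s or General Lift’s actual pipeline, of the scaling idea Nicholson describes: the recorded crane move’s translations are shrunk by the ratio of the puppet element’s distance from camera to the subject’s distance in the original plate, while rotations and timing carry over, so the parallax on the green-screen element matches the location photography. The data layout, field names and distances below are illustrative assumptions.

```python
# Hypothetical sketch of scaling a recorded crane move for a green-screen
# element shot, so a puppet placed close to the motion-control rig tracks
# like a subject much farther away in the location plate.

from dataclasses import dataclass

@dataclass
class CameraSample:
    frame: int
    tx: float   # translation in feet
    ty: float
    tz: float
    pan: float  # rotation in degrees
    tilt: float
    roll: float

def scale_move(plate_move, plate_distance_ft, element_distance_ft):
    """Scale translations by the ratio of element distance to plate distance,
    keeping rotations and timing, so parallax matches at the new scale."""
    k = element_distance_ft / plate_distance_ft
    return [
        CameraSample(s.frame, s.tx * k, s.ty * k, s.tz * k, s.pan, s.tilt, s.roll)
        for s in plate_move
    ]

# Example: a subject meant to read as 200 ft away in the plate is shot as a
# puppet 10 ft from the rig, so the move shrinks to 5% of its original size.
plate = [CameraSample(f, f * 0.5, 4.0, 0.0, f * 0.1, -2.0, 0.0) for f in range(240)]
element = scale_move(plate, plate_distance_ft=200.0, element_distance_ft=10.0)
```

Running the same plate move at several scale factors is how one location pass can yield elements that read as 20 feet from camera or a block away.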

A scene from The Happytime Murders

For some shots of walking puppets, a practical puppet torso was combined with CG legs derived from mocap data.
Justin Lubin/© 2018 STX Financing, LLC. All Rights Reserved

But for certain shots, the characters had to be created via CG?

Sometimes there was a full CG character, derived from scanning with full texturing, as well as some that utilized photogrammetry. Other times we used a hybrid approach, combining a practical puppet torso with CG legs derived from mocap data. But keep in mind, while you can compare The Happytime Murders with Ted and other movies that rely on CG characters driven by humans in mocap suits, we wanted to retain that distinct real-puppet feel of the performance throughout. We found that it is incredibly difficult to take a mocap file and make it really look like a puppet, because you always see the human basis inside the capture. So we’d look at each scene and make a determination about whether puppets against green-screen would be an applicable solution, and if not, then consider the CG solution. Going back and forth between techniques is a very efficient way to do things, but it is also very effective creatively. You could never have done this film in a convincing way throughout by relying just on the computer approach. No matter how much firepower you throw at digital characters, I believe they lack something, a soul if you will, that keeps you from believing they are real. Whether it is a Muppet or Princess Leia — and that’s not a criticism, just an observation — there’s some kind of giveaway with digital that works against audience acceptance. That’s why I find the quality of Bill Barretta’s puppeteering performance to be so integral to that acceptance as he portrays Phil Philips, the PI character.

Can you elaborate on the hybrid approach?

We mocapped the torso of the puppets and the legs of the puppeteers, then combined them, which gave us the unmistakable puppet feel, but with a human relatability in the walking. It’s an optimal blend: traditional puppetry, but using mocap, motion control and CG to take it to a new dimension.

So you had to scale the full-sized human leg movement to the actual size of the puppet?

That was the neat trick. We’d remap the puppet torso capture and connect that to the human legs, then scale them appropriately. That would get you a walking puppet avatar. There’s a lot of technology behind it. We wanted to take a genuinely purist approach to seamlessly integrating real puppetry, avatars and green-screen puppets. During testing, we tried optical mocap, but ended up using inertial mocap on set with Xsens capture suits. There can be issues with the inertial system: proximity to metal can throw off the suits and cause drift. We reconstructed the capture data in [Autodesk] MotionBuilder in order to see how the suits combined into a single live avatar or character on set. When you’re breaking new ground, it opens up all kinds of possibilities for performance, but since you’re guaranteed 97% of the audience is going to be looking right into the eyes of these puppet characters, you can’t afford to blow that illusion by distracting from the puppeteer’s performance just to suit an easier way of doing effects. What’s fantastic about this is that you’ll come away from it thinking, ‘That’s a real puppet doing this.’
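
As a rough illustration of that remapping, and not the actual MotionBuilder setup, the sketch below merges a captured puppet torso with a puppeteer’s captured legs into one avatar frame, scaling the leg motion down to puppet proportions and re-rooting it under the puppet’s hips. The joint names, the 0.45 scale factor and the frame format are assumptions made for the example.

```python
# Hypothetical per-frame blend of two capture streams: torso joints from the
# puppet capture, leg joints from the puppeteer, with the legs scaled to
# puppet size so the walk reads as a puppet's rather than a human's.

TORSO_JOINTS = {"hips", "spine", "chest", "neck", "head", "l_arm", "r_arm"}
LEG_JOINTS = {"l_upleg", "l_knee", "l_foot", "r_upleg", "r_knee", "r_foot"}
LEG_SCALE = 0.45  # assumed human-to-puppet leg-length ratio

def blend_frame(torso_frame, legs_frame):
    """Combine one frame of each stream into a single skeleton: keep the
    puppet torso as-is, and attach the puppeteer's leg motion, offset
    relative to the human hips, scaled down, and re-rooted at the puppet hips."""
    out = {name: torso_frame[name] for name in TORSO_JOINTS}
    px, py, pz = torso_frame["hips"]   # puppet hip position (x, y, z)
    hx, hy, hz = legs_frame["hips"]    # human hip position (x, y, z)
    for name in LEG_JOINTS:
        x, y, z = legs_frame[name]
        out[name] = (px + (x - hx) * LEG_SCALE,
                     py + (y - hy) * LEG_SCALE,
                     pz + (z - hz) * LEG_SCALE)
    return out

# Example frame: a static torso pose plus one sampled leg pose from the suit.
torso = {name: (0.0, 90.0, 0.0) for name in TORSO_JOINTS}
legs = {"hips": (0.0, 100.0, 0.0), **{name: (10.0, 40.0, 5.0) for name in LEG_JOINTS}}
combined = blend_frame(torso, legs)
```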

You can’t see them, but the puppeteers are very much present in The Happytime Murders.
Hopper Stone/© 2018 STX Financing, LLC. All Rights Reserved.

The puppeteer’s performance forms the basis for how the audience relates to what’s going on?

As much as I love visual effects, ultimately it is all about performance. That organic connection to the characters provides something that all these computer-generated films have sometimes gotten away from. They all become flat or spiritless, but this movie has got that spirit and then some. The truth of the matter is that you can’t ad-lib with a computer graphic. There’s a tremendous variance between take one and take 10 with Melissa McCarthy. When you’re so encumbered technically that it takes weeks to do something, that is antithetical to comedy, which is a real-time process.

While shooting on stage, were traditional builds used to give the puppeteers access to the puppets from beneath built-up flooring?

All the sets were built up 4 feet so the puppeteers could be underneath, but outside in the real world, you need to accommodate the puppeteers while hiding them to sustain the illusion. You can’t put a puppeteer under the street pavement, so there are limitations when it comes to hiding them or providing the right access. So there’s always a ton of cleanup going on, because even if some puppeteers are off-camera, others remain visible, along with their apparatus. That all adds up to a surprising amount of VFX, getting close to one thousand shots.

Scene from The Happytime Murders

Feathers fly in some of the film’s most outrageous scenes.
© 2018 STX Financing, LLC. All Rights Reserved

How about the slaughter scenes? Were those mostly achieved practically?

Yeah, there are some pretty big gunfight-at-the-OK-Corral scenes [laughs], with puppet stuffing flying everywhere. The set was just so much fun, because Bill and the other puppeteers are so funny while remaining in character. It is really outrageous, as is the movie itself. It has the courage to go where no film, and certainly no puppet film, has gone before. In today’s market, that’s really refreshing, to see something that is willing to be such a departure.

Was dealing with the volume of data on this film ever an issue, given that capture was handled at 8K?

Light Iron was very good with how they handled the workflow into post. We built our own 8K pipeline for dailies. We got everything in 4K, and the output was always going to be a 4K DCP, though we did certain shots in 8K. Also, when shooting our own green-screen material as well as plates, we shot on Red Weapon cameras, which gave us a look that worked with the production’s Panavision Millennium DXLs.

Behind the scenes of The Happytime Murders.

Puppeteer Bill Barretta on the set of The Happytime Murders.
Hopper Stone/© 2018 STX Financing, LLC. All Rights Reserved

I recall speaking with you in the late ’90s and your excitement over developing an approach to set extensions. Stargate has really built that into something over the last couple of decades, especially on Pan Am. I was amazed at the volume of shots you were able to turn around in such a limited time frame.

We’re still doing virtual backlot, though the tools have improved, so the process is more immersive rather than being used strictly for deep background. On Happytime, we did use [Lightcraft] Previzion, [SolidAnim] SolidTrack, and a new proprietary system with targetless motion tracking, as well as [Blackmagic Design] Ultimatte. We’re always exploring additional options to provide context for a performance, so that we’re not just looking at a green screen, and to evaluate how the element will integrate with the environment and make the composition as pleasing and effective as possible. There are a couple of scenes in this movie with 100% computer-generated sets, which let us do any kind of camera move we wanted. This was the case when Phil gets sucked into the aircraft’s jet engine. All of the virtual sets got built during prep. We used the Unreal Engine to drive a background that was 50% of the way there on set, so the actors and puppeteers could see what was happening, and then the editors would get that rough precomp.
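
As a purely hypothetical illustration of driving a live background, the sketch below streams a tracked camera pose to a machine rendering the virtual set in real time; the packet layout, port and address are invented for the example, and systems like Previzion, SolidTrack and Unreal each have their own integration paths.

```python
# Hypothetical camera-pose stream from an on-set tracking system to a
# real-time background renderer, so the virtual set moves with the live camera.

import json
import socket
import time

RENDER_NODE = ("192.168.1.50", 9001)  # assumed address of the render machine

def send_camera_pose(sock, frame, position, rotation, focal_length_mm):
    """Serialize one frame of tracked camera data and send it over UDP."""
    packet = {
        "frame": frame,
        "position": position,   # (x, y, z) in centimeters
        "rotation": rotation,   # (pan, tilt, roll) in degrees
        "focal": focal_length_mm,
    }
    sock.sendto(json.dumps(packet).encode("utf-8"), RENDER_NODE)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for frame in range(240):
    # in production this would come from the tracking system, not a loop
    send_camera_pose(sock, frame, (0.0, 150.0, frame * 2.0), (frame * 0.1, -5.0, 0.0), 35.0)
    time.sleep(1 / 24)  # roughly 24 fps
```

The same idea is what gives the actors and puppeteers a view of the environment on set and hands editorial a rough precomp rather than a bare green screen.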

What lies ahead for Stargate?

We did modeling for the pilot of The Orville, and currently we’re doing Nightflyers [set to air on Syfy in the U.S. and via Netflix elsewhere], based on the George R. R. Martin story. Rather than relying on green screen, this one features a lot of in-camera compositing, using 70-foot LED screens, which is very exciting. These days, innovative risk-taking often seems to be happening more in TV than in film.