In a TV First, The Simpsons Had an Animated Character Taking Phone Calls from Viewers

Update 05/26/16: Since this story was published, we've learned that face-tracking was not used to automate facial animation for the live broadcast of The Simpsons. Adobe responded to our follow-up question with this clarification on the methodology: There was no camera or motion tracking on [Dan Castellaneta]. They wanted the actor to be free to focus on taking calls and improvising answers without worrying about his movements. Homer has things like specific blink cycles and movements that may have been inconsistent with their typical animation style if they used face tracking, so David Silverman (consulting producer/animator) triggered all of Homer’s movements from the X-keys (keyboard) triggers to keep their look and animation style consistent.


Taking the live-TV trend to counter-intuitive lengths, The Simpsons made history last night by having an animated character — Homer Simpson, no less! — conduct a live Q&A during a prime-time broadcast. For about three minutes at the end of Sunday's episode, Homer appeared in the Fox Studios Secret Bunker to take calls from the viewing audience. Remarkably, he looked just like Homer Simpson, even though voice actor Dan Castellaneta was performing him live. How'd they do it? The animators used Adobe Character Animator, software that's been available in a preview version as part of Adobe Creative Cloud since last year. (If you download After Effects CC, you get Character Animator along with it.) We asked Adobe's Bill Roberts, senior director of video product management, for some details.

StudioDaily: What was the genesis of this? Did The Simpsons find Character Animator on their own and think they could make it work, or did you approach them about using it for something along those lines?

Bill Roberts: The nice thing was that they came to us. It would be pretty bold of us to go out and ask The Simpsons to be one of our first flagship customers. But when they saw what Character Animator could do, they started poking around and talking to the team. They liked that it wasn't something that could just drive the mouth poses. They wanted to create a bunch of other triggers. Think of them as loops in traditional animation. They liked the level of control they could dial in to deliver the experience that they wanted. So our team has been working with them, helping them push it farther and farther. From the initial contact to what they did last night, they ended up doing a lot more with the product than we initially anticipated.

What was involved in the live production? Was it just Dan Castellaneta sitting at a computer and looking into a camera, or was there more involved in getting the results to look exactly like The Simpsons?

Their biggest thing, once they realized they could do it at all, was seeing how close they could get to the animation style of the regular program. With the controls you have in Character Animator, you can use phonemes, which are sounds, or visemes, which are the mouth poses, or a combination of the two to drive the animation. For this one, they balanced some keyboard triggers with the particular mouth poses, which were driven by the camera. I don't want to go into too much detail because it's like their own special sauce on top of our systems. But the big thing for them was that combination of automated and keyboard-triggered animation. What you saw was a combination of Dan driving the voiceover live and the team around him driving some other animation using the keyboard.
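
To make the idea of mixing automated lip sync with operator-triggered loops concrete, here is a minimal sketch of that general approach. Character Animator is a GUI tool rather than a code API, so everything below — the layer names, viseme labels, and key bindings — is hypothetical and only illustrates the concept, not Adobe's or Fox's actual setup.

```python
# Illustrative sketch only: models the idea of combining viseme-driven mouth
# poses with keyboard-triggered animation cycles. All names are hypothetical.

# Map detected visemes (mouth shapes inferred from the audio) to artwork
# layers drawn in the show's style.
VISEME_TO_MOUTH_POSE = {
    "Aa": "homer_mouth_open_wide",
    "Ee": "homer_mouth_smile",
    "Oh": "homer_mouth_round",
    "M":  "homer_mouth_closed",
    "F":  "homer_mouth_teeth_on_lip",
}

# Keyboard triggers mapped to pre-built animation loops (blink cycles,
# head turns, gestures). A pressed trigger takes priority over lip sync.
KEY_TO_TRIGGER = {
    "b": "homer_blink_cycle",
    "h": "homer_head_turn",
    "d": "homer_doh_gesture",
}

def choose_frame(detected_viseme, pressed_key=None):
    """Pick which artwork layer to show for the current frame."""
    if pressed_key in KEY_TO_TRIGGER:
        # An operator-triggered loop overrides the automated pose.
        return KEY_TO_TRIGGER[pressed_key]
    # Otherwise fall back to the viseme-driven mouth pose.
    return VISEME_TO_MOUTH_POSE.get(detected_viseme, "homer_mouth_neutral")

# Example: the audio analysis reports an "Oh" sound.
print(choose_frame("Oh"))        # homer_mouth_round
print(choose_frame("Oh", "b"))   # homer_blink_cycle (trigger wins)
```

The point of the priority rule is the same one Roberts describes: the automated sync handles the moment-to-moment mouth poses, while hand-chosen triggers keep the character's signature movements consistent with the show's style.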

OK. But the main thing we see happening — Homer Simpson's mouth animated to Dan Castellaneta's voice in a way that is very much in The Simpsons style — is really just a basic capability of Character Animator, right?

Yes, and I think that's one of the things that the [software development] team got right — having the ability for people to set specific mouth poses that let them maintain the style of animation they want. That's what they were really pleased about. They could make Homer talk, and get Homer to look right — if you compare the live segment to the hand animation, you'll see that the mouth poses are the same for all the sounds, allowing them to get the fidelity they wanted, driven in real time by a combination of live tracking and keyboard-driven animation.

It feels like that's a crucial step in getting pro animators to take this technology seriously. They need to be assured that they can make the software work in whatever style they want.

That's exactly it. One of the things you're starting to see, with The Simpsons and others, is that in hand animation the weight of each frame is equal because you're hand-drawing each of them. With Character Animator, the mindset shifts and people focus on the hero frames. If you concentrate on setting up the pieces that can be automated in between, you'll get that fidelity, and as an artist you're much freer to capture the grand gestures you really want to drive home.
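
As a rough illustration of that shift, here is a minimal sketch of automated in-betweening: the artist authors only the hero poses and the software fills in the frames between them. It assumes poses are simple numeric attributes (jaw openness, head tilt), which is an assumption for illustration, not how Character Animator represents artwork internally.

```python
# Minimal sketch of automated in-betweening between hero poses.
# Pose attributes and values are hypothetical.

def lerp(a, b, t):
    """Linear interpolation between two values."""
    return a + (b - a) * t

def inbetween(hero_a, hero_b, num_frames):
    """Generate the frames between two hero poses."""
    frames = []
    for i in range(1, num_frames + 1):
        t = i / (num_frames + 1)
        frames.append({k: lerp(hero_a[k], hero_b[k], t) for k in hero_a})
    return frames

# The artist sets only the two hero poses; the in-betweens are automated.
pose_rest  = {"jaw_open": 0.0, "head_tilt": 0.0}
pose_shout = {"jaw_open": 1.0, "head_tilt": 0.2}

for frame in inbetween(pose_rest, pose_shout, 3):
    print(frame)
```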

Did you anticipate the software being used in this way? [Laughs.] And what future use cases do you think this points to?

I chuckled because I don't think that anybody would have said our first big coming-out would be The Simpsons. But this product does change the way animation is done. It is, fundamentally, a different thing. And this isn't even a 1.0 product. All of this work is off a preview version. We introduced it last fall. David Simons, who's the primary instigator of this project, really felt we had to get it into people's hands and find people who would push the product so we could make it go farther. We knew there was a whole range of pieces to it. The live version wasn't even in the first couple of previews. By listening to Fox, we've actually amped up that side of it, with the ability to drive out a live signal. We thought it was a healthy attitude to put the preview out with core functionality and then just listen to people. Previews 1 and 2 were more about capturing high-fidelity motion into the timeline, and with The Simpsons we've pushed into the live part. It's tremendously exciting.

It seems like products stay in beta for longer and longer these days. How long do you anticipate Character Animator remaining a preview version, and where's the line between that and a 1.0 release?

There are still some fundamental pieces we have to deliver. We showed some of those in our NAB reveal this year. One of them is thinking about overall workflow. If you're using it in a live environment it's straightforward, but a more end-to-end workflow needs things like integration with Adobe Media Encoder. There are other things we need to add, as well. In the current version, you do have a workflow with After Effects and Premiere Pro, but it's by rendering out frames. We've got to increase the interoperability of the products for people who want to do recorded animation. But we're definitely in the end game. Preview 4, which was revealed at NAB, will give people a very solid production workflow, and then we've got at least one more cycle to go before we get all of that interoperability with our other products into place.

Well, it's always a highlight of the Creative Cloud product demos when Character Animator comes around.

Most of the artwork you see in those demos is driven by an insanely talented guy, Dave Werner. Check out his YouTube channel, Okay Samurai. He's a brilliant creative guy, and having him involved with all these smart engineers is like art and science coming together. And this is the one project I've been involved with where, whether it's strait-laced broadcasters or YouTubers, no matter who you show it to, you get a massive smile. It taps into something innately human about driving animation. You put anyone behind this and, within a couple of seconds, they're making funny faces.