How the Virtual Star of Ted Showed Up at the Oscars
The virtual star of “Ted” presented at the 2013 Academy Awards in Los Angeles, with co-star Mark Wahlberg. Credit: Universal Pictures / Tippett Studio
So how does an animated character “appear” at the Oscars? Dell Precision’s Scott Hamilton interviewed Scott Liedtka, visual effects supervisor for Tippett Studio, just prior to the broadcast to find out.
Can you give us some background on Ted’s appearance at the Oscars and Tippett Studio’s role in making it happen?
When we heard that Seth MacFarlane (creator and voice of Ted) was hosting the Oscars, we were hoping to get the call that the Academy wanted Ted to do something on the broadcast. We had no idea what they might come up with and thought it was fantastic that they were able to get Mark Wahlberg to appear as well. What makes the movie Ted great is that Ted and Mark’s friendship feels natural and believable, so it’s fun to see them on stage together again.
Which award category did they ask Ted and Mark Wahlberg to present?
Ted is presenting the Sound Mixing award and Mark is presenting the Sound Editing award. They present both awards together and trade some banter.
Will Ted behave?
The only thing I can say is that Ted is genuinely excited to be on the Oscars but doesn’t really know how to behave. Ted will show some respect for being on ABC during prime time, but his personality will still come out.
How exactly will Ted appear at the Oscars?
What do you mean? Ted is real. Okay, Ted is not real, but I’m not spilling the beans on that one.
What was the biggest challenge when working on Ted’s presentation at the Oscars?
We had two big challenges. First, we had about one-third of the time per shot that we had on the feature film, and second, we had to create a tuxedo for Ted to wear. Cloth simulations are always tough to get right, so for our lead cloth effects artist we got a monster box – a fully loaded Dell Precision T7600 workstation with Intel® Xeon® processors – so he could push through a ton of iterations on the cloth sims. He would get this wicked smile on his face as he put more and more load on the machine, trying to get it to crumble, but the box never even stumbled. It was amazing.
Although the shots for the Oscars are pretty simple – just Ted over the background – the frames are quite complex to render. He has a lot of hair and fairly complex shading, which requires a number of intermediate files to be created. Also, we needed to render Ted at 60 frames per second to match the rest of the broadcast, 2.5 times the number of frames we would normally render for the feature. All told, our render times were about 18 hours per frame.
Between the swarm of Dell PowerEdge cloud servers and the Dell Precision workstations, we were working with about 4,000 cores, giving us a rendering capacity of close to 96,000 core-hours every day. The Dell Precision workstations with Intel® Xeon® processors are powerful enough that we can bring them into the render farm at night, giving us about a 30% boost in our ability to turn out frames.
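The arithmetic behind those figures is easy to sanity-check. A minimal sketch follows; the 24 fps feature frame rate and the reading of “18 hours per frame” as core-hours are assumptions for illustration, not stated in the interview:

```python
# Back-of-the-envelope check of the render-farm figures quoted above.

FILM_FPS = 24        # typical feature-film frame rate (assumption)
BROADCAST_FPS = 60   # rate Ted was rendered at for the broadcast

# 60 fps yields 2.5x the frames of a 24 fps feature for the same duration.
frame_multiplier = BROADCAST_FPS / FILM_FPS  # 2.5

CORES = 4_000        # cores across PowerEdge servers and Precision workstations
HOURS_PER_DAY = 24

# 4,000 cores running around the clock is 96,000 core-hours per day.
daily_core_hours = CORES * HOURS_PER_DAY  # 96,000

# If the quoted "18 hours per frame" means core-hours (an assumption),
# the farm could turn out roughly this many frames per day:
HOURS_PER_FRAME = 18
frames_per_day = daily_core_hours / HOURS_PER_FRAME  # ~5,333

print(frame_multiplier, daily_core_hours, round(frames_per_day))
```

The numbers line up with what the interview quotes: 60 fps is exactly 2.5 times a 24 fps feature, and 4,000 cores running 24 hours a day gives the stated 96,000 core-hours of daily capacity.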
Were there any other challenges?
Time, time and time! We typically had three times more work than our standard weekly schedule, plus the added challenge of running and sharing other shows on the same computer infrastructure. Dell’s ability to supply new Dell Precision workstations and Dell PowerEdge C5220 servers in record time helped greatly in maintaining harmony between the shows and getting Ted out in time.
What was your workflow for creating Ted?
The workflow followed standard procedures for this kind of work, but it was interesting in a couple of ways. At Tippett Studio, we created a digital Ted asset that included his model, color, texture and fur as well as sophisticated controls to make it possible for him to move – both an animation rig and a simulation setup. For each shot that was filmed, we collected references that made it possible to put Ted into the shot with a matching virtual camera and lighting. The shot would be run through each production department resulting in a beautifully integrated Ted, usually combining the efforts of about six or seven different artists, not counting general production support. So, it’s a huge team effort.
One interesting thing about the Ted production process is that Seth MacFarlane actually performs as Ted. He wears a motion-capture (mocap) suit that captures his body movements, and his vocal performance is captured at the same time. There were many scenes, mostly action scenes, that were keyframe animated, but a lot of the shots where Ted delivered a line were mocap-animated. The mocap system was really convenient for Seth and gave great results, but a lot of clean-up was standard, plus all the lip synch and facial animation was created by our animators without the head start of mocap. Seth was very clear that he wanted clean, subtle animation on Ted to make him feel real and he appreciated the realistic noise that the mocap process captured.
Another interesting thing about the production process was that Seth was acting as Ted live with the other actors. Usually, computer graphics (CG) characters have their lines recorded first, so they are pretty set in stone unless you want to start the whole animation process over with each line change. Because Seth was recording the lines and body animation live, he and the other actors could improvise and come up with more natural-sounding exchanges. Also, their lines could overlap each other, which, again, makes scenes feel more real.
What Dell technology and solutions did you use to create Ted?
We used a bunch of Dell Precision workstations, including T5500, T3600 and new T7600 tower workstations with Intel® Core™ and Intel® Xeon® processors. Specifically for Ted, we added about 20 Dell Precision T7600 workstations in less than a week – some went to artists and the rest performed like a charm on the render farm. The added capacity of Dell Precision workstations and Dell PowerEdge cloud servers was instrumental in getting Ted ready to walk on the big stage.
Dell workstations with Intel® processors allow our artists to get better simulations, more resolution, and more iterations. From a performance standpoint, the Intel® Core™ processor family is just excellent for us – we are able to get a huge amount of throughput. Hyper-threading is an excellent option because it delivers great performance for animators running multi-threaded applications.
What digital content creation (DCC) software did you use for the project?
We use the latest Maya, RenderMan, and Nuke tools for shot production. In preproduction we add Mari and Photoshop, and for production tracking we use Shotgun. We also use a whole suite of proprietary tools for camera tracking, lighting, simulation, fur grooming, rendering, and general pipeline glue.