Nike Spot Presents Wide Range of VFX Challenges

In the new Nike spot directed by Michael Mann via Wieden + Kennedy, Portland, Asylum FX was charged with seamlessly transitioning two star football players – Shawne Merriman and Steven Jackson – from game to game and stadium to stadium, in varying weather conditions, day and night. This meant a ton of rotoscoping, repositioning the original camera angles in CG, modeling and lighting six stadiums and populating them with screaming fans, populating the sidelines with players, adding rain and snow to certain shots and, finally, managing and combining all these elements to work cohesively.

While Asylum had seen the boards for the spot a month in advance of the shoot, they weren’t awarded the job until a couple days before. When they arrived on set, two major decisions had already been made.
Rotoscoping and Field Replacement
The first decision was to use massive amounts of greenscreen. A wall of greenscreen was erected around the perimeter of the football field, and green-covered pads lay on the field to cushion the landing of the falling players. Greenscreen against green grass presented a host of challenges. Bluescreen would have been a better choice, but some of the jerseys were blue, so that was not an option.

“The greenscreen had its advantages and disadvantages,” explains VFX Supervisor Sean Faden. “It definitely decreased the amount of rotoscoping we had to do. We knew there was no way we could shoot this at the different locations. We shot it all at the Rose Bowl. We shot on 35mm film. All the shots, except for the first two shots were shot either on steadicam or handheld running down the field chasing these guys. So having the greenscreen eliminated a lot of the roto work but because it completely cut off the field about 40 feet from the edge of the field it necessitated us extending the field in every single shot. Not only was the bottom of the greenscreen a third of the way into the middle of the field, it was also casting huge shadows on the field so to get rid of that we had to extend the grass. One of the most difficult things to do is to create a 3D extension on something like a ground that has as much detail as grass, all the lines, everything has to match exactly or you see the sliding.”

Then there was all the work done to remove the green pads in the middle of the field, requiring field replacement, more rotoscoping and painting the feet and bodies of the players as they moved behind the pads.

“We had to remove the pads, and often there were players running behind the pad, so we had to rebuild arms and legs to get rid of the pad and make it look like they are landing on hard grass,” notes compositing supervisor/Flame artist James Allen. “A lot of paint work. There were ten to twelve people working on roto for about eight weeks.”

CG Camera Blending
The second decision was a huge curveball. Originally, the concept was to have the transitions from one stadium to another occur on a hard cut.
“Just cutting on action is obviously a lot simpler of a setup,” Faden says. “So when we got on set we found out they wanted to transition seamlessly from venue to venue, so we knew it was going to be a huge challenge. A big bulk of the work was wrapping our heads around the concept and how to deal with the timings of blending one shot to another shot: when a shot is going to end and another one begin. We had an offline edit to kind of match to, but it was up to us to figure out the blend points of where the players’ body positions would match the best.”

To make the various shots blend together Asylum took the shots into CG and adjusted the camera angles slightly to make them line up.

“We didn’t want to have the stadiums jump too much,” notes Faden. “Every shot that was filmed pretty much happened in the middle of the field. We weren’t thinking about that on set. So it was up to the CG guys to figure out where each camera had to live in space so that when you went from one shot to the next there would be a nice seamless handoff. That kept changing. If we changed a cut point we’d have to adjust where our camera would start.

“We had to do a lot of camera blending to make the transitions smooth. There are some shots where we would do an actual 10-frame camera blend of the actual camera data across the cut point and then render an all-CG ground and then cut out the foreground players, so that Steve Jackson would be running on an all-CG ground and we could seamlessly blend across all the venues.”
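For readers curious what a blend like that looks like in practice, here is a minimal Python sketch that crossfades two tracked camera paths over a ten-frame overlap. The camera representation, the ease-in/ease-out weighting and the Euler-angle interpolation are illustrative assumptions, not Asylum's actual tracking data or tools.

# Minimal sketch of a ten-frame camera blend across a cut point (illustrative only).
from dataclasses import dataclass

@dataclass
class CamSample:
    pos: tuple    # world-space position (x, y, z)
    rot: tuple    # Euler rotation in degrees (rx, ry, rz)
    focal: float  # focal length in mm

def lerp(a, b, t):
    return a + (b - a) * t

def smoothstep(t):
    # ease-in/ease-out weight so the handoff doesn't pop at either end
    return t * t * (3.0 - 2.0 * t)

def blend_cameras(cam_a, cam_b, blend_frames=10):
    """Crossfade the tail of shot A's camera into the head of shot B's camera.

    cam_a / cam_b: lists of CamSample covering the overlap, each blend_frames
    long. A production blend would interpolate rotation as quaternions; Euler
    lerp keeps the illustration short.
    """
    blended = []
    for i in range(blend_frames):
        w = smoothstep(i / (blend_frames - 1))
        a, b = cam_a[i], cam_b[i]
        blended.append(CamSample(
            pos=tuple(lerp(pa, pb, w) for pa, pb in zip(a.pos, b.pos)),
            rot=tuple(lerp(ra, rb, w) for ra, rb in zip(a.rot, b.rot)),
            focal=lerp(a.focal, b.focal, w),
        ))
    return blended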

Modeling and Populating the Stadiums
Since shooting the action at all the various stadiums where it would occur was obviously not an option, Asylum had to build the six venues from scratch and make sure the stadiums matched exactly, or rabid fans across the nation would instantly pick up on it.

“We actually sent a guy around the country shooting photogrammetry stills of five different stadiums. We would have liked to do all six but Lambeau Field refused to let us shoot there. So we had to build Lambeau from scratch. But the others, Oakland, St. Louis, Chicago and San Diego, allowed us to go in there, turn the lights on and shoot to get the environment, shooting at the right time of day that it would be happening in the final project,” says Faden. “We took those stills and used photogrammetry to create a rough template of where everything should be, and then our modeling team fleshed all that out, adding railings, press boxes, banners and all kinds of details.”

The photogrammetry was done in ImageModeler, and then that rough geometry was modeled in Maya. Then they had to add the crowds to the stadiums, leveraging Massive Software. “That was another tedious process of adding 40,000 people in a stadium,” recalls Faden.
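Faden doesn't go into the mechanics of the crowd pass, but the basic bookkeeping of filling tens of thousands of seats with varied agents can be pictured with a simple placement sketch like the one below. The seat geometry, fill ratio and variation count are invented for illustration; the actual crowds were driven by Massive.

# Illustrative sketch of seeding a stadium crowd: one agent variation per seat.
# Seat geometry, variation counts and output format are invented; the actual
# crowds were generated with Massive.
import math
import random

def seat_positions(rows=80, seats_per_row=500, inner_radius=60.0,
                   row_spacing=0.9, rake=0.45):
    """Yield (x, y, z) seat positions on a simple raked circular bowl."""
    for r in range(rows):
        radius = inner_radius + r * row_spacing
        height = r * rake
        for s in range(seats_per_row):
            angle = 2.0 * math.pi * s / seats_per_row
            yield (radius * math.cos(angle), height, radius * math.sin(angle))

def populate(num_variations=500, fill_ratio=0.95, seed=1):
    """Assign a random agent/texture variation to each occupied seat."""
    rng = random.Random(seed)
    crowd = []
    for pos in seat_positions():
        if rng.random() > fill_ratio:
            continue  # leave a few empty seats so the crowd doesn't look stamped
        crowd.append({"position": pos,
                      "variation": rng.randrange(num_variations)})
    return crowd

print(len(populate()))  # roughly 38,000 agents with these illustrative numbers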

Asylum also had to add the players, coaches, referees and all the other people along the sidelines for each of the shots.

“For the sideline people, who weren’t there during the shoot, we used a technique on the other Nike job we’d just finished featuring Ladanlian Tomlinson,” says Faden. “We shot hundreds of different extras in referee outfits, press outfits, football uniforms. So we were able to place them procedurally on the sidelines. For this spot we shot each team sideline that we knew we were going to have to have. In the Raiders game you see the Raiders on the sideline. For those we had about seven guys in uniforms line up in a bunch, shoot that and then have them rearrange themselves and shoot them again. We did that a few times and then layered them on the sideline so you had players on the edge of the sideline, some in the background and then layered in an empty team bench in the background. So that added a bit of depth.”
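The procedural placement Faden describes can be imagined roughly along these lines: pre-shot "bunches" of extras scattered in layered depth rows along the sideline, with jittered spacing so the repeats don't read as a pattern. The element names, spacing and row depths in this sketch are hypothetical.

# Hedged sketch of laying out pre-shot sideline plates ("bunches" of extras)
# in layered depth rows along the sideline. Element names, spacing and depths
# are hypothetical.
import random

SIDELINE_ELEMENTS = ["team_bunch_A", "team_bunch_B", "team_bunch_C",
                     "refs_pair_A", "press_cluster_A"]  # hypothetical plate names

def layout_sideline(field_length=100.0,
                    rows=((0.0, 2.5), (1.5, 4.0), (3.0, 6.0)), seed=7):
    """Return (element, x_along_sideline, depth_offset) placements.

    rows: (depth_offset, spacing) pairs; nearer rows are packed tighter,
    deeper rows are sparser. An empty team-bench plate would be layered in
    behind these for the final depth cue.
    """
    rng = random.Random(seed)
    placements = []
    for depth, spacing in rows:
        x = rng.uniform(0.0, spacing)
        while x < field_length:
            placements.append((rng.choice(SIDELINE_ELEMENTS), round(x, 1), depth))
            x += spacing * rng.uniform(0.8, 1.2)  # jitter so repeats don't read as a pattern
    return placements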

Escaping a Data Nightmare with Houdini
Then came the matter of combining all these elements so that they fit together in each individual setup, and then making the transitions from one venue to the next.

“This is a pipeline that we’ve been using for years now: we took all that data – the Maya-modeled stadiums, the Massive-generated people, which had close to 500 different textures when you add up all the different teams and fans, the sidelines, the field extension – into Houdini, where we put it all together and rendered everything (with Mantra),” Faden explains. “Houdini has a system to manage the digital assets so that for all these complex shots and setups that had many, many different layers and elements, with elaborate camera setups, when we opened a file it would already know which shot environment you were in and you would automatically get the latest work from the various people working on the job. Because this project had so many complex offsets, time warps and all these things, we wanted to make sure that each artist was working with the latest and greatest assets. Maya doesn’t give you great control in that realm of being able to automatically update peoples’ setups.”
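Faden doesn't detail the mechanism, but the "automatically get the latest work" behavior he credits to Houdini's digital-asset management can be pictured as a version-resolution step at file open, roughly like this Python sketch. The directory layout and naming convention are assumptions, not Asylum's pipeline or Houdini's own asset system.

# Illustrative sketch of "open a shot, get the latest published assets."
# Directory layout and naming are assumptions, not Asylum's pipeline or
# Houdini's digital-asset mechanism itself.
import re
from pathlib import Path

def latest_version(asset_dir):
    """Pick the highest-numbered publish, e.g. lambeau_stadium_v012.bgeo."""
    versions = []
    for f in asset_dir.glob("*_v[0-9]*.*"):
        m = re.search(r"_v(\d+)\.", f.name)
        if m:
            versions.append((int(m.group(1)), f))
    return max(versions)[1] if versions else None

def resolve_shot(shot_root):
    """Map each asset category in a shot to its newest published file."""
    return {sub.name: latest_version(sub)
            for sub in Path(shot_root).iterdir() if sub.is_dir()}

# e.g. resolve_shot("/jobs/nike/shots/sh040/publish")
# -> {"stadium": ..., "crowd": ..., "field_ext": ..., "camera": ...}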

This proved crucial not only for managing all the assets and combining all the various elements, but also when it came to lighting each setup.
“Our lead lighting guy, Denis Gauthier, is so efficient because he is able to create all the different lighting setups – sunny, cloudy, day, night, indoors, outdoors, misting and snow – and with very little effort was able to apply them to the 12 shots that we needed to do. He’d have one generic setup that then imported a stadium and imported a Massive crowd. So, as long as everything was named properly, he could open a brand new shot and within two hours have those elements running on the renderfarm. Houdini is great at that type of efficiency when it comes to layers and lots and lots of assets and workflow between many artists,” says Faden.
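As a rough illustration of the reuse Gauthier's generic setup allowed, the sketch below keys one build off an environment name plus the shot's stadium and crowd imports. The preset names and parameter values are invented; the real setups lived in Houdini and rendered through Mantra.

# Minimal sketch of reusing named lighting setups across shots.
# Preset names and parameters are invented; the real setups were built by the
# lighting lead in Houdini and rendered with Mantra.
LIGHT_PRESETS = {
    "sunny":    {"sun_intensity": 1.0, "sky_tint": (0.55, 0.70, 1.00), "fog": 0.00},
    "overcast": {"sun_intensity": 0.3, "sky_tint": (0.80, 0.82, 0.85), "fog": 0.05},
    "night":    {"sun_intensity": 0.0, "sky_tint": (0.05, 0.06, 0.10), "fog": 0.02},
    "snow":     {"sun_intensity": 0.4, "sky_tint": (0.85, 0.87, 0.92), "fog": 0.15},
}

def build_shot_lighting(shot_name, environment, stadium_asset, crowd_asset):
    """Assemble one shot's render setup from the generic template.

    As long as assets are named consistently, a new shot only needs its
    environment key and its stadium/crowd imports swapped in.
    """
    preset = LIGHT_PRESETS[environment]
    return {
        "shot": shot_name,
        "imports": [stadium_asset, crowd_asset],
        "lights": dict(preset),
        "renderer": "mantra",
    }

# e.g. build_shot_lighting("sh070_lambeau", "snow", "lambeau_stadium_v012", "crowd_packers_v004")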

Houdini was also used to create the snow and mist.

The Final Grade
To make all the various elements fit in the same space, Asylum worked closely with colorist Stefan Sonnenfeld of Company 3. Sonnenfeld did the original grade on the plate, but since new 2D and 3D elements were constantly being added, he was at Asylum, along with director Michael Mann, to ensure it all would fit together.

“That was certainly a major part of the process. You had all these different elements, the stadium, the CG crowd, set extensions, sideline people from a different shot, and the 2D elements, and you have to make them live in the same environment. It was certainly a challenge to grade all of these things separately so they’d live as one,” notes Allen.

CREDITS
Agency: Wieden + Kennedy, Portland
ECD(s): Steven Luker, Jelly Helm
CD(s): Alberto Ponte, Jeff Williams
Copywriter(s): Alberto Ponte, Ari Weiss
AD: Ryan O’Rourke
Account Team: Ryan Gallagher, Alberto Escobedo
EP: Ben Grylewicz
Producer: Kevin Diller

Prod Company: Alturas Films
Director: Michael Mann
EP/President: Marshall Rawlings
EP: Jeff Rohrer
VFX Supervisor: Robert Stadd
Producer: Bryan Carroll, Leslie Vaughn

Post/Effects: Asylum
VFX Supervisor/CG Supervisor: Sean Faden
EP: Michael Pardee
Producer: Mark Kurtz
AP: Ryan Meredith
Production Coordinator: Steven Poulsen
Compositing Supervisor: James Allen

Composite/Inferno: Rob Trent, Joey Brattesani
Inferno: Chris Decristo, Chris Moore, Paul Kirsch
Smoke: Adam Frazier, Scott Johnson

Houdini Effects/Massive Animator: Dan Smiczek
Animator: Scott Smith
Animator/Rig: Kevin Culhane
TD Lighting: Jeff Willette, Denis Gauthier
Lighter: Rob Stauffer
Modeler(s): Chad Fehmie, Toshihiro Sakamaki, Scott Brust, Greg Stuhl
Tracker/Matchmover/Stadium Photographer: Eddie Offermann
Tracker/Matchmover(s): Mike Lori, Andrew Cochrane, Tom Stanton, Devin Fairbairn
Tracker: Genevieve Yee
Setup: Brian Bell
CG Producer: Jeff Werner

Matte/Texture Painter(s): Tim Clark, John Hart, Eric Mattson, Aaron Vest, Robin Foley

Roto Supervisor: Elissa Bello
Roto Comp: Eric Evans, James Lee, Valy Lungoccia, John Brennick
Roto: Michael Liv, Huey Carroll, Meredith Hook, Junko Schugardt, Mark Duckworth, Laura Murillo, Eric Almeras, Dan Linger, William Schaeffer, Mattanaiah Yip, Ed Anderson

Editorial: Spotwelders
Editor: Haines Hall

Audio Post: Lime
Music: Trevor Jones

Sound Design: Mit Out Sound
Sound Designer: Ren Klyce
Producer: Misa Kageyama