DP Tim Dashwood explains what it took to turn designer Nada Shepard's latest line into a 3D Kung-Fu fantasy

The 3D fashion experience debuting next month during Toronto’s Fashion Week may not be the first 3D film to hit the catwalk, but it certainly will be the most complex.


Earlier this month, Native Son’s Kyle Fitzgibbons introduced his Fall 2010 collection with a three-minute live-action 3D film, shot by GQ photographer Eric Ray Davidson, during New York Fashion Week. On February 23, Burberry kicked off London Fashion Week by streaming its Fall/Winter collection in 3D to studios in New York, Dubai, Paris, Tokyo and LA.

But Toronto-based designer Nada Shepard was interested in creating something more than a “you-are-there” experience for her viewers. Instead, she asked DP Tim Dashwood, creator of the Stereo3D Toolbox plug-in, to produce her vision of a gamer drawn into a role-playing alternate universe, complete with multiple costume changes. “We’ve been working in secrecy on this for a long time,” says Dashwood. “Nada was basically completely bored with runway shows. Her concept was to do something that would catch attention and be really cool in itself. When we first started discussing this with her she was clear that she wanted to do live-action combined with CG backgrounds, and also to bring Hong Kong-style fight sequences into the mix.”

Shot entirely on greenscreen, the film features both a fashion model and a stunt woman, who appear in different outfits from Shepard’s line in each scene. During the shoot, Dashwood and team keyed the actors/models over the rendered backgrounds to get an idea of what the final product would look like for “every single shot.” An avatar selection screen will simulate an interactive gaming experience by moving the audience through game-player choices for weapons, scenarios and outfits. “It would be completely cool to have it be truly interactive,” says Dashwood, “but we didn’t have anywhere near the budget for something like that.”


A fashion model, left, mixes it up with a trained stunt woman

Paul Rapovski, Dashwood’s partner in his 3D production company, Stereo3D Unlimited, happens to be one of the leading fight stunt coordinators in the industry (Max Payne, The Chronicles of Riddick). “We also have here in Toronto the only commercial motion-capture studio, so we had our three selling points,” says Dashwood. “But we’ve never done all three together before. We’ve done video games with mocap, we’ve done 3D and we’ve done fight stunt work. This was a terrific opportunity to put it all into one project.”

The motion capture for Shepard’s project, says Dashwood, had a twist. “Instead of mocapping the actual actors, we mocapped the camera movements. By doing that, we were able to generate real-time previews of backgrounds so we could see what they would look like from any particular angle.” Dashwood and his team used Autodesk’s MotionBuilder with their mocap system. “Our engineer figured out a way to do stereoscopic output,” he says. “MotionBuilder is essentially a games engine that renders real-time proxies. It worked really well for us in this case. We usually just use it for pre-vis, and this was a pretty intense pre-vis experience.” Michael Irwin, he adds, is their “resident in-house genius who runs the mocap.”


A suited stunt man moves a lead pipe through the air,
which will spin in 3D toward the audience in the final film.

Dashwood shot the short film with two Iconix cameras, which he custom rigged, and recorded to HDCAM SR. “Because we were doing fights, we had to do the complete shoot in three days,” he says. “It was more like a commercial shoot, and that worked out to one fight per day. We didn’t have time to use beam splitters. We knew that our interocular had to be quite small. I did the calculations for the majority of our shots, which we pre-planned. And our interocular had to be between 35 and 45 mm most of the time.”
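Dashwood doesn’t specify which formula he used for those pre-planned interocular calculations, but a common on-set starting point is the “1/30 rule,” which sets the stereo base to roughly one-thirtieth of the distance to the nearest subject. The sketch below is purely illustrative (the function name and the 1.2 m subject distance are assumptions, not figures from the shoot); it simply shows how that rule of thumb lands in the same 35–45 mm neighborhood Dashwood quotes.

```python
def interaxial_rule_of_thumb(nearest_subject_m: float, divisor: float = 30.0) -> float:
    """Classic "1/30 rule" for stereo base: interaxial distance in mm is
    approximately the nearest-subject distance divided by 30. A rough
    starting point only; real stereographers refine it per shot."""
    return nearest_subject_m * 1000.0 / divisor

# A fight performer about 1.2 m from the rig suggests a base of about 40 mm,
# inside the 35-45 mm range Dashwood describes.
print(round(interaxial_rule_of_thumb(1.2)))  # 40
```

In practice the divisor shrinks for long lenses or large-screen exhibition, which is one reason the team re-checked results on a projection screen rather than trusting the monitor alone.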


Tim Dashwood with his custom Iconix stereo rig

He chose the Sony SRW-1 HDCAM SR VTR deck because it can interleave the left and right signals into a single signal. “I input that signal coming out of the deck,” he says, “to our Mac Pro on set. Using a live processing version of my plug-in, Stereo3D Toolbox, as a self-contained app, I was able to adjust convergence and adjust for disparities and show a live, corrected image on the JVC 46-inch 3D monitor. The client was able to sit back in the client area and see live 3D, in its correct form, on the monitor.”
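The article doesn’t reveal how Stereo3D Toolbox implements its live convergence adjustment internally, but the standard post technique is horizontal image translation (HIT): sliding one eye’s image left or right shifts every disparity by the same amount, moving the apparent convergence plane. A minimal single-scanline sketch, with a hypothetical function name and a plain list-of-pixels model rather than the plug-in’s actual API:

```python
def shift_convergence(row, shift_px, fill=0):
    """Horizontal image translation on one scanline of one eye's image.
    Positive shift_px slides the image right, negative slides it left;
    vacated pixels are padded with `fill` (black). Shifting one eye
    relative to the other changes where the two images converge in depth."""
    width = len(row)
    if shift_px > 0:
        return [fill] * shift_px + row[:width - shift_px]
    if shift_px < 0:
        return row[-shift_px:] + [fill] * (-shift_px)
    return list(row)

# Shifting the right eye's scanline one pixel to the right:
print(shift_convergence([10, 20, 30, 40], 1))   # [0, 10, 20, 30]
# ...and one pixel to the left:
print(shift_convergence([10, 20, 30, 40], -1))  # [20, 30, 40, 0]
```

Because HIT is just a translation, it can run in real time on an interleaved feed like the one Dashwood pulled from the SRW-1, which is consistent with his description of showing a live, corrected image to the client.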

On Set 3D Post
Minimizing the time from shoot to edit was the team’s number one goal, says Dashwood. With the SRW-1, he was able to digitize all the dailies immediately and start editing right away. “On set we took that interleaved signal that the deck produces and digitized it, so there was no extra time spent digitizing left and right eyes,” he explains. “We put together the rough cuts, they became fine cuts, and then they became locked. At the point they became locked, we then captured the separate left and right streams, in full 1920×1080, for just those pieces we actually needed in the cut. This way, we’re not wasting any time digitizing stuff we aren’t going to use. We were also recording to a single HDCAM SR tape in dual-stream mode, which means that both eyes are perfectly synchronized on one tape with the exact same timecode. The deck can also output the left and right eyes on separate outputs.”

After the full HD capture, Dashwood used his plug-in, Stereo3D Toolbox, to adjust for convergence. “We basically sent an EDL to our background effects folks in LA, and they generated the fully rendered backgrounds from the mocap data,” he says. “They sent us back stereo renders for both eyes. We composited in After Effects and added in particle generators. For example, we have a scene with snow, so we used Apple Motion to create that effect.”


The client, Nada Shepard (in hat), on set with Dashwood

The final scenes were output as a series of DPX images, which will then be converted into a digital cinema package by an area lab (at press time, still under discussion). Dashwood, who supervised only picture post, says the audio mix was handled by another facility.


The Stereo3D Toolbox interface

Was he satisfied with the results, given such a tight schedule? “We were very, very conservative on our 3D, since this will be displayed in a cinema setting,” says Dashwood. “Even though we have our big JVC monitor, we still had to go back to our 10-foot screen in the studio to make sure it was working. Not a 30-foot screen, but it’s a lot closer to what it’s going to look like than 46 inches. That’s why we did all the math along the way. This project is really a test for version 2 of my plug-in, which has the math built into it. You’ll be able to tell it that you’ll be projecting on a 30-foot screen and that you want your maximum parallax to be 0.3% and it will draw a grid for you.”
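The math Dashwood describes is straightforward percentage arithmetic: a parallax limit expressed as a fraction of screen width translates into both a pixel budget on the recorded frame and a physical separation on the projection screen. The sketch below just works through the numbers he quotes; the function names and the metric conversion of a 30-foot screen (about 9.14 m) are illustrative, not taken from the plug-in.

```python
def max_parallax_px(image_width_px: int, parallax_pct: float) -> float:
    """Parallax budget in pixels: the stated percentage of frame width."""
    return image_width_px * parallax_pct / 100.0

def parallax_on_screen_mm(screen_width_m: float, parallax_pct: float) -> float:
    """The same budget as a physical left/right separation on the screen."""
    return screen_width_m * 1000.0 * parallax_pct / 100.0

# On the 1920-pixel-wide HDCAM SR frame, a 0.3% budget is under 6 pixels:
print(max_parallax_px(1920, 0.3))               # 5.76
# On a ~9.14 m (30-foot) screen, that works out to roughly 27 mm of
# separation -- comfortably below typical adult eye spacing (~65 mm),
# which is why such a small percentage counts as conservative 3D.
print(round(parallax_on_screen_mm(9.14, 0.3)))  # 27
```

The same pixel disparity grows physically with screen size, which is exactly why a shot that looks fine on a 46-inch monitor must be re-checked on a projection screen.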

For more information about Stereo3D Toolbox and the upcoming version 2.0, which is scheduled to ship at NAB 2010, visit the Stereo3D Toolbox website.