While news from the exhibition floor of SIGGRAPH was a little ho-hum, there were a number of excellent panels and presentations upstairs at the convention. One of the most popular was a full day on stereo 3D and how filmmakers are trying to move away from stereo as a gimmick towards it being a true cinema language, just like color and sound. Representatives from Sony, Pixar and Dreamworks discussed the present challenges and strategies, as well as the future of 3D as a medium.
Up and Toy Story 3D
Bob Whitehill, stereoscopic supervisor at Pixar, used Up, Toy Story and Toy Story 2 to illustrate how they tried to implement a stereo language in the films. “We are trying to use a more subtle, kinder 3D. We want to answer the question ‘Why is this movie in 3D?’”
Up, the story of an elderly man, recently widowed, who ties balloons to his house and goes on a spectacular aerial adventure, implemented a visual language of circles and squares to structure the layout and how stereo imaging would work in the film.
“We used this general idea in the planning and layout stages to design the film. We used long lenses on the square sequences and wider lenses on the circle ones, and had a relatively shallow space in the square scenes and deep space in the circle scenes,” notes Whitehill. “We created a graph of how we would use stereo. In the beginning, when the character is happy, there is a deep space; then it flattens out when he loses his wife, and then it slowly increases throughout the film. Just like the lack of color in dark scenes makes the vibrant images stand out more, so do the flat scenes enhance the scenes where you are more aggressive with the 3D.”
On Toy Story and Toy Story 2 the 3D language was more complex. “We approached the 3D in these films in two main veins: environment and emotion. In the environment you have the toy world versus the human world. I wanted the toy world, where the toys are together and have a community, to feel safe, so the space was flattened out. When the toys are alone in the human world the depth is much greater…. This changes through the film. We also broke the film down into themes of safety versus risk. So while the shallow space is comfortable it becomes untenable. It is not a place you can remain in. You want to make the choice to experience the richness of life, so as the film progresses we dial up the stereo.”
Whitehill theorizes that it will take some time for filmmakers to truly understand how to use 3D, just as it did for sound and color. “In the future I imagine success in 3D is going to happen when we marry the ‘wow’ moments of 3D and also learn how to use it subtly to emotionally tell a story.”
Monsters vs. Aliens
A team from Dreamworks Animation discussed how they built an entire 3D pipeline for Monsters vs. Aliens so they could author in stereo from the earliest stages of layout all the way through final QC. While Dreamworks had produced 3D projects for special venues, Monsters vs. Aliens was its first feature film, which was a “different animal altogether,” notes Ken Bielenberg, visual effects supervisor. “We wanted to stay away from using stereo as a gimmick. That tends to pull the audience out of the movie. The things that worked for us creating projects for theme parks didn’t work for a feature. We had a clear mandate not to use stereo as a gimmick and only use it as part of the storytelling process.”
Authoring in stereo required building solutions that let different artists view the project in 3D at any time. “We did a lot of work to figure out how our artists could preview stereo at their workstations – every artist from editorial through lighting. It required separate hardware/software solutions for all of those different setups. It took a long time to figure out, but we knew if we didn’t make it easy for artists to preview, they wouldn’t work in stereo. Each department dealt with stereo, but each one was impacted to varying degrees,” explains Bielenberg.
While the engineers were handling the technical aspects of the 3D pipeline configuration, the artists were going back to school to learn about the artistry of stereo. The first step was a hands-on stereo photography workshop for the layout artists, who went out and shot with stereo still cameras and then reconvened as a ‘class’ to review the images. “They came back with some great stuff and some stuff that was absolutely horrific,” says Phil ‘Captain 3D’ McNally. “But they learned what made good 3D and what made bad 3D. It gave them the basics of 3D and how to view environments.”
With the basics of stereo photography in hand, they then had to understand how stereo works in a theater and how the eye perceives a 3D image on a flat screen. Normal techniques that work in 2D filmmaking can literally cause headaches in 3D.
“In 2D we use a 50mm lens a lot, but what we found is that a 50mm lens in stereo is quite an aggressive setting when you introduce the background and foreground. Why is 50mm painful when it matches human vision? Because the theater doesn’t match human vision,” says McNally. “So finding a lens that matches the viewing space of a theater can make your stereo work easy. We found a 24mm lens is the easiest to set. When the lens is longer, it tends to flatten out the characters and make them look like cardboard cutouts when they are in stereo,” says McNally. “The whole point of stereo is re-thinking filmmaking in a spatial way, and you have to get a grip on basic skills and theories before you can actually think about what you want to do with stereo.”
Editing a 3D film becomes a big challenge since, in addition to getting the cuts to work in a linear fashion, they now also have to work spatially.
“From one cut to the next, the stereo effect often jumps from behind the screen to in front of it. It’s not as simple as making the shots the same,” McNally explains. “Otherwise your whole movie would look the same, with everything behind or in front of the screen. So we animate the depth across the cut, and we used it extensively. There have been studies showing it takes your eyes a fraction of a second to adjust from looking at something near to something far. So we essentially do the same thing. If one shot has the effect behind the screen and then we go to another cut, we animate the stereo effect so the first frame is behind the screen, and then each frame we bring it forward, and after 10 frames we have the correct stereo setting we want for that shot. It is something you should absolutely not notice in the movie, but you need it to blend shots together.
“Your first job in creating stereo is to do no harm. Then your job is to build creatively on top of that. There’s a real core toolset of knowledge you need to be able to create a film that audiences can watch for 90 minutes. But there’s also some room to play around and I think having the film be in stereo from the beginning all the way through production really helps creatively and technically.”
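The depth-animation trick McNally describes — easing the stereo setting from the outgoing shot's value to the incoming shot's target over about ten frames — can be sketched roughly as follows. The function name and the simple linear ramp are illustrative assumptions, not DreamWorks' actual tooling:

```python
# Hypothetical sketch: ramp a stereo convergence value across a cut so
# the incoming shot starts at the outgoing shot's depth and settles on
# its own target after a handful of frames.

def ramp_convergence(prev_value, target_value, frames=10):
    """Return per-frame convergence values for the first `frames` frames
    of the incoming shot, easing linearly from prev_value to target_value."""
    if frames < 1:
        return [target_value]
    step = (target_value - prev_value) / frames
    return [prev_value + step * (i + 1) for i in range(frames)]

# Example: outgoing shot sits behind the screen (-0.5 in arbitrary
# depth units), incoming shot wants +0.3 in front; blend over 10 frames.
curve = ramp_convergence(-0.5, 0.3, frames=10)
```

As in the article, the ramp is meant to be invisible: each frame moves a fraction of the total distance, and by the last frame the shot has reached its intended stereo setting.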
Mahesh Ramasubramanian, digital supervisor on Monsters vs. Aliens, discussed the visual artifacts that invariably pop up in a stereo production. He identified a number of common problems and how they were solved.
- Texturing. Problem: Often a shot looks fine until the texturing and lighting are added. The lighting reveals the depth setting on the shot and causes excessive eye separation in the foreground elements. Solution: Adjust the position of the camera to remove the extreme foreground.
- Incorrect reflections. Problem: Reflections look stuck to the object rather than receding through it. This happens when you mistakenly render the reflections with the wrong eye’s camera setting, which makes the reflections look like shadows stuck to the floor rather than going in depth beneath it. Solution: Check your renders.
- Camera mismatches. Problem: The layers don’t look cohesive. When you render different layers, or different lights at different times, or when changes occur, it is a common error for the settings not to match. This is probably the most difficult problem to catch because the changes are subtle. Solution: Verify that the camera settings are identical across all layers.
- Ghosting. Problem: A double image. A little light leaks into the other eye, producing a subtle double image that is most noticeable where high contrast combines with a lot of eye separation. Solution: We have two controls here. We can creatively de-contrast the image and pull it a little closer to the screen. Additionally, we can use procedural crosstalk-canceling software that identifies the narrow regions around the edges and de-contrasts them.
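The ghosting mitigation above — reducing contrast only where eye separation is large — can be sketched as a simple per-pixel blend. This is a rough illustration under assumed names and a linear weighting, not the studio's actual software:

```python
import numpy as np

# Hypothetical sketch: ghosting is worst where high contrast meets large
# eye separation, so blend pixels toward mid-gray in proportion to their
# parallax. Thresholds and the linear weighting are assumptions.

def decontrast_by_parallax(image, parallax, max_parallax=20.0, strength=0.5):
    """image: float array in [0, 1]; parallax: per-pixel separation in pixels.
    Pixels with larger parallax are pulled toward mid-gray (0.5)."""
    weight = np.clip(np.abs(parallax) / max_parallax, 0.0, 1.0) * strength
    return image * (1.0 - weight) + 0.5 * weight

frame = np.array([[0.0, 1.0], [1.0, 0.0]])   # a high-contrast frame
sep = np.array([[0.0, 20.0], [20.0, 0.0]])   # parallax map, in pixels
softened = decontrast_by_parallax(frame, sep)
```

Pixels with no separation are untouched, while high-parallax highlights lose enough contrast that leakage into the other eye becomes less visible.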
“One might think that matte painting would play less of a role with stereo. But this is hardly the case,” explains Ramasubramanian. “The consequence of having a huge robot in the film is that we have big, wide sets. We had to rethink the way we painted clouds. Normally we would paint the clouds on a dome. But for this we projected our painted clouds onto planes that were separated from the sky. This really helps with the spatial location of the clouds. You still have to paint in perspective. You have to have a motion parallax between the layers. We ended up stacking the clouds, but this was still not enough. We expected more depth from our matte paintings, so we had to come up with ways to create hyper-depth, which is not optically correct but helped bring out the matte painting.”
For a shot of a spaceship traveling through clouds, they employed a number of techniques to achieve depth. “In compositing we separated the cloud layer by one pixel from the ground beneath it. Then we used a multi-camera rig, with the spaceship in one camera and the rest of the set in another, so that the spaceship was separated from the clouds. Finally, we projected the foreground clouds onto volumes to add more depth to the shot,” says Ramasubramanian.
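The one-pixel layer separation Ramasubramanian mentions relies on a basic stereo compositing idea: shifting a layer horizontally by a small, opposite amount in each eye places it at a different apparent depth. A minimal sketch, with assumed names and a simple wrap-around shift standing in for a real compositor's transform:

```python
import numpy as np

# Hypothetical sketch: offset a composite layer horizontally in opposite
# directions for the left and right eyes to give it stereo parallax.
# np.roll wraps at the edges; a production tool would pad instead.

def shift_layer(layer, parallax_px):
    """Return (left_eye, right_eye) views of `layer`, shifted by
    +parallax_px and -parallax_px pixels along the horizontal axis."""
    left = np.roll(layer, parallax_px, axis=1)
    right = np.roll(layer, -parallax_px, axis=1)
    return left, right

clouds = np.arange(12.0).reshape(3, 4)          # stand-in cloud layer
left_eye, right_eye = shift_layer(clouds, 1)    # one-pixel separation
```

Even a single pixel of separation, as in the shot described, is enough to pull a layer off the plane of the elements beneath it.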
Benefits of Authoring in 3D
While there were significant costs in getting every artist in the studio set up for stereo viewing, in the end it saved money and benefited the film creatively.
“Because we were doing the effects before the editorial was locked, we could influence the way the picture was cut in a way we couldn’t if we were treating it as a post process. In many cases the effects department would request changes to lens choice or camera position, and in certain cases the addition of shots, in order to get the most depth from the effects we were doing. When there were problems we could deal with them as they arose instead of having to go back to the beginning and do it all over.”