Water, Water Everywhere and Every Drop to Sink the Biggest Digital Ship Ever Built

There’s nothing complex about Poseidon’s story – a big ship gets slammed by a giant wave, rolls over, and everyone tries to escape. But making it believable took some of the most complex visual effects shots yet created.

Director Wolfgang Petersen (Das Boot, The Perfect Storm), no stranger to water-filled disaster films, gave visual-effects supervisor Boyd Shermis three objectives. “He said, ‘I want to out-storm The Perfect Storm, out-Titanic Titanic, and I want everything to be photo-fucking-real.’ I promised him I would make it photoreal.”

All told, the film includes around 535 of the 900 visual effects shots created in post-production. “The post schedule was so compressed, we had to start shots even if there was only a remote idea they would be in the film,” says Shermis.

The Moving Picture Company (MPC; London) handled interior shots of water crashing through the lobby and ballroom. Industrial Light & Magic (ILM; San Francisco) built the digital luxury liner, created the ocean around the ship, and the giant wave that rocks the boat. CIS (Hollywood) built the ship’s galley, a corridor, the thruster room, and a raging torrent of water in the corridor. Hydraulx (Santa Monica) composited elements for the engine room from miniature shots and then built a 3D version. Giant Killer Robots (San Francisco) created a nightclub sequence, shots that take place in an elevator shaft, and 3D stunt actors. Pixel Playground (Encino, CA) helped survivors crawl through air conditioning ducts. And, Frantic Films (Los Angeles) worked on the previs.

“We spent a great deal of time in previs working with virtual camera lenses and camera angles that would make the ship look big,” says Shermis. “It’s easy to make it look like a toy boat.”

Even though sets and miniatures were used, and actors were filmed in water tanks, some of the most dramatic sequences were digital or augmented with digital sets and digital water.

Am I Blue?
Trying to shoot blue-screen in sets filled with dripping water and stainless steel led Shermis to revive old techniques. “We couldn’t do push-button blue-screen,” he says. The problem was reflections. In the corridor set, for example, the blue screen reflected in the dripping water and wet floor, turning the entire set blue. So Shermis went with a gray screen. “We’d light a screen at a gray value that was appropriate for whatever light was bouncing into the set and at a luminance value to give us a good guideline for accurate rotoscoping,” he says. “We did everything at 4K, so we were down to the grain level. We could see every hair.”

Similarly, the stainless steel galley set caused problems. For this, Shermis used a front-projection blue screen. “We managed to dig out retro-reflective screens from warehouses to get almost-blue screens in these otherwise reflective environments,” he says. The screens were made of Scotchlite, the same technology used for reflective traffic signs. To project blue light onto them, he circled the camera with a series of four-watt LED pinlights. “The light comes back 10 times brighter than the dim light that was sent out, and it comes back focused,” Shermis says. “If you don’t look through the lens, the screen looks gray. If you look through the lens, it’s bright as day.” CIS extended those shots to create the 100-foot-long galley.

But that was smooth sailing compared to some scenes. Digital water and detailed CG simulations push the state of the art in this film, inside and out. When the ship rolls over, a huge ballroom filled with people turns upside down, as does an eight-story lobby. Inside the sinking, upside-down ship, MPC, with the help of Munich-based Scanline Production’s fluid-simulation engine, developed a potent mix of software to create life-threatening sequences. “We can’t roll 200 people around on a stage,” says Shermis. “We had to roll the ballroom in visual effects. And in the lobby, we wanted to boom the camera down 40 feet to the hero characters at the bottom. That lent itself better to CG.”

Outside, size mattered. “You can really only scale water to one-fifth, and I wouldn’t do that happily,” says Shermis. “At one-fifth scale, a miniature of the ship would be 250 feet and the wave would be 20 feet. It wasn’t practical.” Instead, ILM made waves with new simulation technology and created a huge, highly detailed, digital boat. “I think ILM would describe one or two of their shots as the most difficult they’ve ever done,” says Shermis. “They did 150 shots, but it isn’t about quantity. It’s about complexity.”

Inside Story
For the interior work, MPC built four massive set extensions: the ballroom and lobby both right-side-up and inverted. A New Year’s Eve celebration in the ballroom was filmed on a gimbaled rig 30 feet square so that when the waves hit, it could tip. But then, everything became CG. MPC extended the small gimbaled set, which had no walls or ceiling, into an environment more than 100 feet square and 30 feet tall. They filled it with digital people and CG objects – 40 tables with tablecloths, each with five table settings, food, glasses, flower bouquets, and chairs. On the balcony level, they built casino tables, cards, and croupiers. All that plus confetti, balloons, streamers and more flew around the room when the boat rolled. The chaos was created and managed by MPC’s proprietary dynamics system, PAPI, a physics API that handles large-scale rigid-body simulations.

“We had hundreds of thousands of objects all being simulated at once,” says Chas Jarrett, visual effects supervisor at MPC. Ideally, they did the simulations in layers using one-way simulation – that is, an object tumbles and breaks apart without affecting or being affected by other objects. They started with the heaviest objects, like tables and chairs, and then added layers of people, plates, food, cutlery, glasses, confetti and balloons. Sometimes, though, the people had to interact with the furniture, so slower, two-way simulation became necessary.
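The layered approach Jarrett describes can be sketched in a few lines. This is a toy illustration of one-way coupling, not MPC’s PAPI: each layer treats the already-simulated heavier objects as static colliders, so a light object reacts to a table but can never push back on it. (The 0.8m table height, restitution value, and time step are illustrative assumptions.)

```python
# Illustrative sketch of one-way, layered rigid-body simulation
# (not MPC's PAPI; all numbers are assumptions for the example).

def simulate_layer(objects, colliders, dt, steps):
    """Advance `objects` under gravity, bouncing off fixed `colliders`.

    `colliders` is a list of y-heights of already-simulated heavier
    objects, treated as static within this layer (one-way coupling).
    """
    g = -9.8
    for _ in range(steps):
        for obj in objects:
            obj["vy"] += g * dt
            obj["y"] += obj["vy"] * dt
            for floor_y in colliders:
                if obj["y"] < floor_y:      # penetrated a collider
                    obj["y"] = floor_y
                    obj["vy"] *= -0.4       # lossy bounce, so it settles
    return objects

# Layer 1: a heavy table settles onto the floor (collider at y = 0).
tables = simulate_layer([{"y": 3.0, "vy": 0.0}], [0.0], dt=0.01, steps=500)

# Layer 2: cutlery falls onto the now-fixed table top (one-way coupling:
# the cutlery cannot move the table, so layer 1 never needs re-running).
table_top = tables[0]["y"] + 0.8            # assume a 0.8m-tall table
cutlery = simulate_layer([{"y": 5.0, "vy": 0.0}], [table_top],
                         dt=0.01, steps=500)
```

Because each layer depends only on layers already finished, they can be simulated, cached, and art-directed independently; the expensive two-way case is reserved for the few shots where a person really does shove the furniture.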

For the digital people, MPC cyber-scanned more than 50 stuntpeople and created basic animation cycles using motion capture data. To make the digital people look realistic as the tumbling ship tossed them around, MPC integrated its crowd-simulation system, Alice, which managed digital extras for Troy, Batman Begins, and Kingdom of Heaven, into the rigid-body system. As a result, the team could blend motion-capture movement with physical simulations. “A motion-captured person could throw a dynamic object and then react to getting knocked over with physical simulation,” says Jarrett. To simulate cloth – clothing, curtains, tablecloths – MPC used Syflex software.
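The mocap-to-physics handoff Jarrett mentions amounts to a cross-fade: before an impact the joint pose comes from the captured clip, and afterward a blend weight ramps toward the simulated "ragdoll" pose. A minimal sketch of that idea (not MPC's Alice; frame counts and the per-joint-angle blend are assumptions, and a production rig would interpolate rotations properly rather than lerp angles):

```python
# Hypothetical sketch of blending motion capture into physical simulation
# after an impact (not MPC's Alice system).

def blend_weight(frame, impact_frame, blend_frames=10):
    """0.0 = pure mocap, 1.0 = pure physics; ramps after the impact."""
    if frame < impact_frame:
        return 0.0
    return min(1.0, (frame - impact_frame) / blend_frames)

def blended_pose(mocap_pose, physics_pose, w):
    """Linear blend per joint angle (a real rig would slerp quaternions)."""
    return [(1.0 - w) * m + w * p for m, p in zip(mocap_pose, physics_pose)]

# A character gets knocked over at frame 100: mocap drives frames 0-99,
# then the simulated pose takes over across a 10-frame window.
w = blend_weight(frame=105, impact_frame=100)          # 0.5, mid-blend
pose = blended_pose([0.0, 10.0], [2.0, 20.0], w)       # halfway poses
```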

For water, which pours in through imploding windows, the team used Scanline’s FlowLine fluid-simulation software. “Because FlowLine and PAPI could share information, we could have fluid pouring around objects in a fully two-way simulation,” says Jarrett. Scanline, a visual-effects studio that developed FlowLine for its own projects, entered an agreement with MPC for Poseidon, and Scanline head Stephan Trojansky worked at MPC during the production as the digital water and fire supervisor. “Visual-effects companies are quite secretive about their internal technologies, so this was a big step for us,” Trojansky says. “But we thought if you don’t try, you’ll never find out how such a relationship can feel.”

The fluid simulations were most dramatic in the lobby sequence. The lead actors worked in a four-story set with no water. Everything else was CG. In the reveal, a camera corkscrews down through the eight-story glass atrium, past people walking on balconies and riding lifts. When the ship rolls, an eight-story staircase disintegrates into rubble, one of the lifts rips away from the wall to form a bridge from one side to the other, and CG characters topple and fall. Water rises inside the upside-down environment, and objects break and fall into the water.

Jarrett describes the most impressive lobby shot, an enormous simulation created with FlowLine, rendered with Mental Ray, and composited in Shake where motion blur was added. “CG oil pours through a hole in the ceiling,” he says, “and splashes into CG water so we have a 100-foot-high column of oil. Sparks ignite the oil on the surface of the water and then the column ignites. We did all the dust, water, splashes, fire and smoke in one go with FlowLine. Of course, the sequence took weeks and weeks, but a single simulation of that scale would go through the system overnight. FlowLine was fast enough and the interface was intuitive enough that we could be experimental.”

Shermis was pleased with the result – and the process. “Scanline has done a really unique thing,” says Shermis. “FlowLine is incredibly user-friendly and accurate. The sequence where we mixed oil with water and had fire emitting smoke in one combined simulation was cutting edge.”

Equally cutting edge were shots of the ship on the outside.

Outside the Norm
The film opens with a tour de force, 4300-frame sequence in which the camera starts underwater and pulls up to follow actor Josh Lucas jogging around the deck of the 1100-foot-long, 234-foot-tall ship. Everything in the shot except Lucas was created by ILM. “It was the biggest render I’ve ever been involved with,” says Kim Libreri, visual effects supervisor at ILM. “All 64-bit. Global illumination. Completely insane. We see him run under the davits. We see ashtrays, people’s towels, people walking around on the deck, all computer generated. The hot tub has a fluid simulation for the bubbles. There are caustics in the swimming pool.”

The digital luxury liner had 382 cabins, 876 portholes, 681 lounge chairs, 73 towels, and views through the windows of 200 cabins plus the ballroom. Digital people populate the cabins and the decks. The front decks contain a library, bar and exercise room. The rear decks sport the cafeteria, marine room, and viewing area. Modelers built 181,579 individual renderable pieces. The ship carries 11GB of mip-mapped textures. At night, the ship glows like a Las Vegas casino with 1000 CG lights and thousands of simulated lights.

A digital ocean surrounds the ship, and all of it had to be simulated to capture the massive turbulence the behemoth creates. The fluid simulation in the opening shot required 1.4TB of data, and the whole shot 5TB. “We rendered it on the equivalent of 1200 processors for three days,” says Pat Conran, digital production supervisor.

To simulate the water for this shot and others, ILM used its PhysBAM particle level set (PLS) fluid solver, created in collaboration with Ron Fedkiw at Stanford University. The studio had used the simulation engine before – to create the movement of Terminator 3’s liquid chrome, to pour wine through a skeleton’s body in the first Pirates of the Caribbean, and to flow water off a magic ship in the most recent Harry Potter film – but never at the scale needed for Poseidon. Shermis’ goal was to have the simulation engine handle all the complex interaction between the water and the ship. The combination of Fedkiw, his PhD students, and Nick Rasmussen in ILM’s R&D department gave him what he wanted.

“We arrived at a fairly unified approach to getting the water surface, spray, underwater bubbles, foam on the surface, and debris all tied together with one simulation,” says Mohen Leo, associate visual effects supervisor. The only problem was that to achieve water-like behavior, the simulation needed to be run with high-resolution grids and high-res sims took between three and four days each. “The behavior of the water changes significantly between low-res and high-res grid resolutions,” says Leo. “But the high-res simulations can take several minutes per frame to compute. They require more than 8 GB of RAM and the data for a long shot can take almost half a TB of disk space.”
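Leo’s numbers track with how grid-based solvers scale: a particle level set stores a signed-distance field (plus velocity and pressure) on a dense 3D grid, so memory and work grow with the cube of resolution. A back-of-envelope check – our arithmetic, not ILM’s actual grid sizes or field counts:

```python
# Rough memory estimate for a dense fluid grid; the five-fields-per-cell
# assumption (signed distance, 3-component velocity, pressure) and the
# grid sizes are illustrative, not ILM's actual configuration.

def grid_memory_gb(nx, ny, nz, fields=5, bytes_per_value=4):
    """GB needed for `fields` float32 values per cell."""
    return nx * ny * nz * fields * bytes_per_value / 1024**3

low  = grid_memory_gb(256, 256, 256)   # 0.3125GB
high = grid_memory_gb(512, 512, 512)   # 2.5GB: doubling each axis = 8x
```

Double each axis again and the grid alone passes 8GB before counting escaped marker particles, intermediate buffers, or per-frame caches – which is why halving the resolution “significantly” changes the water’s behavior while raising it blows the memory and runtime budget.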

Stanford solved the computational problem by making the fluid-solver code run in parallel on multiple processors, and ILM moved to a 64-bit pipeline. The combination meant the ILM team could run the complex simulations overnight. “We saw details we never thought we would see,” says Libreri.

One byproduct of the simulation was that the engine ejected particles wherever the flow was especially dynamic. So, the ILM team added gravity and buoyancy to the particles and automatically created spray, bubbles, and foam. That was good enough for a first take of cresting waves and waves crashing down on the boat. Willi Geiger, who most recently helped simulate fiery lava for Star Wars: Episode III, refined the foam by feeding the particles into a particle system in ILM’s Zeno software.
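The gravity-and-buoyancy trick can be sketched simply: classify each escaped particle by which side of the water surface it is on, and let that choose the force. This is a hypothetical illustration, not ILM’s pipeline – the flat surface height, force magnitudes, and foam thresholds are all assumptions:

```python
# Illustrative secondary-particle pass (not ILM's implementation):
# particles above the surface get gravity (spray), particles below get
# buoyancy (bubbles), and slow particles sitting at the surface become foam.

def advect_secondary(particles, surface_y, dt, gravity=-9.8, buoyancy=6.0):
    foam = []
    for p in particles:
        if p["y"] > surface_y:
            p["vy"] += gravity * dt        # spray: ballistic fall
        else:
            p["vy"] += buoyancy * dt       # bubble: rises toward surface
        p["y"] += p["vy"] * dt
        # a slow particle near the surface is tagged as foam
        if abs(p["y"] - surface_y) < 0.05 and abs(p["vy"]) < 0.5:
            foam.append(p)
    return particles, foam

bubbles = [{"y": -2.0, "vy": 0.0}]         # starts 2m underwater
for _ in range(50):
    bubbles, _ = advect_secondary(bubbles, surface_y=0.0, dt=0.01)

spray = [{"y": 2.0, "vy": 0.0}]            # starts 2m above the surface
for _ in range(20):
    spray, _ = advect_secondary(spray, surface_y=0.0, dt=0.01)
```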

The rogue wave that hits the Poseidon, however, was art-directed; the fluid solver couldn’t do everything. “Its strength and weakness is that it makes the fluid behave like water,” says Leo. That meant it wouldn’t create a Hollywood version of a slowly moving 200-foot wave because, in reality, a wave that size would collapse. Thus, to give Petersen the threatening wall of water that he wanted for that story point, ILM sculpted patches and used smooth-particle hydrodynamics. But once the wave hits the boat, the fluid solver takes over to handle the interaction of the boat and all its debris with the crashing water.
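Smooth-particle hydrodynamics suits that kind of art direction because the fluid is just particles an artist can place; each particle’s density is a kernel-weighted sum over its neighbors, and pressure forces follow from that. A generic textbook sketch of the density step – Müller et al.’s standard poly6 kernel, not ILM’s implementation:

```python
# Minimal SPH density estimate (generic textbook form, not ILM's code).
import math

def poly6_kernel(r, h):
    """Poly6 smoothing kernel in 3D with support radius h."""
    if r >= h:
        return 0.0
    return 315.0 / (64.0 * math.pi * h**9) * (h * h - r * r) ** 3

def densities(positions, mass, h):
    """Kernel-weighted neighbor sum: density at each particle."""
    out = []
    for xi in positions:
        rho = sum(mass * poly6_kernel(math.dist(xi, xj), h)
                  for xj in positions)
        out.append(rho)
    return out

# Three clustered particles plus one isolated outlier: the clustered
# particles accumulate density from their neighbors, the outlier only
# from itself.
pts = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (-0.1, 0.0, 0.0), (1.0, 0.0, 0.0)]
d = densities(pts, mass=1.0, h=0.3)
```

Sculpted particle patches fed into a solver like this still respond locally like fluid, which is how an impossible 200-foot wall of water can move on cue yet splash convincingly.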

All in all, Shermis believes the effects in this film have broken new ground. “It’s an action-thriller picture,” he says. “The objective was to blow people away, and I think we did a good job of it.”