World War 7 Brings 3D Glasses to the Rock Concert
Kaleidoscopic, Psychedelic Stereo VFX to Light Up Saint Motel Gig
Josh Ferrazzano: We wanted to make sure that all the components played off each other, as opposed to the band playing a soundtrack to the visuals or the visuals being music videos for the songs. We wanted to create a synergetic and synaesthetic experience. We’ve mapped out moments when screen events will occupy stage space on either side of the band, or on top of the band, to find that synchronicity.
How did you develop the project? Were you approached from outside, or was it conceived in house?
We created the experience from the ground up, here at WW7, without a client attached, so it could be what we wanted instead of a work-for-hire project. It merges a number of different passions of ours, from live music to the visual arts. The first step was to find a venue that could accommodate it, so we approached the Soho House, which has a terrific event space. It’s a very comfortable screening room with a Dolby 3D projection system, 50 velvet-armed chairs, and a VIP feeling. They loved the idea. Then we reached out to our friends in Saint Motel. [Lead vocalist and guitarist] A/J Jackson works with us quite a bit as one of our video editors. They’re really creative in their live shows and are fun people to collaborate with.
Is it a very tightly choreographed show?
I’m a big fan of having everything timed out in advance, as opposed to the VJ approach. Back in the rave days of the 1990s, I used to do a lot of visuals at parties and clubs, and I felt like effects generated in real time take on a screen-saver vibe. It may be audio-reactive, but you’re always playing catch-up and you’ll never reach that level of sync that blows people’s minds. We map it out meticulously ahead of time, and the DCP has a click track attached that gets sent to the drummer. We’re creating synchronicities between the audio and the visuals, everything from big audio events triggering a big change on screen to little flourishes in each frame that narrate some of what we’re hearing.
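The idea of locking the drummer to the picture via a click track can be sketched in a few lines. This is a hedged illustration, not the studio's actual tooling (which the interview doesn't describe): the tempo and beat count are made-up values, and the only fact assumed from outside the interview is that DCP playback runs at 24 frames per second.

```python
# Hedged sketch: computing where each click of a click track lands in time,
# and mapping those clicks onto cinema frames so picture events can be cued.
# The bpm/beat values are illustrative; only the 24 fps DCP rate is standard.

def click_times(bpm: float, beats: int):
    """Return the time in seconds of each click at a given tempo."""
    return [beat * 60.0 / bpm for beat in range(beats)]

def beat_to_frame(t: float, fps: float = 24.0) -> int:
    """Map a click time onto the nearest frame (DCPs play back at 24 fps)."""
    return round(t * fps)

times = click_times(bpm=120, beats=4)        # [0.0, 0.5, 1.0, 1.5]
frames = [beat_to_frame(t) for t in times]   # [0, 12, 24, 36]
```

With a table like `frames` in hand, a "big audio event" can be authored to hit the exact frame where the visual change happens, rather than chasing the audio in real time.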
What cameras did you shoot live footage with? And what kinds of scenes did you shoot?
We employed everything from RED Epics on a 21st Century 3D beamsplitter rig to Panasonic A1s in an underwater housing to DSLR cameras with a synchronized intervalometer for long-exposure time-lapse in 3D. We’re not just doing time-lapse of sunsets. We’re actually painting with lights. People have seen those effects in 2D, but not in 3D. You’re able to resolve far more layers in kaleidoscopes and composites when there is separation on the Z axis, and light paintings take on a sculptural quality once the light streaks have volume. We did everything from shoots as complicated as underwater 3D photography with the synchronized swim troupe the Aqualillies and night-time time-lapse in the canyons of Joshua Tree National Park to lo-fi camera gimmicks. We created a mirrorbox – like an aquarium with mirrors on each side facing in and a piece of two-way glass you shoot through to create an infinity-mirror effect. You put stuff inside there, and it’s a little 3D gimmick that allows you to do in-camera psychedelics.
What software did you use in post?
Most of it was [Adobe] After Effects, and there was a lot of Trapcode Particular, Form, and Echospace. We worked in 3D space in After Effects and created multi-layer kaleidoscope tunnels to move through in 3D. We took underwater footage and added the kaleidoscopic effect by cropping and compositing the image several times. One of the lovely things about working in a computer is the 3D is always perfect – unlike when you’re submerging a camera and crossing your fingers, just hoping it’ll look good.
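The crop-and-mirror compositing described above can be illustrated outside After Effects. This pure-Python toy is a minimal sketch of the principle only, not the studio's actual workflow: it treats a frame as a 2D grid of pixel values, crops one quadrant, and mirrors it into the other three, which is the basic move behind a kaleidoscope composite. All names and values here are illustrative.

```python
# Minimal sketch of kaleidoscope compositing: crop one wedge of the frame,
# then mirror it across both axes so the result is symmetric in four quadrants.
# A real kaleidoscope repeats this with rotated wedges and many more layers.

def kaleidoscope(frame):
    """Mirror the top-left quadrant of a 2D grid into all four quadrants."""
    h, w = len(frame), len(frame[0])
    half_h, half_w = h // 2, w // 2
    tile = [row[:half_w] for row in frame[:half_h]]  # crop the source wedge
    out = [[0] * w for _ in range(h)]
    for y in range(half_h):
        for x in range(half_w):
            v = tile[y][x]
            out[y][x] = v                    # top-left: original wedge
            out[y][w - 1 - x] = v            # top-right: flipped horizontally
            out[h - 1 - y][x] = v            # bottom-left: flipped vertically
            out[h - 1 - y][w - 1 - x] = v    # bottom-right: flipped both ways
    return out

# Tiny demo: a 4x4 frame whose values make the symmetry easy to see.
frame = [[10 * y + x for x in range(4)] for y in range(4)]
mirrored = kaleidoscope(frame)  # each row reads the same forwards and backwards
```

In a compositor the same operation is stacked in Z-depth, which is why, as Ferrazzano notes, the layers separate so much more clearly in 3D than in a flat composite.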
Well, you had us at “synchronized swimming in 3D.”
It’s the most psychedelic take on underwater synchronized swimming you’ve ever seen! We’ve got tons of immersive motion graphics and kaleidoscopes. We’re just trying to push the medium to a place where people haven’t seen it yet.
For more information: www.worldwarseven.com.