Chris Bobotis

How 360 Video Transitions Are Becoming Narrative Tools, and What's Next in VR Plug-Ins

Mettle just launched SkyBox 360/VR Transitions 2 ($149), a new package of four effects — light leaks, spherical blurs and more — that doubles the number of 360 transitions available in its SkyBox Suite ($499, all inclusive) of VR/360 post tools for Adobe After Effects and Premiere Pro. Mettle became known in the After Effects space with its FreeForm and ShapeShifter plug-ins, but the company has pushed hard into VR and 360-degree video, releasing two full versions of its SkyBox Studio plug-in for working with 360-degree content in After Effects in less than a year.

With technology in the VR/360 space moving quickly, we wanted to touch base with Bobotis to find out what Mettle’s getting up to in the final weeks before NAB 2017.

StudioDaily: How did you get so heavily involved in creating 360 workflow tools?

Chris Bobotis: Our first offerings were for After Effects. As I did site visits and traveled the world, both virtually and physically, I started to see that anyone involved in 360 was getting extremely busy. They could not keep up. I was seeing both big shops and freelancers, so I had a good cross-section of pain points in the industry. I realized that After Effects artists were jumping through hoops to create transitions. All of the transitions that were shipping caused seaming problems — they were built for flat cinema, where the edges were not important. But in 360, if you’re moving a pixel over the edge of the frame, it causes a seam line. If you’re doing color-grading, you’re not moving pixels around. You’re keeping pixels in place, and just changing the hue or luminance values. But when you’re running filters, you’re actually displacing pixels, and that’s what causes seam lines in spherical, or equirectangular, video. So I was watching these artists concoct all these little recipes to handle transitions. And I thought, “Wow, we can do this programmatically.” That was the technical reasoning, but the more I thought about it, the more I realized that transitions could start to become a narrative tool.
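
The seam problem described above can be sketched in a few lines of Python. This is a toy illustration, not Mettle's code: in an equirectangular frame, the left and right edges meet at the same longitude, so a filter that displaces pixels must wrap its indices around that axis, or the two edges stop matching and a visible seam appears at the back of the sphere.

```python
import numpy as np

# Toy equirectangular frame: the width spans 360 degrees of longitude,
# so the left and right edges depict the same scene direction and must match.
h, w = 4, 8
frame = np.arange(h * w, dtype=float).reshape(h, w)

def blur_flat(img, radius=1):
    """Naive horizontal box blur that clamps at the frame edges, as a
    flat-cinema filter would. The edge columns average over a truncated
    window, so the left and right edges diverge: a seam."""
    out = np.empty_like(img)
    for x in range(img.shape[1]):
        lo, hi = max(0, x - radius), min(img.shape[1], x + radius + 1)
        out[:, x] = img[:, lo:hi].mean(axis=1)
    return out

def blur_wrap(img, radius=1):
    """Seam-aware horizontal box blur: indices wrap around the 360-degree
    longitude axis, so pixels pushed past one edge re-enter on the other."""
    out = np.zeros_like(img)
    for dx in range(-radius, radius + 1):
        out += np.roll(img, dx, axis=1)
    return out / (2 * radius + 1)
```

In the interior of the frame the two filters agree; only at the edges does the wrap-around version keep the sphere continuous, which is the property a 360-aware transition has to preserve.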

What do you mean by that?

The luxury of a flat, or rectilinear, view is that, as the director, you control the viewer’s gaze. You can frame and block things to get your point across and tell your story. But in 360, there is no frame. The viewer can look where they want, when they want. So how do you resolve a cut or transition properly? If you’ve got time and money, you can block the scene, and your action will direct the viewer’s gaze to help resolve that cut. The other way to do it is with audio cues — you make a noise where you want them to look, and then cut. But what if the transitions can help that along? What if I animate a point of interest with these transitions? Even our chroma leaks and light leaks may have a subtle movement, say from screen left to screen middle, that’s just enough to influence the gaze. Now, all of a sudden, we’re creating new tools, and a transition isn’t as linear as it was before.
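
The gaze-nudging idea can be expressed as a simple animation curve. This is a hypothetical sketch, not how Mettle's plug-ins are implemented: the hotspot of a light leak eases along the longitude (yaw) axis from screen left toward screen center over the length of the transition, gently pulling the viewer's attention with it.

```python
def ease_in_out(t):
    """Smoothstep easing: slow start and finish, so the motion
    nudges the viewer's gaze rather than yanking it."""
    return t * t * (3.0 - 2.0 * t)

def leak_yaw(frame, n_frames, start_deg=-90.0, end_deg=0.0):
    """Yaw (longitude) of the light-leak hotspot at a given frame,
    drifting from screen left (-90 deg) to screen center (0 deg)."""
    t = frame / (n_frames - 1)
    return start_deg + (end_deg - start_deg) * ease_in_out(t)

# Over a 48-frame transition, the hotspot drifts left-to-center:
yaws = [leak_yaw(f, 48) for f in range(48)]
```

The function names and the 48-frame duration are made up for the example; the point is only that an eased yaw keyframe is enough to bias where the viewer is looking when the cut lands.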

So it’s a way to direct viewers in 360 space, similar to what directors have been doing in flat space for decades.

When you control action in a scene, you’re doing the same thing, but in the subtlest way. Now, in some documentary work you don’t have that luxury, and one of the big uses for 360 video is journalism and documentary work. That became an interesting challenge for us, and it was one of the biggest motivations for us to move forward. Some of the other tools we made for After Effects and Premiere Pro were cash cows, but I saw this as a great opportunity to become a thought leader. Transitions as we know them have to be reinvented, and here’s how we’re starting to see them. It complements our core offerings. It’s like what we did with our SkyBox VR Player for the Oculus Rift. You could view your content with your Oculus HMD in the Premiere Pro and After Effects ecosystem, saving a lot of rendering time. We also exposed a workspace mode, so you could continue to color-grade and tweak inside the Rift. We don’t expect you to do this day in and day out, but if you can spend 10 to 15 minutes with the headset on, you can design and edit in context.

Making the interface accessible with the headset on must be a challenge.

One of the recent improvements we made to the VR player was a virtual keyboard. You can orient yourself by touching any key it displays in the HUD. Most editors are comfortable with their left hand on the keyboard and their other hand on the mouse. So we’re going to expand on this — working in VR as much as creating in VR. But the best thing about the whole VR player is we decided to let people use it for free as a marketing tool. It’s our way of thanking the community, but also showing you that we think slightly differently. We actually understand and respect this new medium.

Who do you see as your target user?

We cater to somebody who’s very new to After Effects, who has taken just a day or two to understand how it works, through to the very experienced user. You can tap into certain modules we ship that are skewed to advanced users, but if you want to do simpler things, we’ve got something in there for you, too. We don’t force-feed them. As they build more content, they’ll find compelling reasons to move over to After Effects.

I was asking because it seems like more and more capability is being unlocked in Premiere. That means you can do more in the NLE — but it can also whet your appetite for the more powerful stuff in After Effects.

You’re seeing that divide close. Another reason we’ve started porting our products to Premiere Pro is the insane GPU renderer. There’s so much you can do so much faster in Premiere Pro. If you can do 75 percent of the work a lot faster [in Premiere], it frees up your time to do the finicky work in After Effects. It’s a huge productivity boost.

How many of your tools currently support stereo 360?

All of the Premiere Pro products support stereo 360. The After Effects products have started to support it in a limited fashion. We have internal builds that do full stereo 360, but we haven’t released them because the render times would be way too long given the current After Effects architecture. We already have working prototypes of everything. Last November, you started to see Gaussian blurs and other filters move into the same GPU renderer that Premiere Pro users have. Moving forward, you’re going to see After Effects get a lot faster, and stereo will become more of a thing for us.

What are you currently seeing as far as the uptake of 360 in the content creation community?

If we travel back 14 months, to when we started this venture, there were a lot of naysayers. A lot of people were just dabbling, but those dabblers now cannot keep up with the workload. Some small shops that caught my interest working on the periphery, with two or three people at most, have grown to 30 people in the span of a year, and all they’re doing is 360. So I’m seeing great success stories. Anyone involved in branded content can’t keep up. One of our customers is The New York Times. USA Today is another big customer.

What’s next for Mettle?

We’ve got eight 360/VR transitions shipping now. There are at least 20 more that we can build. These are things we haven’t seen in flat cinema before. You’ll see those deployed in batches, and we will always try to keep things as low-cost and accessible as possible. All six products that we ship right now are under $500. We’re going to try to keep it that cost-effective going forward.

Can you give us a preview of anything specific we might see at NAB? 

One thing we will be launching is a whole new package called MantraVR. MantraVR is all about stylization. I’m not comfortable saying too much, but things that you think are impossible right now in 360, MantraVR will do. We’re actually making it the focal point of our booth, which is going to be interactive so you can come up and play with some of this technology.

Anything else you want to cover?

I’d just ask that you link people to our blog, where they can read 15 months of curated success stories featuring these tools.