Ross Shain

The Oscar-winning planar-tracking technology developed by Imagineer Systems has made Mocha Pro a workhorse for tracking and compositing tasks, including image stabilization and object removal. The software’s utility has only grown along with the capabilities of standard desktop computing hardware. When stereo 3D conversions became a new, labor-intensive factor in blockbuster film releases, Mocha was optimized for stereo-conversion work. And now, with the industry’s intense interest in immersive 360-degree video and VR experiences, Imagineer and parent company Boris FX have launched Mocha VR, which is tuned for 360-degree post-production workflows. The software was released earlier this year, but many users will see it working for the first time in demos at NAB 2017 later this month. We asked Imagineer Systems CMO/CCO Ross Shain to fill us in on what went into the new VR toolset and how customers are using it.

StudioDaily: What’s the primary challenge of getting your tools to work in a 360/VR environment — is it just allowing people to visualize and work inside the 3D scene, or is it more making sure the tools are working across seam lines [from multiple cameras]? What all goes into that effort?

Ross Shain: It’s both of those things. Also, the native algorithms for tracking and object removal need to support the equirectangular workspace. Equirectangular pixels are very distorted at the top and bottom. Traditional point-tracking doesn’t work that well there, and rotoscoping and painting on distorted views are challenging. If you’re a VFX artist and you’re trying to remove the camera at the nadir of the scene, it looks very wonky in equirectangular: it’s stretched all over the bottom of the image. You have to take a wild guess at what it’s going to look like.

Basically, it’s Antarctica [in an equirectangular or Mercator world map projection] down there.

Yeah, the pixels look huge, but you’re actually working on a very small area. The concepts of 360 and panoramic images have been around for a while in still photography, and [before Mocha VR] some customers were employing those methods with open-source tools like PT [Panorama Tools] and Hugin. Those tools let them create STMaps, which are essentially distortion maps, and they would unfold those distorted pixels, move them to the center, and then flatten them and paint out the camera rig. Then they would redistort them and move them back. We said, “You know, if Mocha could actually support both of those views, and the tools worked in either a distorted view or a flat view, the artist could pan around within a sphere to do tracking and masking and then flip back and forth between the views.” We always had a lens module inside Mocha — even back in the day — so we adapted the lens module for equirectangular-to-rectilinear translations. It’s a very powerful workflow. Everyone wants to create an immersive experience and move the camera around. Having a native tracker and native masking tools that actually work in the proper distorted space is a real time-saver.

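To make that flat-view workflow concrete, here is a minimal sketch of an equirectangular-to-rectilinear translation, in the spirit of what Shain describes but not Imagineer’s actual code. It renders a flat virtual view (aimed at the nadir, say, to paint out the rig) by casting a ray through each output pixel and sampling the matching equirectangular pixel; the function and parameter names are illustrative.

```python
import numpy as np

def rectilinear_view(equi, fov_deg, out_w, out_h, yaw=0.0, pitch=0.0):
    """Sample a flat (rectilinear) view out of an equirectangular image.

    equi is an H x W x C array; yaw and pitch (radians) aim the virtual
    camera (pitch = -pi/2 looks straight down at the nadir, where the
    rig sits). Nearest-neighbor sampling keeps the sketch short.
    """
    h, w = equi.shape[:2]
    f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length in pixels

    # One ray per output pixel of a pinhole camera looking down +z (y is down)
    x, y = np.meshgrid(np.arange(out_w) - out_w / 2 + 0.5,
                       np.arange(out_h) - out_h / 2 + 0.5)
    rays = np.stack([x, y, np.full_like(x, f)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the rays by pitch (about x), then yaw (about y)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rays = rays @ (ry @ rx).T

    # Rays to spherical coordinates, then to equirectangular pixel positions
    lon = np.arctan2(rays[..., 0], rays[..., 2])   # -pi..pi across the width
    lat = np.arcsin(np.clip(rays[..., 1], -1, 1))  # -pi/2..pi/2 top to bottom
    u = ((lon / (2 * np.pi) + 0.5) * w).astype(int) % w
    v = np.clip(((lat / np.pi + 0.5) * h).astype(int), 0, h - 1)
    return equi[v, u]
```

Painting in the flat view and pushing the result back through the inverse of the same mapping is the “redistort them and move them back” step Shain mentions.
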
In addition to tracking and roto, object removal is a powerful and unique thing. It’s similar to Photoshop’s content-aware tools that analyze pixels from other parts of the frame, but ours also uses frames across time. Say you’re looking straight down and want to remove a camera moving across a patterned floor that has a lot of detail. In some workflows, the artist would just XY-clone from some offset pixels, but this looks for clean areas in time, before or after those frames, where the camera is not in front of those pixels, and auto-patches them in.

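As a rough illustration of that temporal idea, the sketch below fills each masked pixel from the nearest frame in time where that pixel is clean. It is a crude stand-in, not Mocha’s algorithm, which also compensates for motion and lighting between frames; the frames are assumed to be already registered, and the names are made up.

```python
import numpy as np

def temporal_patch(frames, masks):
    """Fill masked pixels from the nearest frame in time where they are clean.

    frames: list of H x W x C arrays (assumed already registered/stabilized).
    masks:  list of H x W bool arrays, True where the unwanted object is.
    A simplified stand-in for a temporal remove tool; real tools also
    compensate for camera motion and lighting changes between frames.
    """
    out = [f.copy() for f in frames]
    n = len(frames)
    for i in range(n):
        holes = masks[i]
        if not holes.any():
            continue
        for d in range(1, n):  # search outward in time for donor pixels
            for j in (i - d, i + d):
                if 0 <= j < n:
                    donor = holes & ~masks[j]   # clean in frame j
                    out[i][donor] = frames[j][donor]
                    holes = holes & masks[j]    # still unfilled
            if not holes.any():
                break
    return out
```
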
And by analyzing the whole clip, its algorithm is smart enough to know that it has to look for the background as opposed to something occluding the background.

Exactly. It can take a long time for a VFX artist to train up on painting and cloning and tracking. But we can train editors, who aren’t coming from high-end VFX, on how to do object removal pretty easily in Mocha. And this is very interesting in the 360 world — our traditional customers are VFX and motion-graphics artists, but we’re meeting a lot of people coming to it from the camera side, or just out of an interest in creating 360 content. It’s important for us to create easy-to-learn workflows for them.

You mentioned stereo 3D earlier, and when I used to talk to people establishing those 3D conversion workflows, they’d say, “Well, the thing about 3D is that all of a sudden every shot in your film is a visual effect.”

Exactly.

It’s the same thing with a good 360 experience. Every shot becomes a special effect. You can’t get — well, maybe sometimes you can — but often you can’t get a good 360-degree shot in camera. A bad 360 shot can make people sick. And generally there are compromises you make [on set] that you’re going to want to fix to make the finished experience more absorbing.

Absolutely. A lot of the people we talk to who use our software talk about how important pre-production is. Every single person on the crew needs to hide, or they need to stand in a place where it will be relatively easy to paint them out. There are issues with lighting and shadows. DPs who are used to moving the camera in a certain way have to learn new habits. Stitching becomes an issue. Objects that come close to the camera from certain angles can be a real challenge for stitching. There are lots of technical considerations just about how the camera is placed and how things are framed.

Will you be doing stitching in Mocha VR?

We looked at stitching, and we thought we had some interesting tools that could be applied. But I personally think stitching is something that will be covered more and more by the camera technology. We’ve seen it on the low-end consumer cameras — just a two-camera system as opposed to a six-camera system — but I don’t think stitching, over the long term, is going to be the biggest hurdle. One of our customers, Koncept VR, uses Mocha a little bit to fix the stitching. They’ll do a pass or two of stitching in GoPro Kolor, but say someone on screen is moving diagonally toward the camera. You’ll see them ghosting as they go across a stitch. This customer is doing a lot of rotoscoping in Mocha, and they’ll get a clean area from either side of the stitch from the original camera so they can rotoscope across it, replacing a person as they go across the stitch so it’s not as noticeable.

So the tools can clean up some artifacting after the stitching takes place.

Yeah. We didn’t design it for this purpose, but because Mocha is a workhorse VFX tool used for all kinds of clean-up, people are employing it to fix stitching.

The last thing to mention is that — and you brought this up earlier — a lot of the negatives in 360 are nausea-inducing experiences. When people have a bad first-time experience with a headset, they’re never going to want to put it back on. And a lot of filmmakers are moving the camera around on drones, on rigs, and on cars, so jittery footage can be a real problem. That was one of the big things customers told us when we were in beta: they were looking for new ways to stabilize. So we came up with something we call Horizon Stabilization. We use the planar tracker to track an area on the horizon. A lot of times things on the horizon are out of focus or partially occluded, so it takes advantage of the power of the planar tracker; we might track some clouds or some area on the horizon. And then we stabilize the relationship between the horizon motion and the camera itself. It’s cool because in 360 you have access to all the pixels. In traditional stabilization you end up having to scale the image if you’ve locked something down. But in 360, we use the seamless pixels to stabilize the motion. We’ve had a lot of great feedback on this.

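The “all the pixels” point is the key property: a 360 frame contains every viewing direction, so shake can be countered by rotating the sphere, which is a pure remap with no crop or zoom. Below is a minimal, hypothetical sketch of that counter-rotation step; the per-frame yaw, pitch, and roll would come from something like the horizon track Shain describes.

```python
import numpy as np

def rotate_equirect(equi, yaw, pitch, roll):
    """Remap a 360 frame through a rotation; nothing is cropped or scaled.

    Pass the rotation that levels the frame (i.e., the inverse of the
    tracked camera rotation). Angles are in radians; y points down.
    """
    h, w = equi.shape[:2]
    # Direction vector for every output pixel
    u, v = np.meshgrid(np.arange(w) + 0.5, np.arange(h) + 0.5)
    lon = (u / w - 0.5) * 2 * np.pi
    lat = (v / h - 0.5) * np.pi
    d = np.stack([np.cos(lat) * np.sin(lon),
                  np.sin(lat),
                  np.cos(lat) * np.cos(lon)], axis=-1)

    # Compose roll (about z), pitch (about x), yaw (about y) and apply
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    d = d @ (rz @ rx @ ry).T

    # Back to equirectangular source coordinates, then sample
    src_lon = np.arctan2(d[..., 0], d[..., 2])
    src_lat = np.arcsin(np.clip(d[..., 1], -1, 1))
    su = ((src_lon / (2 * np.pi) + 0.5) * w).astype(int) % w
    sv = np.clip(((src_lat / np.pi + 0.5) * h).astype(int), 0, h - 1)
    return equi[sv, su]
```
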
If the shot is really bumpy, it seems like you’d still have some issues with parallax or other effects you couldn’t account for entirely. 

If the camera’s super-jittery you’ll get a lot of motion blur and all kinds of distorted pixels. But as long as we can track something, we can employ a smoothing effect where the motion might still be there but you’re reducing the high-frequency jitter. Stabilization is better handled by a gyroscope or something hardware-based on the camera rig itself, but a lot of productions don’t have access to those tools.

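That smoothing-rather-than-locking idea can be sketched simply: low-pass filter the tracked rotation curve and correct each frame only by the difference, so deliberate camera moves survive while the high-frequency jitter is subtracted out. The moving-average filter and the (N, 3) yaw/pitch/roll layout below are assumptions for illustration; each row of the result could drive a counter-rotation remap like the one sketched above.

```python
import numpy as np

def smooth_corrections(angles, window=15):
    """Per-frame corrections that remove only high-frequency jitter.

    angles: (N, 3) array of tracked yaw/pitch/roll per frame, in radians.
    Returns smoothed path minus raw path: the small rotation to apply to
    each frame. Small-angle assumption; yaw wraparound at +/-pi ignored.
    """
    assert window % 2 == 1, "use an odd window so output length matches input"
    angles = np.asarray(angles, dtype=float)
    kernel = np.ones(window) / window
    pad = window // 2
    smoothed = np.empty_like(angles)
    for axis in range(angles.shape[1]):
        padded = np.pad(angles[:, axis], pad, mode='edge')
        smoothed[:, axis] = np.convolve(padded, kernel, mode='valid')
    return smoothed - angles
```
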
What kind of presence is Mocha VR going to have at NAB?

Mocha VR just shipped, and because it’s a plug-in it’s bringing 360 tools to a lot of hosts that don’t necessarily have native 360 tools. So we’re the first and only 360 tool for Avid Media Composer. We’ve been working closely with the Avid team, and they’re very excited about it. So we’ll be at the Avid Connect event [April 22-23 at the Wynn Las Vegas]. Premiere Pro is the editing system most of our customers are using for 360, and then a lot of them are going into After Effects or Nuke for their VFX. The ability to do some of the tasks — stabilize the horizon, do masking and tracking, and remove objects — in Premiere is a huge thing and a big time-saver.

Less expensive than having a Nuke artist do it, too.

Nuke with Cara VR is an incredible system, but there’s definitely an entry fee. So Premiere is a big host for Mocha VR, and we also support OFX, which lets us support Blackmagic Resolve and Fusion, Magix Vegas Pro and other OFX hosts.

And there are some things we can’t talk about right now, but we’re hoping to make interesting announcements. Our parent company, Boris FX, owns Sapphire, so we’re sitting on a lot of technology that could be optimized for 360 as this field grows — a lot of tools built specifically for editors, like transitions, titling, and all of the lighting effects Boris and Sapphire are famous for. I’m not going to promise anything now, but we may be showing some other interesting 360 features at NAB.

Watch video demos of Mocha VR’s 360-degree feature set at imagineersystems.com.