Australian Post House Pushes High-Res, High Frame Rate Footage Through DaVinci Resolve Studio

Discovery Channel has gotten into immersive video in a serious way this year, with iOS and Android smartphone apps and a new VR website delivering 360-degree experiences. One of the first shows generating 360-degree content is Mythbusters, which shot underwater shark footage for August's "Mythbusters vs. Jaws" episode back in May.

Mythbusters shoots both static and POV VR footage with six- or seven-camera rigs, capturing images in a 4:3 aspect ratio at 80fps. Why 80fps? It has to do with the peculiar nature of multi-camera VR footage: the individual camera views have to be stitched together into a single 4096×2048 MP4 file with a 2:1 aspect ratio. Once the footage is stitched, the seams must not be obvious, and a higher frame rate helps with that.
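The 2:1 aspect ratio follows directly from the geometry of a full-sphere equirectangular panorama, which maps 360 degrees of horizontal view against 180 degrees of vertical view. A quick back-of-the-envelope sketch using the article's output size (illustrative math only):

```python
# Why a full 360 equirectangular frame is 2:1, and what angular
# resolution the 4096x2048 delivery size works out to.
H_FOV_DEG = 360   # full horizontal sweep
V_FOV_DEG = 180   # pole to pole
width, height = 4096, 2048

aspect = width / height
assert aspect == H_FOV_DEG / V_FOV_DEG  # 2:1 falls out of the field of view

px_per_degree = width / H_FOV_DEG
print(f"aspect {aspect:.0f}:1, {px_per_degree:.1f} px per degree")
# -> aspect 2:1, 11.4 px per degree at the equator
```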

"The rigs we use currently don't have genlocked cameras," said Anthony Toy, director of post-production at Beyond Productions. "What this means in practice is that any one camera can be out of sync by up to half a frame, and this can be noticeable on the stitch line if there is a lot of fast motion." As Toy noted, half a frame at 30fps is .017 seconds, while half a frame at 80fps is just .006 seconds.
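Toy's arithmetic is easy to verify: with free-running (non-genlocked) cameras, the worst-case offset between any two cameras is half a frame period, so the error window shrinks in direct proportion to the frame rate.

```python
# Worst-case temporal offset between non-genlocked cameras is half a
# frame period; shooting faster shrinks the window the stitch must hide.
def half_frame_offset(fps: float) -> float:
    """Maximum sync error in seconds for free-running cameras."""
    return 0.5 / fps

for fps in (30, 80):
    print(f"{fps} fps: up to {half_frame_offset(fps):.3f} s out of sync")
# 30 fps: up to 0.017 s out of sync
# 80 fps: up to 0.006 s out of sync
```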

"As far as synchronization is concerned, 80fps reduces the scale of error," explained Ambar Sidhwani, junior editor and virtual reality editor at Mythbusters. "As far as calibration goes, the more accurate the control points to match, the more accurate the calibration process. Accurate control points are achieved with more frames available. So in essence, shooting in 80p gives us more information per second for all the processes used in stitching the image."

4K footage captured at 80fps isn't exactly a common post-production format, so Beyond needed grading and finishing tools that could handle different aspect ratios and frame rates natively, without transcoding. DaVinci Resolve Studio fit the bill, handling the frame-rate conversions smoothly. Since the final deliverables are at 30p, the team could either keep a slo-mo look (80fps footage slowed to 30p) or accelerate the footage back to real time in Resolve.
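Those two retime options amount to simple frame-index math. The sketch below is purely illustrative of the arithmetic, not of Resolve's actual retime engine:

```python
# Two ways to deliver 80 fps material at 30p (hypothetical math only).
SRC_FPS, DST_FPS = 80, 30

# Option 1: conform every source frame to the 30p timeline -> slow motion.
slowdown = SRC_FPS / DST_FPS
print(f"slo-mo: plays {slowdown:.2f}x slower ({DST_FPS / SRC_FPS:.1%} speed)")
# -> slo-mo: plays 2.67x slower (37.5% speed)

# Option 2: retime to real time by sampling the nearest 80 fps source
# frame for each 30p output frame (most source frames get dropped).
def source_frame(out_frame: int) -> int:
    return round(out_frame * SRC_FPS / DST_FPS)

print([source_frame(i) for i in range(6)])
# first six 30p frames pull source frames [0, 3, 5, 8, 11, 13]
```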

"If you're underwater, you may not need slo-mo," said Sidhwani. "However, if you're up on a drone with a shaky gimbal, those extra frames to create a more smooth motion will go a long way."

"If slow motion isn't appropriate, then we simply speed the footage back up to play in real time," added Toy. "Ultimately, it would be great to have the option to play back smoothly at 80p or 60p HFR if [future] VR playback devices are up to it, as this creates a more real experience with smoother motion — similar in principle to using HFR for 3D films."

In addition to maintaining that oddball frame rate, Resolve offered a full complement of powerful color-grading tools to restore vibrancy to an image that had been flattened somewhat by the stitching software, which seeks to balance color across all of the images. "The original footage is from a compressed 8-bit source, so adding saturation and contrast to the image really brings out the compression artifacts and banding," said Beyond Senior Online Editor Michael Graham in a prepared statement. "A combination of working in a 32-bit environment in Resolve, the de-noise filter, and other elements of the software allows us to push the image a lot further and really work it to achieve the look we are after without having the image look too processed."
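A rough way to see Graham's point about banding: if the image is flattened and then re-graded while quantized to 8 bits at each step, adjacent tonal levels collapse and gaps open up; keeping the intermediate math in floating point and quantizing only once preserves them. A toy ramp demonstrating the effect (illustrative values only, not Resolve's actual pipeline):

```python
# 8-bit vs float grading pipelines: quantizing between a flattening
# step and a contrast boost destroys tonal levels (banding).
def quantize(x: float) -> int:
    return max(0, min(255, round(x)))

def flatten(code: float) -> float:   # e.g. stitching balances/flattens
    return (code - 128) * 0.5 + 128

def boost(code: float) -> float:     # grading restores the contrast
    return (code - 128) * 2.0 + 128

ramp = list(range(64, 192))          # 128 distinct 8-bit input codes

# 8-bit pipeline: quantize between steps -> levels collapse, gaps appear
banded = {quantize(boost(quantize(flatten(c)))) for c in ramp}

# float pipeline: quantize once at the end -> every level survives
clean = {quantize(boost(flatten(c))) for c in ramp}

print(len(banded), "vs", len(clean), "distinct output levels")
# the float path keeps all 128 levels; the 8-bit path loses roughly half
```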

Color-grading took place on a computer monitor since the file resolution doesn't conform to any broadcast standard, Toy said, noting that Resolve's built-in scopes helped the team keep the image within specs.