Joint Stabilization and Direction of 360° Videos

Published In

ACM Transactions on Graphics

Document Type

Citation

Publication Date

1-2018

Abstract

360° video provides an immersive experience for viewers, allowing them to freely explore the world by turning their head. However, creating high-quality 360° video content can be challenging, as viewers may miss important events by looking in the wrong direction, or they may see things that ruin the immersion, such as stitching artifacts and the film crew. We take advantage of the fact that not all directions are equally likely to be observed; most viewers are more likely to see content located at “true north”, i.e., in front of them, due to ergonomic constraints. We therefore propose 360° video direction, where the video is jointly optimized to orient important events to the front of the viewer and visual clutter behind them, while producing smooth camera motion. Unlike traditional video, viewers can still explore the space as desired, but with the knowledge that the most important content is likely to be in front of them. Constraints can be user guided, either added directly on the equirectangular projection or by recording “guidance” viewing directions while watching the video in a VR headset, or automatically computed, such as via visual saliency or forward motion direction. To accomplish this, we propose a new motion estimation technique specifically designed for 360° video, which outperforms the commonly used 5-point algorithm on wide-angle video. We additionally formulate the direction problem as an optimization in which a novel parametrization of spherical warping allows us to correct for some degree of parallax effects. We compare our approach to recent methods that address stabilization only, as well as methods that convert 360° video to narrow field-of-view video. Our pipeline can also enable the viewing of wide-angle non-360° footage in a spherical 360° space, giving an immersive “virtual cinema” experience for a wide range of existing content filmed with first-person cameras.
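The core operation the abstract describes is re-orienting a 360° video so that chosen content sits at the viewer's “true north”. The sketch below is a minimal illustration of that final resampling step only, assuming frames are equirectangular H x W x 3 NumPy arrays; the function names, the look-at rotation construction, and the nearest-neighbour sampling are assumptions made for illustration and are not the authors' pipeline, which obtains per-frame orientations from a joint optimization over guidance constraints, motion smoothness, and a parallax-correcting spherical warp.

```python
# Illustrative sketch (hypothetical helpers, not the paper's implementation):
# rotate an equirectangular frame so a chosen direction moves to the front.
import numpy as np


def look_at_rotation(target_lon, target_lat):
    """Rotation matrix mapping the front direction (0, 0, 1) onto the
    target viewing direction given by longitude/latitude in radians."""
    forward = np.array([np.cos(target_lat) * np.sin(target_lon),
                        np.sin(target_lat),
                        np.cos(target_lat) * np.cos(target_lon)])
    right = np.cross([0.0, 1.0, 0.0], forward)
    right /= np.linalg.norm(right)   # undefined at the poles; ignored in this sketch
    up = np.cross(forward, right)
    return np.stack([right, up, forward], axis=1)


def reorient_equirect(frame, rotation):
    """Resample an equirectangular frame (H x W x 3) under a global sphere
    rotation, so the rotated target direction lands at the image centre."""
    h, w = frame.shape[:2]
    # Longitude/latitude of every output pixel centre.
    lon = (np.arange(w) + 0.5) / w * 2.0 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Unit viewing direction per output pixel.
    dirs = np.stack([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)], axis=-1)
    # Rotate output directions into the source frame: src = R @ d for each d.
    src = dirs @ rotation.T
    src_lon = np.arctan2(src[..., 0], src[..., 2])
    src_lat = np.arcsin(np.clip(src[..., 1], -1.0, 1.0))
    # Nearest-neighbour lookup into the source frame (bilinear would be smoother).
    x = ((src_lon + np.pi) / (2.0 * np.pi) * w).astype(int) % w
    y = ((np.pi / 2 - src_lat) / np.pi * h).astype(int).clip(0, h - 1)
    return frame[y, x]


# Example: bring content at longitude 60°, latitude 10° to the front of the view.
# fronted = reorient_equirect(frame, look_at_rotation(np.radians(60), np.radians(10)))
```

In practice the per-frame target directions would vary smoothly over time (in the paper they come from the joint stabilization-and-direction optimization); applying an independent rotation per frame as above would ignore that temporal smoothness.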

Description

© 2018 Association for Computing Machinery

Locate the Document

https://arxiv.org/pdf/1901.04161

Persistent Identifier

https://archives.pdx.edu/ds/psu/29349
