Planar-equirectangular image stitching

Abstract

360° cameras have become a convenient tool for people to record special moments or everyday life. The panoramic view they capture enables an immersive experience with a virtual reality (VR) headset, adding to viewer enjoyment. Nevertheless, they cannot match the angular resolution that a perspective camera can deliver. We propose a solution that places the perspective camera's planar image onto the relevant region of interest (ROI) of the 360° camera's equirectangular image through planar-equirectangular image stitching. The proposed method includes (1) a tangent image-based stitching pipeline to handle the spherical distortion of the equirectangular image, (2) a feature matching scheme to increase the number of correct feature matches, (3) ROI detection to locate the relevant ROI on the equirectangular image, and (4) human visual system (HVS)-based image alignment to mitigate parallax error. Qualitative and quantitative experiments on a collected dataset showed that the proposed planar-equirectangular image stitching improves over existing approaches: (1) less distortion in the stitching result, (2) a 29.0% increase in correct matches, (3) a 5.72° ROI position error from the ground truth, and (4) a lower aggregated alignment-distortion error than existing alignment approaches. We discuss possible improvement points and future research directions.
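The tangent-image idea in step (1) can be pictured as a gnomonic projection: the equirectangular ROI is reprojected onto a plane tangent to the sphere at the ROI centre, so the perspective image can then be matched and stitched with ordinary planar techniques. Below is a minimal sketch of that reprojection, not the authors' implementation; the function name equirect_to_tangent, the fixed 90° field of view, and the nearest-neighbour sampling are illustrative assumptions, while the inverse gnomonic formulas themselves are standard.

```python
import numpy as np

def equirect_to_tangent(equi, lon0, lat0, fov_deg=90.0, out_size=512):
    """Render a perspective (tangent-plane) view from an equirectangular
    image, centred on (lon0, lat0) given in radians.

    Sketch only: the tangent plane touches the sphere at the ROI centre,
    which removes the local spherical distortion before planar feature
    matching / stitching is applied.
    """
    H, W = equi.shape[:2]
    # Tangent-plane grid; half-extent of the plane is tan(fov/2).
    half = np.tan(np.radians(fov_deg) / 2.0)
    xs = np.linspace(-half, half, out_size)
    x, y = np.meshgrid(xs, -xs)                 # image y axis points down

    # Inverse gnomonic projection: plane coords -> (lat, lon) on the sphere.
    rho = np.sqrt(x**2 + y**2)
    c = np.arctan(rho)
    sin_c, cos_c = np.sin(c), np.cos(c)
    rho = np.where(rho == 0, 1e-12, rho)        # avoid division by zero at centre
    lat = np.arcsin(np.clip(
        cos_c * np.sin(lat0) + y * sin_c * np.cos(lat0) / rho, -1.0, 1.0))
    lon = lon0 + np.arctan2(
        x * sin_c,
        rho * np.cos(lat0) * cos_c - y * np.sin(lat0) * sin_c)

    # Spherical coords -> equirectangular pixel coords, nearest-neighbour sample.
    u = ((lon + np.pi) % (2 * np.pi)) / (2 * np.pi) * (W - 1)
    v = (np.pi / 2 - lat) / np.pi * (H - 1)
    return equi[np.clip(v, 0, H - 1).astype(int),
                np.clip(u, 0, W - 1).astype(int)]

# Usage: extract a 90° tangent view at the centre of a (synthetic) panorama.
pano = np.zeros((512, 1024, 3), dtype=np.uint8)
view = equirect_to_tangent(pano, lon0=0.0, lat0=0.0)
```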

Cite

APA

Syawaludin, M. F., Kim, S., & Hwang, J. I. (2021). Planar-equirectangular image stitching. Electronics (Switzerland), 10(9). https://doi.org/10.3390/electronics10091126
