Zero-Shot Learning for Reflection Removal of Single 360-Degree Image

Abstract

Existing methods for reflection removal mainly focus on removing blurry and weak reflection artifacts and thus often fail on severe and strong ones. However, in many cases, real reflection artifacts are sharp and intense enough that even humans cannot completely distinguish between the transmitted and reflected scenes. In this paper, we attempt to remove such challenging reflection artifacts using 360-degree images. We adopt a zero-shot learning scheme to avoid the burden of collecting paired data for supervised learning and the domain gap between different datasets. We first search for a reference image of the reflected scene within the 360-degree image based on the reflection geometry, which is then used to guide the network to restore the faithful colors of the reflection image. We collect a test set of 30 360-degree images exhibiting challenging reflection artifacts and demonstrate that the proposed method outperforms existing state-of-the-art methods on 360-degree images.
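
To make the zero-shot idea concrete, below is a minimal sketch of per-image optimization under an additive I = T + R image model: a small untrained CNN is fitted to the single glass-region image, with an L1 color-guidance term toward a reference view of the reflected scene. The network architecture, loss terms, weights, and the assumption that the reference is already geometrically aligned are all illustrative assumptions, not the authors' implementation; in the paper the reference is first located via reflection geometry in the 360-degree image, a search step this sketch skips.

```python
import torch
import torch.nn as nn

class LayerNet(nn.Module):
    """Small CNN that predicts transmission and reflection layers (illustrative)."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 6, 3, padding=1), nn.Sigmoid(),  # 3 channels T + 3 channels R
        )

    def forward(self, x):
        out = self.body(x)
        return out[:, :3], out[:, 3:]  # transmission, reflection


def remove_reflection(glass_img, ref_img, iters=500, lam=1.0):
    """Zero-shot optimization on a single image.

    glass_img: observed glass region with reflections, shape (1, 3, H, W), values in [0, 1]
    ref_img:   reference view of the reflected scene found elsewhere in the
               360-degree image (assumed here to be already warped and aligned)
    """
    net = LayerNet()
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(iters):
        T, R = net(glass_img)
        loss = (T + R - glass_img).abs().mean()          # reconstruction of the input
        loss = loss + lam * (R - ref_img).abs().mean()   # reference-guided color term
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        T, _ = net(glass_img)
    return T  # estimated reflection-free transmission layer
```

Because the network is optimized only on the input image itself, no paired training data is needed, which is the practical appeal of the zero-shot setting described in the abstract.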

Cite

APA

Han, B. J., & Sim, J. Y. (2022). Zero-Shot Learning for Reflection Removal of Single 360-Degree Image. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13679 LNCS, pp. 533–548). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-19800-7_31
