Hierarchical Integrated Color Matching in a Stereoscopic Image Based on Image Decomposition


Abstract

Color discrepancies between the left and right images of a stereoscopic pair cause many problems, including a weakened three-dimensional effect and increased visual fatigue, so color matching is important for three-dimensional display systems. A hierarchical integrated color matching method based on image decomposition is therefore proposed for stereoscopic images. In the proposed method, global and local color discrepancies in a stereoscopic image are effectively reduced by histogram matching and illuminant estimation applied to decomposed image layers. The stereoscopic image is first decomposed into a base layer and several texture layers. Each decomposed layer is then matched using cumulative histogram matching and a multi-scale retinex algorithm. Lastly, inverse decomposition is applied to the corrected layers to reconstruct the color-matched stereoscopic image. Experimental results show that the proposed method outperforms previous methods in color matching performance.
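The cumulative histogram matching step applied to each decomposed layer can be sketched as follows. This is a generic illustration of the technique, not the authors' implementation; the function name and test data are my own, and the paper applies this per layer of the left/right pair rather than to whole images.

```python
import numpy as np

def match_cumulative_histogram(source, reference, nbins=256):
    """Remap `source` intensities so their cumulative histogram (CDF)
    matches that of `reference`. A standard building block for reducing
    global color discrepancy between a stereoscopic image pair."""
    # Histograms over integer intensities 0..nbins-1 (bin width 1).
    src_hist, _ = np.histogram(source, bins=nbins, range=(0, nbins))
    ref_hist, _ = np.histogram(reference, bins=nbins, range=(0, nbins))
    # Normalized cumulative distributions.
    src_cdf = np.cumsum(src_hist) / source.size
    ref_cdf = np.cumsum(ref_hist) / reference.size
    # For each source intensity, find the reference intensity whose
    # CDF value first reaches the source's CDF value.
    mapping = np.searchsorted(ref_cdf, src_cdf, side="left")
    mapping = np.clip(mapping, 0, nbins - 1)
    # Look up the mapping for every pixel of the source image.
    return mapping[source.astype(np.intp)]
```

For example, matching a uniform gray-level-50 patch against a uniform gray-level-100 reference remaps every pixel to 100; in the stereoscopic setting, one view's layer would serve as `reference` for the other.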

Citation (APA)

Ha, H. G., Subhashdas, S. K., Choi, B. S., & Ha, Y. H. (2015). Hierarchical Integrated Color Matching in a Stereoscopic Image Based on Image Decomposition. In Final Program and Proceedings - IS and T/SID Color Imaging Conference (Vol. 2015-January, pp. 29–35). Society for Imaging Science and Technology. https://doi.org/10.2352/J.ImagingSci.Technol.2015.59.3.030402
