Multi-exposure image fusion using propagated image filtering

Abstract

Image fusion is the process of combining multiple images of the same scene into a single high-quality image that contains more information than any of the individual inputs. In this paper, we propose a new spatial-domain fusion approach based on the propagated image filter. The proposed approach computes a weight map for every input image using the propagated image filter together with gradient-domain postprocessing; the propagated image filter uses a cumulative weight construction for its filtering operation. We show that the proposed approach achieves state-of-the-art results for multi-exposure fusion on a variety of indoor and outdoor natural static scenes with varying amounts of dynamic range.
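The abstract describes a weight-map-based fusion pipeline: a per-image weight map is computed with the propagated image filter and refined with gradient-domain postprocessing, and the exposures are then blended according to the normalised weights. The sketch below illustrates only the generic weighted-blending skeleton of such a pipeline, not the authors' method: the contrast and well-exposedness cues, the Gaussian smoothing standing in for the propagated image filter, and the file names are all assumptions made for illustration.

```python
# Minimal sketch of weight-map-based multi-exposure fusion.
# NOTE: the propagated image filter from the paper is NOT reproduced here;
# a Gaussian blur is a placeholder for the edge-aware weight refinement,
# and the contrast/exposedness cues are generic choices, not the authors'.
import numpy as np
import cv2


def weight_map(img):
    """Per-pixel weight from simple contrast and well-exposedness cues (assumed cues)."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
    contrast = np.abs(cv2.Laplacian(gray, cv2.CV_32F))              # local contrast cue
    exposedness = np.exp(-0.5 * ((gray - 0.5) ** 2) / (0.2 ** 2))   # favour mid-tone pixels
    return contrast * exposedness + 1e-12


def fuse_exposures(images, sigma=5):
    """Fuse a list of aligned LDR exposures with smoothed, normalised weight maps."""
    weights = []
    for img in images:
        w = weight_map(img)
        # Placeholder for the propagated-image-filter refinement step described in the paper.
        w = cv2.GaussianBlur(w, (0, 0), sigma)
        weights.append(w)
    weights = np.stack(weights, axis=0)
    weights /= weights.sum(axis=0, keepdims=True)                   # normalise across exposures
    stack = np.stack([img.astype(np.float32) for img in images], axis=0)
    fused = (weights[..., None] * stack).sum(axis=0)                # per-pixel weighted blend
    return np.clip(fused, 0, 255).astype(np.uint8)


if __name__ == "__main__":
    # Hypothetical file names; replace with an aligned exposure bracket.
    exposures = [cv2.imread(f) for f in ("under.jpg", "mid.jpg", "over.jpg")]
    cv2.imwrite("fused.jpg", fuse_exposures(exposures))
```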

Citation (APA)

Patel, D., Sonane, B., & Raman, S. (2017). Multi-exposure image fusion using propagated image filtering. In Advances in Intelligent Systems and Computing (Vol. 459 AISC, pp. 431–441). Springer Verlag. https://doi.org/10.1007/978-981-10-2104-6_39
