Image fusion is the process of combining multiple images of the same scene into a single high-quality image that contains more information than any of the input images. In this paper, we propose a new spatial-domain fusion approach based on the propagated image filter. The proposed approach computes a weight map for every input image using the propagated image filter followed by gradient-domain postprocessing. The propagated image filter exploits a cumulative weight construction for its filtering operation. We show that the proposed approach achieves state-of-the-art results for multi-exposure fusion on a variety of indoor and outdoor natural static scenes with varying amounts of dynamic range.
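To make the weight-map idea concrete, here is a minimal sketch of weight-map-based multi-exposure fusion. It is not the paper's method: the weights below come from a simple well-exposedness Gaussian (pixels near mid-gray score high), standing in for the propagated-image-filter weights and gradient-domain postprocessing described in the abstract, and the `fuse_exposures` function and `sigma` parameter are illustrative assumptions.

```python
import numpy as np

def fuse_exposures(images, sigma=0.2):
    """Fuse a stack of exposures via per-pixel weight maps.

    Simplified illustration only: each pixel's weight is a Gaussian
    of its distance from mid-intensity 0.5 (a well-exposedness cue),
    not the propagated-image-filter weights used in the paper.
    Inputs are grayscale arrays with intensities in [0, 1].
    """
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in images])  # (N, H, W)
    # Weight each pixel by its closeness to mid-intensity 0.5
    weights = np.exp(-((stack - 0.5) ** 2) / (2.0 * sigma ** 2))
    # Normalize so the weights sum to 1 across the N exposures at each pixel
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12
    # The fused image is the per-pixel weighted average of the exposures
    return (weights * stack).sum(axis=0)
```

With an under-exposed and an over-exposed input that are symmetric about mid-gray, the normalized weights are equal and the fused result lands at mid-intensity; when one input is well exposed, its weight dominates and pulls the fused pixel toward that input.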
Citation:
Patel, D., Sonane, B., & Raman, S. (2017). Multi-exposure image fusion using propagated image filtering. In Advances in Intelligent Systems and Computing (Vol. 459 AISC, pp. 431–441). Springer Verlag. https://doi.org/10.1007/978-981-10-2104-6_39