Entropy-based image fusion with joint sparse representation and rolling guidance filter

Abstract

Image fusion is a widely applicable technique used in many fields, such as medicine, remote sensing and surveillance. This paper introduces an image fusion method that combines multi-scale decomposition with joint sparse representation. First, joint sparse representation decomposes the two source images into a common image and two innovation images. Second, two initial weight maps are generated by filtering the two source images separately, and the final weight maps are obtained by joint bilateral filtering guided by the initial weight maps. Then, the innovation images are decomposed at multiple scales using the rolling guidance filter. Finally, the final weight maps are used to generate the fused innovation image, which is combined with the common image to produce the ultimate fused image. On average, our method achieves a mutual information (MI) of 5.3377, feature mutual information (FMI) of 0.5600, normalized weighted edge preservation value (QAB/F) of 0.6978 and nonlinear correlation information entropy (NCIE) of 0.8226. Compared with state-of-the-art methods, it performs better in both visual perception and objective quantification.
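The rolling guidance filter at the heart of the multi-scale decomposition step works by iterating a joint bilateral filter, with each iteration's output serving as the guidance image for the next: small structures are smoothed away first, while large edges are progressively recovered. A minimal 1-D sketch in pure Python illustrates the idea, together with the pixel-wise weighted fusion used to combine the innovation images (the function names, parameter values and toy signal below are illustrative assumptions, not the paper's implementation):

```python
import math

def joint_bilateral_1d(signal, guidance, radius=2, sigma_s=1.0, sigma_r=0.2):
    """Joint bilateral filter: spatial weights come from pixel distance,
    range weights come from intensity differences in the guidance signal."""
    out = []
    n = len(signal)
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            w_spatial = math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2))
            w_range = math.exp(-((guidance[i] - guidance[j]) ** 2) / (2 * sigma_r ** 2))
            num += w_spatial * w_range * signal[j]
            den += w_spatial * w_range
        out.append(num / den)
    return out

def rolling_guidance_1d(signal, iterations=4, radius=2, sigma_s=1.0, sigma_r=0.2):
    """Rolling guidance filter: start from a constant guidance (pure Gaussian
    smoothing removes small structures), then iterate the joint bilateral
    filter using the previous result as guidance to recover large edges."""
    guidance = [0.0] * len(signal)
    for _ in range(iterations):
        guidance = joint_bilateral_1d(signal, guidance, radius, sigma_s, sigma_r)
    return guidance

def fuse_weighted(a, b, weights):
    """Pixel-wise weighted fusion of two (innovation) signals."""
    return [w * x + (1.0 - w) * y for x, y, w in zip(a, b, weights)]

# Toy signal: a low-amplitude ripple on each side of a large step edge.
sig = [0.0, 0.05, 0.0, 0.05, 0.0, 1.0, 1.05, 1.0, 1.05, 1.0]
smoothed = rolling_guidance_1d(sig)
```

After a few iterations the ripple is flattened while the step edge between the two halves stays sharp; the same scheme extends to 2-D images by summing over a square neighborhood instead of a 1-D window.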

Citation (APA)

Liu, Y., Yang, X., Zhang, R., Albertini, M. K., Celik, T., & Jeon, G. (2020). Entropy-based image fusion with joint sparse representation and rolling guidance filter. Entropy, 22(1), 118. https://doi.org/10.3390/e22010118
