Efficient image retrieval via feature fusion and adaptive weighting

Abstract

In content-based image retrieval (CBIR), a single feature describes only a specific aspect of image content, so false-positive matches are frequently returned among the candidate results, lowering precision and recall. For example, the widely used SIFT feature depicts only the local gradient distribution within regions of interest of gray-scale images, lacks color information, and tends to yield limited retrieval performance. To tackle these problems, we propose a feature fusion method that integrates multiple diverse image features to capture more complementary image information. Furthermore, to reflect the differing discriminative power of the individual features, we propose a dynamically updated Adaptive Weights Allocation Algorithm (AWAA) that allocates fusion weights in proportion to each feature's contribution to matching. Extensive experiments on several benchmark datasets demonstrate that image retrieval based on feature fusion combined with adaptive weighting yields more accurate and robust results than conventional retrieval schemes.
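The abstract does not specify how AWAA computes its weights, so the following is only a minimal sketch of the general idea it describes: fusion weights kept proportional to each feature channel's matching contribution and updated dynamically. All function names, the momentum-style update, and the scoring scheme are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch of adaptive weight allocation for feature fusion.
# Assumptions: each feature channel (e.g., a local-gradient descriptor and a
# color feature) yields a similarity score per candidate image; weights are
# set proportional to each channel's contribution and renormalized when updated.

import numpy as np

def adaptive_weights(match_scores, prev_weights=None, momentum=0.5):
    """Allocate fusion weights proportional to per-feature matching scores."""
    scores = np.asarray(match_scores, dtype=float)
    contrib = scores / (scores.sum() + 1e-12)  # relative contribution of each feature
    if prev_weights is None:
        return contrib
    # Dynamic update: blend previous weights with current contributions.
    weights = momentum * np.asarray(prev_weights, dtype=float) + (1 - momentum) * contrib
    return weights / weights.sum()

def fused_similarity(per_feature_sims, weights):
    """Weighted sum of per-feature similarities for one candidate image."""
    return float(np.dot(weights, per_feature_sims))

# Toy usage: two feature channels, three candidate images.
sims = np.array([[0.9, 0.4],   # candidate 1: [channel-A sim, channel-B sim]
                 [0.5, 0.8],
                 [0.2, 0.3]])
w = adaptive_weights(match_scores=sims.mean(axis=0))  # mean score as a stand-in for contribution
ranking = np.argsort([-fused_similarity(s, w) for s in sims])
print("weights:", w, "ranking:", ranking)
```

In this sketch the fused score is simply a convex combination of the per-channel similarities; the paper's actual update rule and matching-contribution measure may differ.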

Citation (APA)

Shi, X., Guo, Z., & Zhang, D. (2016). Efficient image retrieval via feature fusion and adaptive weighting. In Communications in Computer and Information Science (Vol. 663, pp. 259–273). Springer Verlag. https://doi.org/10.1007/978-981-10-3005-5_22
