Very large-scale image retrieval based on local features

Abstract

Traditional image retrieval techniques are pixel-sensitive and offer low fault tolerance. To overcome this deficiency, this paper proposes a novel method for large-scale image retrieval that is especially suitable for images affected by various interferences, such as rotation, pixel loss, and watermarks. First, local features are extracted from the images to build a weighted visual dictionary, a new data structure developed from the bag-of-words model. During retrieval, every feature extracted from the target image is looked up in the dictionary, and the matched weights are accumulated into a single weight list from which the result is obtained. We demonstrate the effectiveness of the approach on the Corel image set and an online image set from eBay. © 2012 Springer-Verlag.
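The abstract only outlines the pipeline, so the following Python sketch is one possible reading of it: local descriptors are quantized against a k-means vocabulary, an inverted index keeps a per-image weight for each visual word, and at query time the weights of matched words are summed into a single ranking list. The function names, the choice of plain k-means, and the unit weights are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of weighted-visual-dictionary retrieval (assumed reading of
# the abstract): quantize local descriptors to visual words, accumulate
# per-image weights in an inverted index, and rank by summed weights.
import numpy as np
from collections import defaultdict


def build_vocabulary(descriptors, k, iters=20, seed=0):
    """Cluster training descriptors into k visual words (plain k-means)."""
    descriptors = np.asarray(descriptors, dtype=float)
    rng = np.random.default_rng(seed)
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)].copy()
    for _ in range(iters):
        # assign each descriptor to its nearest center, then recompute centers
        dists = np.linalg.norm(descriptors[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = descriptors[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers


def quantize(descriptors, centers):
    """Map each descriptor to the index of its nearest visual word."""
    descriptors = np.asarray(descriptors, dtype=float)
    dists = np.linalg.norm(descriptors[:, None, :] - centers[None, :, :], axis=2)
    return dists.argmin(axis=1)


class WeightedVisualDictionary:
    """Inverted index: visual word -> {image_id: accumulated weight}."""

    def __init__(self, centers):
        self.centers = centers
        self.index = defaultdict(lambda: defaultdict(float))

    def add_image(self, image_id, descriptors):
        for word in quantize(descriptors, self.centers):
            # each occurrence contributes weight 1 here; the paper's actual
            # weighting scheme is not specified in the abstract
            self.index[word][image_id] += 1.0

    def query(self, descriptors, top_k=5):
        # accumulate the weights of all matched words into a single list,
        # then return the highest-scoring database images
        scores = defaultdict(float)
        for word in quantize(descriptors, self.centers):
            for image_id, weight in self.index[word].items():
                scores[image_id] += weight
        return sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]
```

Offline, build_vocabulary and add_image would be run over the database collection; at query time, the same local-feature extractor (e.g., SIFT or ORB, which the abstract does not fix) produces the descriptors passed to query. The unit weights above are only a placeholder for whatever weighting the paper's dictionary actually stores.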

Citation (APA)

Yin, C. Q., Mao, W., & Jiang, W. (2012). Very large-scale image retrieval based on local features. In Communications in Computer and Information Science (Vol. 304 CCIS, pp. 242–250). https://doi.org/10.1007/978-3-642-31837-5_36
