Learning query-specific distance functions for large-scale web image search


Abstract

Current Google image search adopts a hybrid search approach in which a text-based query (e.g., "Paris landmarks") is used to retrieve a set of relevant images, which are then refined by the user (e.g., by re-ranking the retrieved images based on similarity to a selected example). We conjecture that, given such hybrid image search engines, learning per-query distance functions over image features can improve the estimation of image similarity. We propose scalable solutions to learning query-specific distance functions by 1) adopting a simple large-margin learning framework, and 2) using the query logs of a text-based image search engine to train distance functions used in content-based systems. We evaluate the feasibility and efficacy of our proposed system through comprehensive human evaluation, and compare the results with the state-of-the-art image distance function used by Google image search. © 2013 IEEE.
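To make the large-margin idea concrete, here is a minimal sketch of query-specific distance learning, under assumptions not taken from the paper: a diagonal weighted squared distance d_w(x, y) = Σ_i w_i (x_i − y_i)², trained per query with a triplet hinge loss max(0, margin + d_w(q, pos) − d_w(q, neg)) via subgradient descent, where (pos, neg) pairs stand in for relevance signals that could be derived from query logs. The function names and parameters are illustrative only.

```python
def dist(w, x, y):
    """Weighted squared distance between feature vectors x and y."""
    return sum(wi * (xi - yi) ** 2 for wi, xi, yi in zip(w, x, y))

def learn_query_weights(query, triplets, dim, margin=1.0, lr=0.05, epochs=200):
    """Learn non-negative per-feature weights for a single query.

    triplets: list of (pos, neg) image-feature pairs, where pos should
    end up closer to `query` than neg (e.g., derived from click logs).
    """
    w = [1.0] * dim  # start from the plain (unweighted) distance
    for _ in range(epochs):
        for pos, neg in triplets:
            # Hinge loss is active when neg is not far enough beyond pos.
            if margin + dist(w, query, pos) - dist(w, query, neg) > 0:
                for i in range(dim):
                    # Subgradient of the active hinge term w.r.t. w[i].
                    grad = (query[i] - pos[i]) ** 2 - (query[i] - neg[i]) ** 2
                    w[i] = max(0.0, w[i] - lr * grad)  # project onto w >= 0
    return w
```

A usage sketch: for a query feature vector q with one relevant/irrelevant pair, the learned weights should pull the relevant image closer than the irrelevant one under the learned distance, even when the unweighted distance cannot separate them.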

Citation

Jing, Y., Covell, M., Tsai, D., & Rehg, J. M. (2013). Learning query-specific distance functions for large-scale web image search. IEEE Transactions on Multimedia, 15(8), 2022–2034. https://doi.org/10.1109/TMM.2013.2279663
