Scalable indexing for perceptual data

Abstract

In recent years, multimedia objects such as images, video, and audio have become increasingly widespread. Many applications require content-based retrieval, and distance measurement is a key component in such scenarios. The nature of multimedia requires perceptual similarity to be captured when computing the distance between objects. Measures such as the Euclidean distance, which use all attributes of a pair of objects, do not perform very well. Instead, distance measures that use partial matches between objects have been found to perform significantly better, because two multimedia objects can be considered perceptually similar when they closely match in some respects, even if they differ greatly in others. Existing distance measures that capture partial similarity have limitations, such as their non-metric nature, which makes scalable indexing challenging. In this paper, we propose the Partial Match Function, a distance measure that performs well for perceptual data and allows efficient indexing. © Springer-Verlag Berlin Heidelberg 2007.
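
The abstract contrasts full-attribute measures such as the Euclidean distance with partial-match measures. As a minimal illustrative sketch only (it does not reproduce the paper's Partial Match Function; the function partial_match_distance, the parameter k, and the example vectors are assumptions for illustration), the snippet below aggregates only the k best-matching attributes, so a pair of objects that agree closely in most respects remains close even when one attribute differs sharply:

```python
import numpy as np

def euclidean_distance(x, y):
    """Full-match distance: every attribute contributes."""
    return float(np.linalg.norm(x - y))

def partial_match_distance(x, y, k):
    """Illustrative partial-match distance (not the paper's Partial Match
    Function): aggregate only the k best-matching attributes, so objects
    that closely agree in some respects stay close even when they differ
    greatly in others."""
    diffs = np.abs(x - y)
    best_k = np.sort(diffs)[:k]           # k smallest per-attribute gaps
    return float(np.sqrt(np.sum(best_k ** 2)))

# Two feature vectors that match closely on most attributes but differ
# sharply on one (e.g., a lighting change between two images).
a = np.array([0.10, 0.20, 0.30, 0.40, 0.90])
b = np.array([0.11, 0.19, 0.31, 0.41, 0.10])

print(euclidean_distance(a, b))            # dominated by the single mismatch
print(partial_match_distance(a, b, k=4))   # small: the objects mostly agree
```

Note that dropping the worst-matching attributes is what typically breaks the triangle inequality, which is the non-metric behavior the abstract identifies as the obstacle to scalable indexing.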

Cite

APA

Qamra, A., & Chang, E. Y. (2007). Scalable indexing for perceptual data. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4577 LNCS, pp. 24–32). Springer Verlag. https://doi.org/10.1007/978-3-540-73417-8_6
