Evaluation of Jensen-Shannon distance over sparse data

Abstract

Jensen-Shannon divergence is a symmetrised, smoothed version of Kullback-Leibler divergence. It has been shown to be the square of a proper distance metric, and it has other properties which make it an excellent choice for many high-dimensional spaces in ℝⁿ. As defined, however, the metric is expensive to evaluate. In sparse spaces over many dimensions the intrinsic dimensionality of the metric space is typically very high, making similarity-based indexing ineffectual, and exhaustive search over large data collections may be infeasible. Using a property that allows the distance to be evaluated from only those dimensions which are non-zero in both arguments, and through the identification of a threshold function, we show that the cost of evaluating the function can be dramatically reduced. © 2013 Springer-Verlag.
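
The property referred to in the abstract can be illustrated with a short sketch. Rewriting the divergence with base-2 logarithms, every dimension that is non-zero in only one of two L1-normalised vectors contributes exactly half its value; since each vector sums to 1, those contributions collapse into a constant, leaving a sum over the shared non-zero dimensions only. The Python below is a minimal illustration of that identity under these assumptions; the dictionary representation and the name js_distance_sparse are ours, not the paper's, and the paper's threshold-function optimisation is not shown.

```python
import math

def js_distance_sparse(v, w):
    """Jensen-Shannon distance between two sparse, L1-normalised
    vectors given as {dimension: value} dicts with strictly
    positive values. Only dimensions present in BOTH vectors are
    visited; dimensions non-zero in a single argument are absorbed
    into the leading constant."""
    if len(v) > len(w):        # iterate over the smaller support
        v, w = w, v
    acc = 0.0
    for dim, x in v.items():
        y = w.get(dim)
        if y is not None:
            s = x + y
            # Per-dimension contribution, in bits (base-2 logs),
            # so that the divergence lies in [0, 1].
            acc += s - x * math.log2(2.0 * x / s) - y * math.log2(2.0 * y / s)
    divergence = 1.0 - 0.5 * acc
    # Clamp tiny negative values caused by floating-point error.
    return math.sqrt(max(divergence, 0.0))

# Example: vectors sharing one of their two non-zero dimensions.
v = {"a": 0.5, "b": 0.5}
w = {"b": 0.5, "c": 0.5}
print(js_distance_sparse(v, w))   # 0.7071..., i.e. sqrt(0.5)
```

For disjoint supports the loop body never executes and the distance is 1, its maximum; for identical vectors the accumulator reaches 2 and the distance is 0, so the sparse form agrees with the standard definition at both extremes.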

Citation (APA)

Connor, R., Cardillo, F. A., Moss, R., & Rabitti, F. (2013). Evaluation of Jensen-Shannon distance over sparse data. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8199 LNCS, pp. 163–168). https://doi.org/10.1007/978-3-642-41062-8_16
