Information distances versus entropy metric


Abstract

Information distance has become an important tool in a wide variety of applications. Various types of information distance have been introduced over the years. These information distance measures differ from the entropy metric: the former are based on Kolmogorov complexity, the latter on Shannon entropy. However, for any computable probability distribution, the expected value of Kolmogorov complexity equals the Shannon entropy up to an additive constant. We study an analogous relationship between entropy and information distance. We also study the relationship between entropy and the normalized versions of information distances.
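For reference, the standard formulations from the literature that the abstract alludes to can be sketched as follows (these are the usual definitions of information distance and its normalized version, together with the entropy relation stated in the abstract; the exact form of the additive constant is not specified here):

```latex
% Information distance between strings x and y,
% where K(. | .) is conditional Kolmogorov complexity:
E(x,y) = \max\{\, K(x \mid y),\; K(y \mid x) \,\}

% Normalized information distance:
\mathrm{NID}(x,y) = \frac{\max\{K(x \mid y),\, K(y \mid x)\}}{\max\{K(x),\, K(y)\}}

% Expected Kolmogorov complexity versus Shannon entropy,
% for a computable distribution P (the O(1) term depends on P):
\sum_{x} P(x)\, K(x) \;=\; H(P) + O(1)
```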


Hu, B., Bi, L., & Dai, S. (2017). Information distances versus entropy metric. Entropy, 19(6). https://doi.org/10.3390/e19060260
