Information distance has become an important tool in a wide variety of applications. Various information distances have been proposed over the years. These measures differ from the entropy metric: the former are based on Kolmogorov complexity, the latter on Shannon entropy. However, for any computable probability distribution, the expected value of the Kolmogorov complexity equals the Shannon entropy up to an additive constant. We study an analogous relationship between the entropy metric and information distance, as well as the relationship between entropy and the normalized versions of information distances.
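Since Kolmogorov complexity is uncomputable, information distances are approximated in practice with real compressors; the normalized compression distance (NCD) of Cilibrasi and Vitányi is the standard such proxy for the normalized information distance. The sketch below illustrates the idea, assuming zlib as the compressor (any off-the-shelf compressor could be substituted):

```python
import zlib


def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a computable proxy for the
    normalized information distance, with compressed length standing
    in for Kolmogorov complexity (zlib is an illustrative choice)."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    # Near 0 for highly similar inputs, near 1 for unrelated ones.
    return (cxy - min(cx, cy)) / max(cx, cy)


if __name__ == "__main__":
    a = b"the quick brown fox jumps over the lazy dog " * 50
    b = bytes(range(256)) * 10  # dissimilar, hard-to-compress data
    print(ncd(a, a), ncd(a, b))
```

Identical inputs yield a value close to 0, while unrelated inputs yield a value close to 1, mirroring the behavior of the ideal (uncomputable) distance.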
Hu, B., Bi, L., & Dai, S. (2017). Information distances versus entropy metric. Entropy, 19(6). https://doi.org/10.3390/e19060260