Two measures of dependence

Citations of this article: 18
Mendeley readers: 18

Abstract

Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
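As a rough numerical illustration of the ingredients named above, the following is a minimal Python sketch of a Rényi-divergence-based dependence quantity for discrete distributions. Note the simplification: the paper's first measure minimizes the order-α Rényi divergence D_α(P_XY ‖ Q_X × Q_Y) over all product distributions, whereas this sketch fixes Q_X and Q_Y to the marginals of P_XY. Both coincide with Shannon's mutual information I(X;Y) at α = 1; the function names and the example joint distribution below are purely illustrative.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q), in nats, for discrete
    distributions p and q (q must be positive wherever p is)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    if np.isclose(alpha, 1.0):
        # The alpha -> 1 limit is the Kullback-Leibler divergence.
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(
        np.log(np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha)))
        / (alpha - 1.0)
    )

def dependence_sketch(p_xy, alpha):
    """D_alpha between the joint P_XY and the product of its marginals.

    Simplified surrogate: the paper's measure optimizes over all
    product distributions Q_X x Q_Y; here they are fixed to the
    marginals. At alpha = 1 this equals the mutual information."""
    p_xy = np.asarray(p_xy, float)
    p_x = p_xy.sum(axis=1)          # marginal of X
    p_y = p_xy.sum(axis=0)          # marginal of Y
    product = np.outer(p_x, p_y)    # independent reference P_X x P_Y
    return renyi_divergence(p_xy.ravel(), product.ravel(), alpha)

# Example: a correlated binary pair (hypothetical numbers).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
for a in (0.5, 0.999, 1.0, 2.0):
    print(f"alpha = {a}: {dependence_sketch(p_xy, a):.4f} nats")
```

Running this shows the value varying smoothly with α and matching I(X;Y) ≈ 0.1927 nats near α = 1, consistent with the reduction to mutual information stated in the abstract.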

Citation (APA)

Lapidoth, A., & Pfister, C. (2019). Two measures of dependence. Entropy, 21(8). https://doi.org/10.3390/e21080778
