Abstract
Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
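For context, the two building blocks have standard closed-form definitions on a finite alphabet. The following is a minimal sketch of the usual forms of the Rényi divergence of order α and of the relative α-entropy in the sense of Sundaresan, assuming α > 0 and α ≠ 1; the paper's own notation and the precise minimizations defining the two dependence measures are given in the full text.

  D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \sum_x P(x)^{\alpha}\, Q(x)^{1 - \alpha}

  \mathscr{I}_\alpha(P \| Q) = \frac{\alpha}{1 - \alpha} \log \sum_x P(x)\, Q(x)^{\alpha - 1} \; - \; \frac{1}{1 - \alpha} \log \sum_x P(x)^{\alpha} \; + \; \log \sum_x Q(x)^{\alpha}

As α → 1, both expressions converge to the Kullback-Leibler divergence D(P \| Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)}, which is consistent with both dependence measures recovering Shannon's mutual information at α = 1.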
Citation
Lapidoth, A., & Pfister, C. (2019). Two measures of dependence. Entropy, 21(8), 778. https://doi.org/10.3390/e21080778