An estimator of mutual information and its application to independence testing

Abstract

This paper proposes a novel estimator of mutual information for discrete and continuous variables. The main feature of this estimator is that, for a large sample size n, it is zero if and only if the two variables are independent. The estimator works by constructing several histograms, computing a mutual information estimate for each, and selecting the maximum value. We prove that the number of histograms to be constructed is bounded by O(log n) and exploit this fact in the search. We compare the performance of the proposed estimator with that of an estimator based on the Hilbert-Schmidt independence criterion (HSIC), although the proposed method rests on the minimum description length (MDL) principle whereas the HSIC provides a statistical test. The proposed method completes its estimation in O(n log n) time, whereas the HSIC kernel computation requires O(n^3) time. We also present examples in which the HSIC fails to detect independence but the proposed method detects it successfully.
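The abstract only outlines the procedure, so the sketch below illustrates the general idea in Python: compute a plug-in mutual information estimate from 2-D histograms at a small number of resolutions, subtract an MDL-style complexity penalty from each, and keep the maximum. The dyadic bin schedule, the penalty term (m-1)^2 log(n)/(2n), and the function names are assumptions made for illustration; they are not the paper's exact construction.

```python
import numpy as np

def mi_histogram_estimate(x, y, bins):
    """Plug-in mutual information estimate from a 2-D histogram with `bins` bins per axis."""
    n = len(x)
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = counts / n
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])))

def penalized_mi(x, y):
    """Maximum, over dyadic histogram resolutions, of the plug-in MI estimate
    minus an MDL-style penalty (an illustrative sketch, not the paper's estimator)."""
    n = len(x)
    best = -np.inf
    k = 1
    # Bin counts 2, 4, 8, ... up to roughly sqrt(n): only O(log n) candidate histograms.
    while 2 ** k <= max(2, int(np.sqrt(n))):
        m = 2 ** k
        # Assumed MDL-style penalty: degrees of freedom of the joint model times log(n)/(2n).
        penalty = (m - 1) * (m - 1) * np.log(n) / (2 * n)
        best = max(best, mi_histogram_estimate(x, y, m) - penalty)
        k += 1
    return best
```

Under this reading of the abstract, a strictly positive value of `penalized_mi(x, y)` suggests dependence, while a value at or below zero is consistent with independence.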

Citation (APA)

Suzuki, J. (2016). An estimator of mutual information and its application to independence testing. Entropy, 18(4). https://doi.org/10.3390/e18040109
