Exact test of independence using mutual information

Abstract

Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, the null hypothesis being that the mutual information is zero (i.e., independence). The exact tests reported in the literature assume that data samples for each variable are sequentially independent and identically distributed (iid). In general, time series data have dependencies (Markov structure) that violate this condition. The algorithm given in this paper is the first exact significance test of mutual information that takes into account the Markov structure. When the Markov order is not known or indefinite, an exact test is used to determine an effective Markov order. ©2014 by the authors; licensee MDPI, Basel, Switzerland.
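To make the idea concrete, here is a simplified illustration (not the authors' algorithm): a Monte Carlo permutation test of mutual information that uses an ordinary random shuffle as the surrogate generator. This corresponds to the iid null hypothesis the abstract criticizes; the paper's contribution is to instead generate surrogates with prescribed transition counts so that the Markov structure of the data is preserved under the null. All function names below are our own.

```python
import math
import random
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of mutual information (in bits) between two symbol sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with counts converted to frequencies
        mi += (c / n) * math.log2((c * n) / (px[x] * py[y]))
    return mi

def shuffle_test(xs, ys, n_surrogates=1000, seed=0):
    """Permutation test p-value for H0: xs and ys are independent.

    NOTE: shuffling one sequence assumes iid samples. The paper's exact test
    instead draws surrogates with the same transition counts as the data,
    which is the correct null model when the series has Markov dependencies.
    """
    rng = random.Random(seed)
    observed = mutual_information(xs, ys)
    ys = list(ys)
    exceed = 0
    for _ in range(n_surrogates):
        rng.shuffle(ys)
        if mutual_information(xs, ys) >= observed:
            exceed += 1
    # add-one correction so the p-value is never exactly zero
    return (exceed + 1) / (n_surrogates + 1)
```

For perfectly coupled balanced binary sequences the observed MI is 1 bit, and almost no shuffled surrogate reaches it, so the test rejects independence; for data with serial dependence, this iid shuffle would be anti-conservative, which is precisely the problem the paper addresses.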

Citation (APA)

Pethel, S. D., & Hahs, D. W. (2014). Exact test of independence using mutual information. Entropy, 16(5), 2839–2849. https://doi.org/10.3390/e16052839
