Multivariate dependence beyond Shannon information


Abstract

Accurately determining dependency structure is critical to understanding a complex system's organization. We recently showed that the transfer entropy fails in a key aspect of this (measuring information flow) due to its conflation of dyadic and polyadic relationships. We extend this observation to demonstrate that Shannon information measures (entropy and mutual information, in their conditional and multivariate forms) can fail to accurately ascertain multivariate dependencies due to their conflation of qualitatively different relations among variables. This has broad implications, particularly when employing information to express the organization and mechanisms embedded in complex systems, including the burgeoning efforts to combine complex network theory with information theory. Here, we do not suggest that any aspect of information theory is wrong. Rather, the vast majority of its informational measures are simply inadequate for determining the meaningful relationships among variables within joint probability distributions. We close by demonstrating that such distributions exist across an arbitrary set of variables.
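
The conflation the abstract describes can be made concrete with a minimal sketch: two three-variable distributions, one built purely from pairwise-shared bits ("dyadic") and one from a globally shared bit plus an XOR constraint ("triadic"), on which standard Shannon measures take identical values. The specific encodings below are this sketch's reconstruction in the spirit of the paper's example, not outcomes quoted from the article.

```python
import itertools
from collections import Counter
from math import log2

def entropy(dist, idx):
    """Shannon entropy (bits) of the marginal over the variable indices in idx."""
    marg = Counter()
    for outcome, p in dist.items():
        marg[tuple(outcome[i] for i in idx)] += p
    return -sum(p * log2(p) for p in marg.values())

# "Dyadic" distribution: each pair of variables shares exactly one bit.
# From independent bits (a, b, c): X = (a, b), Y = (b, c), Z = (c, a).
dyadic = {}
for a, b, c in itertools.product((0, 1), repeat=3):
    dyadic[(2 * a + b, 2 * b + c, 2 * c + a)] = 1 / 8

# "Triadic" distribution: one bit shared by all three variables,
# plus three bits constrained by x ^ y ^ z == 0 (a purely three-way relation).
triadic = {}
for a, x, y in itertools.product((0, 1), repeat=3):
    z = x ^ y
    triadic[(2 * x + a, 2 * y + a, 2 * z + a)] = 1 / 8

for name, d in (("dyadic", dyadic), ("triadic", triadic)):
    H = lambda *idx: entropy(d, idx)
    print(name,
          "H(X)=%.1f" % H(0),
          "I(X:Y)=%.1f" % (H(0) + H(1) - H(0, 1)),
          "I(X:Y|Z)=%.1f" % (H(0, 2) + H(1, 2) - H(0, 1, 2) - H(2)))
```

Both rows print the same values (2.0, 1.0, 1.0), and by symmetry the same holds for the other variable pairs, even though the first distribution contains only pairwise dependencies and the second only a three-way one. No combination of these Shannon quantities distinguishes the two structures, which is the inadequacy the abstract points to.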

Citation (APA)

James, R. G., & Crutchfield, J. P. (2017). Multivariate dependence beyond Shannon information. Entropy, 19(10), 531. https://doi.org/10.3390/e19100531
