Quantifying Synergistic Mutual Information

  • Griffith V
  • Koch C

Abstract

Quantifying cooperation or synergy among random variables in predicting a single target random variable is an important problem in many complex systems. We review three prior information-theoretic measures of synergy and introduce a novel synergy measure defined as the difference between the whole and the union of its parts. We apply all four measures to a suite of binary circuits to demonstrate that our measure alone quantifies the intuitive concept of synergy across all examples. We also show that, under our measure of synergy, independent predictors can have positive redundant information.
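As an illustrative sketch (not drawn from the paper itself), the XOR gate is the canonical binary-circuit example of synergy that measures like the one above must capture: with two independent uniform input bits and `Y = X1 XOR X2`, each input alone carries zero mutual information about the output, yet the pair together determines it completely.

```python
from itertools import product
from math import log2
from collections import defaultdict

def mutual_information(pairs):
    """I(A;B) in bits, estimated from a list of equiprobable (a, b) outcomes."""
    n = len(pairs)
    p_ab, p_a, p_b = defaultdict(float), defaultdict(float), defaultdict(float)
    for a, b in pairs:
        p_ab[(a, b)] += 1 / n
        p_a[a] += 1 / n
        p_b[b] += 1 / n
    return sum(p * log2(p / (p_a[a] * p_b[b])) for (a, b), p in p_ab.items())

# All four equiprobable input states of the XOR circuit: (x1, x2, y).
states = [(x1, x2, x1 ^ x2) for x1, x2 in product((0, 1), repeat=2)]

i_x1 = mutual_information([(x1, y) for x1, _, y in states])            # 0.0 bits
i_x2 = mutual_information([(x2, y) for _, x2, y in states])            # 0.0 bits
i_joint = mutual_information([((x1, x2), y) for x1, x2, y in states])  # 1.0 bit
```

Since the joint information (1 bit) exceeds anything attributable to the individual inputs (0 bits each), the full bit is synergistic; this is the intuition the four measures discussed in the paper formalize in different ways.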

Citation (APA)
Griffith, V., & Koch, C. (2014). Quantifying Synergistic Mutual Information (pp. 159–190). https://doi.org/10.1007/978-3-642-53734-9_6
