Quantifying cooperation, or synergy, among random variables in predicting a single target random variable is an important problem in many complex systems. We review three prior information-theoretic measures of synergy and introduce a novel synergy measure defined as the difference between the whole and the union of its parts. We apply all four measures to a suite of binary circuits to demonstrate that our measure alone quantifies the intuitive concept of synergy across all examples. We show that, under our measure of synergy, independent predictors can have positive redundant information.
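The canonical example of synergy is the XOR circuit: neither input alone reveals anything about the output, yet the two inputs together determine it completely. The short sketch below illustrates that intuition using only standard mutual information quantities; it is not the paper's full measure (whose union-information term requires an optimization over joint distributions), and the helper function mutual_information and the sample construction are illustrative choices, not code from the paper.

# Minimal sketch of the XOR synergy intuition (assumed illustrative code,
# not the S_VK measure itself).
from collections import Counter
from math import log2

def mutual_information(pairs):
    """I(A;B) in bits from a list of (a, b) samples assumed equiprobable."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    return sum((c / n) * log2((c / n) / ((p_a[a] / n) * (p_b[b] / n)))
               for (a, b), c in p_ab.items())

# XOR circuit: all four input combinations equally likely.
samples = [((x1, x2), x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]

i_whole = mutual_information(samples)                            # I(X1,X2;Y) = 1 bit
i_x1 = mutual_information([(x1, y) for (x1, _), y in samples])   # I(X1;Y) = 0 bits
i_x2 = mutual_information([(x2, y) for (_, x2), y in samples])   # I(X2;Y) = 0 bits

print(i_whole, i_x1, i_x2)  # 1.0 0.0 0.0 -> all of the information is synergistic

Because each individual predictor carries zero information about the output while the pair carries one bit, any measure defined as the whole minus the union of its parts assigns one bit of synergy to XOR, matching the intuitive answer.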
CITATION STYLE
Griffith, V., & Koch, C. (2014). Quantifying Synergistic Mutual Information (pp. 159–190). https://doi.org/10.1007/978-3-642-53734-9_6