Recently, a series of papers addressed the problem of decomposing the information that two random variables carry about a third into shared information, unique information and synergistic information. Several measures have been proposed, although no consensus has been reached so far. Here, we compare these proposals with an older approach that defines synergistic information via projections onto exponential families containing only interactions up to k-th order. We show that the projection-based measures are not compatible with a decomposition into unique, shared and synergistic information if one requires that all terms are always non-negative (local positivity). We illustrate the difference between the two approaches for multivariate Gaussians.
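
As a point of reference, a standard bivariate formulation of such a decomposition (following, e.g., Williams and Beer; the notation below is generic and not necessarily that of the paper) writes the mutual information between a target $S$ and two predictors $X_1, X_2$ as

\begin{align*}
I(S; X_1, X_2) &= SI(S; X_1, X_2) + UI(S; X_1 \setminus X_2) + UI(S; X_2 \setminus X_1) + CI(S; X_1, X_2),\\
I(S; X_1)      &= SI(S; X_1, X_2) + UI(S; X_1 \setminus X_2),\\
I(S; X_2)      &= SI(S; X_1, X_2) + UI(S; X_2 \setminus X_1),
\end{align*}

where $SI$, $UI$ and $CI$ denote shared, unique and synergistic information, respectively, and local positivity is the requirement that all four terms be non-negative. The projection-based approach instead quantifies interactions beyond order $k$ through the information projection of the joint distribution $p$ onto the exponential family $\mathcal{E}_k$ of distributions containing only interactions up to order $k$,

\[
p^{(k)} = \operatorname{arg\,min}_{q \in \mathcal{E}_k} D(p \,\|\, q),
\]

so that the divergence $D(p \,\|\, p^{(k)}) \geq 0$ measures the information carried by interactions of order higher than $k$. This is only a sketch of the two generic frameworks being compared; the specific measures discussed in the paper differ in how $SI$, $UI$ and $CI$ are defined.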
Olbrich, E., Bertschinger, N., & Rauh, J. (2015). Information decomposition and synergy. Entropy, 17(5), 3501–3517. https://doi.org/10.3390/e17053501