We consider the problem of quantifying the information shared by a pair of random variables X1, X2 about another variable S. We propose a new measure of shared information, called extractable shared information, that is left monotonic; that is, the information shared about S is bounded from below by the information shared about f(S) for any function f. We show that our measure leads to a new nonnegative decomposition of the mutual information I(S; X1X2) into shared, complementary, and unique components. We study properties of this decomposition and show that a left monotonic shared information is not compatible with a Blackwell interpretation of unique information. We also discuss whether it is possible to have a decomposition in which both shared and unique information are left monotonic.
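For concreteness, the two central statements of the abstract can be written out explicitly. The following is a minimal LaTeX sketch using the common partial information decomposition notation, with SI, UI, and CI standing for the shared, unique, and complementary terms; these symbol names are an assumption for illustration and are not taken from the abstract itself.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Nonnegative decomposition of the mutual information I(S; X_1 X_2)
% into shared (SI), unique (UI), and complementary (CI) components:
\begin{equation*}
  I(S; X_1 X_2)
    = SI(S; X_1, X_2)
    + UI(S; X_1 \setminus X_2)
    + UI(S; X_2 \setminus X_1)
    + CI(S; X_1, X_2).
\end{equation*}
% Left monotonicity of the shared information: coarse-graining the
% target S by any function f cannot increase the shared information.
\begin{equation*}
  SI(S; X_1, X_2) \ge SI\bigl(f(S); X_1, X_2\bigr)
  \quad \text{for every function } f.
\end{equation*}
\end{document}

The first display is the nonnegative decomposition mentioned in the abstract; the second inequality is the left monotonicity property that the proposed extractable shared information is required to satisfy.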
Rauh, J., Banerjee, P. K., Olbrich, E., Jost, J., & Bertschinger, N. (2017). On extractable shared information. Entropy, 19(7). https://doi.org/10.3390/e19070328