This work investigates the intersection property of conditional independence. It states that for random variables $A, B, C$ and $X$, the statements $X \perp\!\!\!\perp A \mid B, C$ and $X \perp\!\!\!\perp B \mid A, C$ together imply $X \perp\!\!\!\perp (A, B) \mid C$. Here, "$\perp\!\!\!\perp$" denotes statistical independence. Under the assumption that the joint distribution has a density that is continuous in $A$, $B$ and $C$, we provide necessary and sufficient conditions under which the intersection property holds. The result has direct applications to causal inference: it leads to strictly weaker conditions under which the graphical structure becomes identifiable from the joint distribution of an additive noise model.
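For orientation, the property and the classical counterexample motivating positivity-type assumptions can be written out explicitly; this sketch is not part of the original abstract.

$$ X \perp\!\!\!\perp A \mid B, C \quad\text{and}\quad X \perp\!\!\!\perp B \mid A, C \;\;\Longrightarrow\;\; X \perp\!\!\!\perp (A, B) \mid C. $$

Without further assumptions on the joint distribution, the implication can fail: take $C$ trivial and $X = A = B$ with $P(X = 0) = P(X = 1) = 1/2$. Given $B$, both $X$ and $A$ are constant, so $X \perp\!\!\!\perp A \mid B$ and $X \perp\!\!\!\perp B \mid A$ hold, yet $X$ is clearly not independent of $(A, B)$.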
Peters, J. (2014). On the Intersection Property of Conditional Independence and its Application to Causal Discovery. Journal of Causal Inference, 3(1), 97–108. https://doi.org/10.1515/jci-2014-0015