Efficient probabilistic inference is key to the success of statistical relational learning. One issue that increases the cost of inference is the presence of irrelevant random variables. The Bayes-ball algorithm can identify the requisite variables in a propositional Bayesian network and thus ignore irrelevant variables. This paper presents a lifted version of Bayes-ball, which works directly at the first-order level, and shows how this algorithm applies to (lifted) inference in directed first-order probabilistic models. © 2010 Springer-Verlag Berlin Heidelberg.
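As background for the abstract, here is a minimal sketch of the classical (propositional) Bayes-ball reachability test that the paper lifts to the first-order level. The graph encoding (`parents`/`children` dictionaries) and the example v-structure network are illustrative assumptions, not taken from the paper.

```python
def bayes_ball_reachable(start, observed, parents, children):
    """Return the set of unobserved nodes d-connected to `start`
    given the evidence set `observed` (classical Bayes-ball rules)."""
    visited = set()                 # (node, direction) pairs already expanded
    frontier = [(start, "child")]   # treat the query node as reached from a child
    reachable = set()
    while frontier:
        node, came_from = frontier.pop()
        if (node, came_from) in visited:
            continue
        visited.add((node, came_from))
        if node not in observed:
            reachable.add(node)
        if came_from == "child":
            if node not in observed:
                # ball passes through: visit parents and children
                frontier += [(p, "child") for p in parents.get(node, [])]
                frontier += [(c, "parent") for c in children.get(node, [])]
        else:  # ball arrived from a parent
            if node in observed:
                # an observed node bounces the ball back to its parents
                frontier += [(p, "child") for p in parents.get(node, [])]
            else:
                # an unobserved node passes the ball on to its children
                frontier += [(c, "parent") for c in children.get(node, [])]
    return reachable

# Hypothetical example: the v-structure A -> C <- B.
parents = {"A": [], "B": [], "C": ["A", "B"]}
children = {"A": ["C"], "B": ["C"], "C": []}

# With no evidence, A cannot reach B; observing C d-connects A and B.
print(bayes_ball_reachable("A", set(), parents, children))
print(bayes_ball_reachable("A", {"C"}, parents, children))
```

Nodes never visited by the ball are irrelevant to the query and can be pruned before inference; the paper's contribution is performing this pruning on the first-order model without grounding it.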
CITATION STYLE
Meert, W., Taghipour, N., & Blockeel, H. (2010). First-order Bayes-ball. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6322 LNAI, pp. 369–384). https://doi.org/10.1007/978-3-642-15883-4_24