Error bounds between marginal probabilities and beliefs of loopy belief propagation algorithm

Abstract

The belief propagation (BP) algorithm has become an increasingly popular method for probabilistic inference on general graphical models. When the network contains loops, the algorithm is called loopy BP (LBP); in this case it may fail to converge and, even when it does converge, its beliefs, i.e., the outputs of the algorithm, may differ from the exact marginal probabilities. Tatikonda and Jordan applied the theory of Gibbs measures to the LBP algorithm and derived a sufficient condition for convergence. In this paper, we use the theory of Gibbs measures to investigate the discrepancy between a marginal probability and the corresponding belief. In particular, we obtain an error bound when the algorithm converges under a certain condition, which is a general result on the accuracy of the algorithm. We also perform numerical experiments to demonstrate the effectiveness of the result. © Springer-Verlag Berlin Heidelberg 2006.
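The gap between LBP beliefs and exact marginal probabilities described in the abstract can be measured directly on a small loopy model. The sketch below is an illustrative example of my own (the potentials, graph, and iteration schedule are assumptions, not the paper's construction): it runs sum-product LBP on a 3-node binary cycle, then compares the converged beliefs against marginals computed by brute-force enumeration.

```python
import itertools
import numpy as np

# Minimal sum-product loopy BP on a pairwise binary MRF over a 3-node cycle.
# All potentials are randomly generated illustrative assumptions.
np.random.seed(0)
n = 3
edges = [(0, 1), (1, 2), (2, 0)]
unary = [np.random.rand(2) + 0.5 for _ in range(n)]       # node potentials
pair = {e: np.random.rand(2, 2) + 0.5 for e in edges}     # edge potentials

def psi(i, j, xi, xj):
    # Look up the pairwise potential regardless of edge orientation.
    if (i, j) in pair:
        return pair[(i, j)][xi, xj]
    return pair[(j, i)][xj, xi]

neighbors = {i: [j for e in edges for j in e if i in e and j != i]
             for i in range(n)}
# m[(i, j)] is the message from node i to node j, initialized uniform.
m = {(i, j): np.ones(2) / 2 for i in range(n) for j in neighbors[i]}

for _ in range(100):  # fixed-point iteration of the LBP update
    new = {}
    for (i, j) in m:
        msg = np.zeros(2)
        for xj in range(2):
            for xi in range(2):
                prod = unary[i][xi] * psi(i, j, xi, xj)
                for k in neighbors[i]:
                    if k != j:
                        prod *= m[(k, i)][xi]
                msg[xj] += prod
        new[(i, j)] = msg / msg.sum()
    if max(np.abs(new[k] - m[k]).max() for k in m) < 1e-10:
        m = new
        break
    m = new

# Beliefs: normalized product of the unary potential and incoming messages.
beliefs = []
for i in range(n):
    b = unary[i].copy()
    for k in neighbors[i]:
        b *= m[(k, i)]
    beliefs.append(b / b.sum())

# Exact marginals by enumerating all 2^n joint configurations.
Z, marg = 0.0, np.zeros((n, 2))
for x in itertools.product(range(2), repeat=n):
    p = np.prod([unary[i][x[i]] for i in range(n)])
    p *= np.prod([psi(i, j, x[i], x[j]) for (i, j) in edges])
    Z += p
    for i in range(n):
        marg[i, x[i]] += p
marg /= Z

err = max(abs(beliefs[i][0] - marg[i, 0]) for i in range(n))
print("max |belief - marginal| =", err)
```

Because the graph has a cycle, the beliefs are generally not exactly equal to the marginals; the printed discrepancy is precisely the quantity the paper's error bound controls.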

Citation (APA)

Taga, N., & Mase, S. (2006). Error bounds between marginal probabilities and beliefs of loopy belief propagation algorithm. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4293 LNAI, pp. 186–196). Springer Verlag. https://doi.org/10.1007/11925231_18
