Belief Propagation, Mean-Field, and Bethe Approximations

  • Yuille A

Abstract

This chapter describes methods for estimating the marginals and maximum a posteriori (MAP) estimates of probability distributions defined over graphs by approximate methods including Mean Field Theory (MFT), variational methods, and belief propagation. These methods typically formulate this problem in terms of minimizing a free energy function of pseudomarginals. They differ in the design of the free energy and the choice of algorithm to minimize it. These algorithms can often be interpreted in terms of message passing. In many cases, the free energy has a dual formulation and the algorithms are defined over the dual variables (e.g., the messages in belief propagation). The quality of performance depends on the types of free energies used – specifically how well they approximate the log partition function of the probability distribution – and whether there are suitable algorithms for finding their minima. We start in section (II) by introducing two types of Markov Random Field models that are often used in computer vision. We proceed to define MFT/variational methods in section (III), whose free energies are lower bounds of the log partition function, and describe how inference can be done by expectation-maximization, steepest descent, or discrete iterative algorithms. The following section (IV) describes message-passing algorithms, such as belief propagation and its generalizations, which can be related to free energy functions (and dual variables). Finally, in section (V) we describe how these methods relate to Markov Chain Monte Carlo (MCMC) approaches, which gives a different way to think of these methods and can lead to novel algorithms.

0.2 Two Models

We start by presenting two important probabilistic vision models which will be used to motivate the algorithms described in the rest of the section. The first type of model is formulated as a standard Markov Random Field (MRF) with input z and output x.
We will describe two vision applications for this model. The first application is image labeling, where z = {z_i : i ∈ D} specifies the intensity values z_i ∈ {0, ..., 255} on the image lattice D and x = {x_i : i ∈ D} is a set of image labels x_i ∈ L, see figure (1). The nature of the labels will depend on the problem. For edge detection, |L| = 2 and the labels l_1, l_2 will correspond to 'edge' and 'non-edge'. For labeling the MSRC dataset [36], |L| = 23 and the labels l_1, ..., l_23 include 'sky', 'grass', and so on. A second application is binocular stereo, see figure (2), where the input is the input images to the left and right cameras, z = (z_L
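To make the message-passing idea from the abstract concrete, here is a minimal sketch (not code from the chapter) of sum-product belief propagation on a chain-structured MRF, where the algorithm computes marginals exactly. The function names, the shared pairwise potential, and the brute-force checker are all illustrative assumptions:

```python
import itertools
import numpy as np

def chain_bp_marginals(unary, pairwise):
    """Node marginals on a chain MRF via sum-product belief propagation.

    unary    : (n, L) array of unary potentials phi_i(x_i) > 0
    pairwise : (L, L) array, shared pairwise potential psi(x_i, x_{i+1}) > 0
    """
    n, L = unary.shape
    fwd = np.ones((n, L))   # fwd[i] = message passed from node i-1 into node i
    bwd = np.ones((n, L))   # bwd[i] = message passed from node i+1 into node i
    for i in range(1, n):                # left-to-right pass
        m = (unary[i - 1] * fwd[i - 1]) @ pairwise
        fwd[i] = m / m.sum()             # normalize for numerical stability
    for i in range(n - 2, -1, -1):       # right-to-left pass
        m = pairwise @ (unary[i + 1] * bwd[i + 1])
        bwd[i] = m / m.sum()
    belief = unary * fwd * bwd           # belief = product of incoming messages
    return belief / belief.sum(axis=1, keepdims=True)

def brute_marginals(unary, pairwise):
    """Marginals by explicit summation over all L**n states (to check BP)."""
    n, L = unary.shape
    probs = np.zeros((n, L))
    for x in itertools.product(range(L), repeat=n):
        p = np.prod([unary[i, x[i]] for i in range(n)])
        p *= np.prod([pairwise[x[i], x[i + 1]] for i in range(n - 1)])
        for i, xi in enumerate(x):
            probs[i, xi] += p
    return probs / probs.sum(axis=1, keepdims=True)
```

On a tree (here, a chain) the Bethe free energy is exact, so the BP beliefs coincide with the true marginals; on graphs with loops the same message updates only give approximations, which is the setting discussed in section (IV).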

Yuille, A. (2009). Belief Propagation, Mean-Field, and Bethe Approximations. Unspecified, 1(2), 1–14.
