Informed Lifting for Message-Passing

Abstract

Lifted inference, handling whole sets of indistinguishable objects together, is critical to the effective application of probabilistic relational models to realistic real-world tasks. Recently, lifted belief propagation (LBP) has been proposed as an efficient approximate solution to this inference problem. It runs a modified BP on a lifted network in which nodes have been grouped together if they have, roughly speaking, identical computation trees, i.e., the tree-structured unrollings of the underlying graph rooted at the nodes. In many situations, this purely syntactic criterion is too pessimistic, because message errors decay along paths: in a long chain graph with weak edge potentials, distant nodes will send and receive identical messages even though their computation trees are quite different. To overcome this, we propose iLBP, a novel, easy-to-implement, informed LBP approach that interleaves lifting and modified BP iterations. This lets us efficiently monitor the actual BP messages sent and received in each iteration and group nodes accordingly. As our experiments show, iLBP can yield significantly more lifted networks, and hence faster inference, without degrading performance. Above all, we show that iLBP is faster than BP when solving the problem of distributing data to a large network, an important real-world application in which BP is faster than uninformed LBP.
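The following is a minimal, self-contained Python sketch of the idea the abstract describes, not the authors' iLBP implementation: run BP and, after every iteration, group nodes whose unary potentials and (rounded) incoming messages coincide. All function and parameter names (bp_with_message_grouping, coupling, decimals, the chain model itself) are illustrative assumptions chosen for the example.

import numpy as np

def bp_with_message_grouping(n_nodes=20, coupling=0.1, iters=10, decimals=4):
    # Pairwise potential on a binary chain: exp(+J) if states agree, exp(-J) otherwise.
    psi = np.exp(coupling * np.array([[1.0, -1.0], [-1.0, 1.0]]))

    # Unary potentials: weak evidence at node 0, uniform everywhere else.
    phi = np.full((n_nodes, 2), 0.5)
    phi[0] = [0.9, 0.1]

    edges = [(i, i + 1) for i in range(n_nodes - 1)]
    neighbours = {v: [] for v in range(n_nodes)}
    for i, j in edges:
        neighbours[i].append(j)
        neighbours[j].append(i)

    # msgs[(i, j)] is the message node i sends to neighbour j, initialised uniformly.
    msgs = {}
    for i, j in edges:
        msgs[(i, j)] = np.full(2, 0.5)
        msgs[(j, i)] = np.full(2, 0.5)

    for it in range(iters):
        # One synchronous sum-product iteration.
        new_msgs = {}
        for (i, j) in msgs:
            incoming = phi[i].copy()
            for k in neighbours[i]:
                if k != j:
                    incoming = incoming * msgs[(k, i)]
            m = psi.T @ incoming          # sum over the sender's states
            new_msgs[(i, j)] = m / m.sum()
        msgs = new_msgs

        # "Informed" grouping step: nodes with the same unary potential and the same
        # multiset of rounded incoming messages are indistinguishable to BP so far.
        groups = {}
        for v in range(n_nodes):
            inbox = sorted(np.round(msgs[(k, v)], decimals).tobytes()
                           for k in neighbours[v])
            sig = (phi[v].tobytes(), tuple(inbox))
            groups.setdefault(sig, []).append(v)
        print(f"iteration {it + 1:2d}: {len(groups)} supernodes cover {n_nodes} nodes")

if __name__ == "__main__":
    bp_with_message_grouping()

With the weak coupling used here, the evidence at node 0 decays within a few hops, so after a few iterations most interior nodes collapse into a single supernode even though their computation trees all differ; this is exactly the situation in which the purely syntactic lifting criterion is too pessimistic.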

Cite (APA)

Kersting, K., El Massaoudi, Y., Ahmadi, B., & Hadiji, F. (2010). Informed Lifting for Message-Passing. In Proceedings of the 24th AAAI Conference on Artificial Intelligence, AAAI 2010 (pp. 1181–1186). AAAI Press. https://doi.org/10.1609/aaai.v24i1.7759
