A generic approach to accelerating belief propagation based incomplete algorithms for DCOPs via a branch-and-bound technique

7 Citations · 15 Readers (Mendeley)

Abstract

Belief propagation approaches, such as Max-Sum and its variants, are important methods for solving large-scale Distributed Constraint Optimization Problems (DCOPs). However, for problems with n-ary constraints, these algorithms face a major challenge: their computational complexity scales exponentially with the number of variables in each function. In this paper, we present a generic and easy-to-use method based on a branch-and-bound technique, called Function Decomposing and State Pruning (FDSP), to address this issue. We theoretically prove that FDSP provides monotonically non-increasing upper bounds and speeds up belief propagation based incomplete DCOP algorithms without affecting solution quality. Our empirical evaluation further indicates that, compared with the state-of-the-art, FDSP prunes at least 97% of the search space and effectively accelerates Max-Sum.
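To give a concrete flavor of the idea the abstract describes, the sketch below shows branch-and-bound pruning applied to a single entry of a Max-Sum function-to-variable message, where the naive computation would enumerate all assignments of the other variables in an n-ary constraint. This is a minimal illustrative sketch, not the paper's FDSP algorithm: the utility function, the admissible bound `utility_ub`, and the message layout `q_msgs` are assumptions made for the example.

```python
# Minimal branch-and-bound sketch for one entry of a Max-Sum
# function-to-variable message (illustrative only; not the paper's
# FDSP decomposition). Assumptions: `utility` maps a full assignment
# (a dict) to a value, `utility_ub` is any admissible upper bound on
# utility over completions of a partial assignment, and
# `q_msgs[v][val]` are the incoming variable-to-function messages.
from itertools import product


def message_entry(utility, utility_ub, domains, q_msgs, target, target_val):
    """max over assignments with `target` fixed to `target_val` of
    utility(assignment) + sum_{v != target} q_msgs[v][assignment[v]],
    pruning partial assignments whose optimistic bound cannot beat
    the best complete assignment found so far."""
    others = [v for v in domains if v != target]
    best_q = {v: max(q_msgs[v].values()) for v in others}  # optimistic q part

    best = float("-inf")
    assignment = {target: target_val}

    def search(i, q_sum):
        nonlocal best
        if i == len(others):
            best = max(best, utility(assignment) + q_sum)
            return
        # Optimistic bound: remaining variables take their best q values,
        # and utility is bounded from above on the current partial assignment.
        bound = q_sum + sum(best_q[v] for v in others[i:]) + utility_ub(assignment)
        if bound <= best:
            return  # prune: this branch cannot improve on the incumbent
        v = others[i]
        for val in domains[v]:
            assignment[v] = val
            search(i + 1, q_sum + q_msgs[v][val])
        del assignment[v]

    search(0, 0.0)
    return best


# Tiny usage example: a 3-ary constraint over binary variables.
if __name__ == "__main__":
    domains = {"x1": [0, 1], "x2": [0, 1], "x3": [0, 1]}
    table = {a: float(sum(a)) for a in product([0, 1], repeat=3)}  # toy utility
    util = lambda asg: table[(asg["x1"], asg["x2"], asg["x3"])]
    util_ub = lambda asg: 3.0  # trivially admissible upper bound
    q = {"x2": {0: 0.1, 1: 0.4}, "x3": {0: 0.2, 1: 0.0}}
    print(message_entry(util, util_ub, domains, q, "x1", 1))  # -> 3.4
```

Tighter upper bounds (such as the monotonically non-increasing bounds the paper proves for FDSP) prune more branches and yield larger speedups, which is the source of the reported search-space reduction.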

Citation (APA)

Chen, Z., Jiang, X., Deng, Y., Chen, D., & He, Z. (2019). A generic approach to accelerating belief propagation based incomplete algorithms for DCOPs via a branch-and-bound technique. In 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019 (pp. 6038–6045). AAAI Press. https://doi.org/10.1609/aaai.v33i01.33016038
