Lifted inference for convex quadratic programs


Abstract

Symmetry is the essential element of lifted inference, which has recently demonstrated that very efficient inference is possible in highly connected but symmetric probabilistic models. This raises the question of whether the same holds for optimization problems in general. Here we show that for a large class of optimization methods this is indeed the case. Specifically, we introduce the concept of fractional symmetries of convex quadratic programs (QPs), which lie at the heart of many AI and machine learning approaches, and exploit it to lift, i.e., to compress, QPs. These lifted QPs can then be tackled with the usual optimization toolbox (off-the-shelf solvers, cutting-plane algorithms, stochastic gradients, etc.). If the original QP exhibits symmetry, the lifted one will generally be more compact, and hence more efficient to solve.
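To make the lifting idea concrete, below is a minimal sketch, not the paper's actual algorithm: the helper lift_qp, the hand-specified orbit partition, and the toy data are all illustrative assumptions. It compresses an unconstrained convex QP by substituting x = Py, where P is the 0/1 indicator matrix of a partition of the variables into interchangeable orbits; the paper's fractional symmetries generalize such partitions.

```python
import numpy as np

def lift_qp(Q, c, orbits):
    # Hypothetical helper, for illustration only. Given a convex QP
    #   min_x 1/2 x'Qx + c'x
    # whose variables split into orbits of interchangeable roles,
    # substitute x = P y (P = 0/1 orbit-indicator matrix) to obtain
    # an equivalent QP with one variable per orbit.
    n, k = Q.shape[0], len(orbits)
    P = np.zeros((n, k))
    for j, orbit in enumerate(orbits):
        P[list(orbit), j] = 1.0
    return P.T @ Q @ P, P.T @ c, P

# Toy instance (assumed data): 6 variables in two interchangeable
# groups of 3; Q and c are invariant under permutations within groups.
n = 6
orbits = [range(0, 3), range(3, 6)]
Q = np.eye(n) + 0.1 * np.ones((n, n))   # strictly convex, orbit-symmetric
c = np.array([1., 1., 1., -2., -2., -2.])

Q_lift, c_lift, P = lift_qp(Q, c, orbits)
y = np.linalg.solve(Q_lift, -c_lift)    # solve the 2-variable lifted QP
x_full = np.linalg.solve(Q, -c)         # solve the 6-variable original QP

# Expanding the lifted optimum recovers the ground optimum.
assert np.allclose(P @ y, x_full)
```

Constraints lift by the same substitution (Ax <= b becomes APy <= b, with duplicate rows merged), after which, as the abstract notes, any off-the-shelf solver can be applied to the smaller problem.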

Citation (APA)
Mladenov, M., Kleinhans, L., & Kersting, K. (2017). Lifted inference for convex quadratic programs. In Proceedings of the 31st AAAI Conference on Artificial Intelligence (AAAI 2017) (pp. 2350–2356). AAAI Press. https://doi.org/10.1609/aaai.v31i1.10841
