Reducing Symmetry in Matrix Models

  • Kiziltan Z

Abstract

Symmetry in a CSP is a permutation of the variables, of the values in their domains, or of both, that preserves the state of the search: either all symmetric states lead to a solution or none does. Eliminating symmetry is therefore essential to avoid exploring equivalent branches of the search tree. An important class of symmetries in constraint programming arises from matrices of decision variables in which any two rows, as well as any two columns, can be interchanged. Eliminating all such symmetries is not easy, as the effort required may be exponential. We are therefore interested in reducing a significant amount of row and column symmetry in matrix models with polynomial effort. We have shown that lexicographically ordering both the rows and the columns of a matrix model removes much of this symmetry: for an n × n matrix model with row and column symmetry, O(n) lexicographic ordering constraints between adjacent rows and adjacent columns are imposed. We have also shown that decomposing a lexicographic ordering constraint between a pair of vectors carries a penalty either in the amount or in the cost of constraint propagation. We have therefore developed a linear-time algorithm that enforces global consistency on a lexicographic ordering constraint between two vectors. Our experiments confirm the efficiency and value of this new global constraint.
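
The following is a minimal sketch, not the paper's implementation: it illustrates the double-lex idea on a hypothetical n × n 0/1 matrix model by brute-force enumeration, counting how many assignments survive the lexicographic ordering constraints between adjacent rows and adjacent columns. The model, the dimension n, and the enumeration approach are assumptions for the demo only.

```python
# Sketch of double-lex symmetry breaking on a small 0/1 matrix model
# (hypothetical example; any row/column-symmetric model would do).
from itertools import product

n = 3  # matrix dimension, chosen small so full enumeration is feasible

def rows(flat):
    # Split a flat assignment of n*n values into its n rows.
    return [flat[i * n:(i + 1) * n] for i in range(n)]

def cols(flat):
    # Extract the n columns from a flat assignment.
    return [flat[i::n] for i in range(n)]

def double_lex(flat):
    # O(n) constraints: adjacent rows and adjacent columns must be
    # lexicographically non-decreasing (tuple comparison is lex order).
    r, c = rows(flat), cols(flat)
    return all(r[i] <= r[i + 1] for i in range(n - 1)) and \
           all(c[i] <= c[i + 1] for i in range(n - 1))

all_matrices = list(product((0, 1), repeat=n * n))
kept = [m for m in all_matrices if double_lex(m)]
print(len(all_matrices), "assignments in total")
print(len(kept), "remain after double-lex symmetry breaking")
```

For the 3 × 3 case the 512 unconstrained assignments shrink substantially, which mirrors the abstract's point: double-lex removes much, though not all, of the row and column symmetry at polynomial cost.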

Citation (APA)

Kiziltan, Z. (2002). Reducing Symmetry in Matrix Models (p. 786). https://doi.org/10.1007/3-540-46135-3_80
