We consider the problem of partitioning n integers into two subsets of given cardinalities such that the discrepancy (the absolute value of the difference of their sums) is minimized. The integers are i.i.d. random variables chosen uniformly from the set {1, ..., M}. We study how the typical behavior of the optimal partition depends on n, M and the bias s, the difference between the cardinalities of the two subsets in the partition. In particular, we rigorously establish this typical behavior as a function of the two parameters κ := n⁻¹ log₂ M and b := |s|/n by proving the existence of three distinct "phases" in the κb-plane, characterized by the value of the discrepancy and the number of optimal solutions: a "perfect phase" with exponentially many optimal solutions with discrepancy 0 or 1; a "hard phase" with minimal discrepancy of order M e^{-Θ(n)}; and a "sorted phase" with a unique optimal partition, of discrepancy of order Mn, obtained by putting the (s + n)/2 smallest integers in one subset. © Springer-Verlag 2004.
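The objects studied in the abstract can be made concrete with a small sketch, not taken from the paper itself: a brute-force search for the minimal discrepancy over all partitions with a fixed cardinality, together with the "sorted" partition that puts the k smallest integers into one subset (k corresponds to (s + n)/2 in the paper's notation). Function names are illustrative, and the brute force is only feasible for tiny n.

```python
import itertools

def min_discrepancy(nums, k):
    """Brute-force minimal discrepancy over all partitions in which one
    subset has exactly k of the n integers (bias |s| = |n - 2k|).
    If A is one subset, |sum(A) - sum(complement)| = |2*sum(A) - total|."""
    total = sum(nums)
    return min(abs(2 * sum(subset) - total)
               for subset in itertools.combinations(nums, k))

def sorted_partition_discrepancy(nums, k):
    """Discrepancy of the 'sorted' partition: the k smallest integers
    form one subset, the rest form the other."""
    xs = sorted(nums)
    return abs(2 * sum(xs[:k]) - total) if (total := sum(xs)) or True else 0

# Example instance: n = 6 integers, balanced partition (k = 3, bias 0).
nums = [1, 2, 3, 4, 5, 6]
print(min_discrepancy(nums, 3))              # -> 1 (e.g. {1,4,6} vs {2,3,5})
print(sorted_partition_discrepancy(nums, 3)) # -> 9 ({1,2,3} vs {4,5,6})
```

The gap between the two outputs illustrates the phase distinction: when M is small relative to 2^n, near-perfect partitions exist and the sorted partition is far from optimal; the sorted partition becomes optimal only in the "sorted phase" of large bias.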
Borgs, C., Chayes, J. T., Mertens, S., & Pittel, B. (2004). Constrained integer partitions. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2976, 59–68. https://doi.org/10.1007/978-3-540-24698-5_10