Entropy and cloning methods for combinatorial optimization, sampling and counting using the Gibbs sampler

Citations: 2
Readers (Mendeley): 5

Abstract

We survey the latest developments in the indicator-based minimum cross-entropy method and in MCMC methods for combinatorial optimization problems (COPs), counting, sampling, and rare-event probability estimation, and we present some new material. The main idea of the indicator-based minimum cross-entropy method, called the indicator MinxEnt or simply IME, is to associate with each counting or optimization problem an auxiliary single-constrained convex MinxEnt program of a special type, which has a closed-form solution. The main idea of the MCMC approach is to design a sequential sampling plan in which the difficult problem of estimating a rare-event probability or counting the cardinality of a set is decomposed into easier problems of counting the cardinalities of a sequence of related sets. Here we also propose a new algorithm, called the cloning algorithm. The main difference between the existing algorithms and the proposed one is that the latter has a special device, called the cloning device, which makes the algorithm very fast and accurate. We present efficient numerical results for quite general integer and combinatorial optimization problems, as well as counting problems such as SAT and Hamiltonian cycles. © 2009 Springer US.
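The chapter works with general rare-event and counting formulations; purely as an illustration of the nested-set decomposition described above, the sketch below estimates the number of satisfying assignments of a toy SAT instance by writing the target count as |X| · ∏_t |X_t|/|X_{t-1}| over increasing level sets of the performance function S(x) (the number of satisfied clauses), estimating each conditional fraction from a population, and cloning the elite samples before moving them with Gibbs sweeps. The instance CLAUSES, the population size, the rarity parameter rho, and all helper names are assumptions made for this sketch, not the chapter's actual algorithm or code.

```python
import random

# Hypothetical toy 3-SAT instance used only for illustration: each clause is a
# tuple of signed 1-based literals, e.g. (1, -2, 3) means (x1 OR NOT x2 OR x3).
CLAUSES = [(1, -2, 3), (-1, 2, -3), (2, 3, -4), (-2, -3, 4), (1, 3, 4)]
N_VARS = 4


def score(x):
    """Performance function S(x): the number of satisfied clauses."""
    return sum(any((x[abs(lit) - 1] == 1) == (lit > 0) for lit in clause)
               for clause in CLAUSES)


def gibbs_sweep(x, level):
    """One systematic Gibbs sweep targeting the uniform distribution on the
    level set {x : S(x) >= level}: each bit is resampled uniformly over the
    values that keep the sample inside the level set."""
    for i in range(N_VARS):
        feasible = []
        for v in (0, 1):
            x[i] = v
            if score(x) >= level:
                feasible.append(v)
        x[i] = random.choice(feasible)   # never empty: the old value qualifies


def cloning_count(pop_size=200, rho=0.1, sweeps=3, seed=1):
    """Sketch of a splitting/cloning-style estimator of the number of
    satisfying assignments, built as the product of conditional fractions
    |X_t| / |X_{t-1}| over increasing levels of S."""
    random.seed(seed)
    m = len(CLAUSES)
    pop = [[random.randint(0, 1) for _ in range(N_VARS)] for _ in range(pop_size)]
    level, estimate = 0, float(2 ** N_VARS)      # |X_0| = all assignments
    for _ in range(10 * m):                      # safety cap on level updates
        if level >= m:
            break
        scores = sorted(score(x) for x in pop)
        # Next level: the (1 - rho) sample quantile, forced to make progress
        # and never set above the best score seen in the population.
        level = min(m, max(scores[int((1 - rho) * pop_size)], level + 1), scores[-1])
        elites = [x for x in pop if score(x) >= level]
        estimate *= len(elites) / pop_size       # conditional fraction estimate
        # Cloning device: replicate the elites back to full population size,
        # then run Gibbs sweeps at the new level to diversify the clones.
        pop = [list(random.choice(elites)) for _ in range(pop_size)]
        for x in pop:
            for _ in range(sweeps):
                gibbs_sweep(x, level)
    return estimate


if __name__ == "__main__":
    exact = sum(score([(n >> i) & 1 for i in range(N_VARS)]) == len(CLAUSES)
                for n in range(2 ** N_VARS))
    print("cloning estimate:", cloning_count(), " exact count:", exact)
```

In this sketch the cloning step (replicating the elite samples to full population size before the Gibbs sweeps) plays the role of what the abstract calls the cloning device, and the rarity parameter rho trades off the number of levels against the variance of each conditional estimate.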

Citation (APA)

Rubinstein, R. (2009). Entropy and cloning methods for combinatorial optimization, sampling and counting using the Gibbs sampler. In Information Theory and Statistical Learning (pp. 385–434). Springer US. https://doi.org/10.1007/978-0-387-84816-7_16
