Using noise to speed up Markov chain Monte Carlo estimation



Carefully injected noise can speed the average convergence of Markov chain Monte Carlo (MCMC) simulation estimates. This includes the MCMC special cases of the Metropolis-Hastings algorithm, Gibbs sampling, and simulated annealing. MCMC equates the solution of a computational problem with the equilibrium probability density of a reversible Markov chain. Because successive Markov samples are statistically correlated, the algorithm must cycle through a long burn-in phase before it reaches equilibrium. The injected noise shortens this burn-in period. Simulations showed that optimal noise gave a 42% speed-up in finding the minimum potential energy of diatomic argon modeled with a Lennard-Jones 12-6 potential. We prove that the noisy MCMC algorithm brings each Markov step closer on average to equilibrium if an inequality holds between two expectations. Gaussian or Cauchy jump probabilities reduce the inequality to a simple quadratic condition.
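The idea can be illustrated with a minimal sketch: a Metropolis sampler with Gaussian jump proposals that searches for the minimum of a Lennard-Jones 12-6 potential, with a small extra noise term injected into each accepted state. This is an assumption-laden toy, not the paper's exact noisy-MCMC algorithm or its argon simulation; the parameter values (`beta`, `jump`, `noise`) are illustrative choices.

```python
import math
import random

def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones 12-6 potential energy at separation r (reduced units)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def noisy_metropolis(steps=20000, beta=5.0, jump=0.1, noise=0.02, seed=0):
    """Metropolis sampler for exp(-beta * U(r)) with Gaussian jump proposals.
    After each acceptance a small Gaussian perturbation is injected into the
    state -- a hypothetical stand-in for the paper's noise-injection scheme.
    Returns the lowest-energy separation found and its energy."""
    rng = random.Random(seed)
    r = 1.5                      # initial separation (reduced units)
    u = lennard_jones(r)
    best_r, best_u = r, u
    for _ in range(steps):
        cand = r + rng.gauss(0.0, jump)       # Gaussian jump proposal
        if cand <= 0.5:                       # reject unphysical separations
            continue
        uc = lennard_jones(cand)
        # Standard Metropolis acceptance test
        if uc < u or rng.random() < math.exp(-beta * (uc - u)):
            # Accept, then inject a small noise perturbation
            r = max(cand + rng.gauss(0.0, noise), 0.5)
            u = lennard_jones(r)
        if u < best_u:
            best_r, best_u = r, u
    return best_r, best_u
```

For the 12-6 potential the true minimum sits at r = 2^(1/6) sigma (about 1.122 in reduced units) with energy -epsilon, so the sampler should settle near that separation.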




Franzke, B., & Kosko, B. (2015). Using noise to speed up Markov chain Monte Carlo estimation. Procedia Computer Science, 53, 113–120. Elsevier B.V.
