Upper Bounds on the Running Time of the Univariate Marginal Distribution Algorithm on OneMax

Abstract

The Univariate Marginal Distribution Algorithm (UMDA) is a randomized search heuristic that builds a stochastic model of the underlying optimization problem by repeatedly sampling λ solutions and adjusting the model according to the best μ samples. We present a running time analysis of the UMDA on the classical OneMax benchmark function for wide ranges of the parameters μ and λ. If μ ≥ c log n for some constant c > 0 and λ = (1 + Θ(1))μ, we obtain a general bound O(μn) on the expected running time. This bound crucially assumes that all marginal probabilities of the algorithm are confined to the interval [1/n, 1 − 1/n]. If μ ≥ c′√n log n for a constant c′ > 0 and λ = (1 + Θ(1))μ, the behavior of the algorithm changes and the bound on the expected running time becomes O(μ√n), which typically holds even if the borders on the marginal probabilities are omitted. The results supplement the recently derived lower bound Ω(μ√n + n log n) by Krejca and Witt (Proceedings of FOGA 2017, ACM Press, New York, pp 65–79, 2017) and turn out to be tight for the two very different choices μ = c log n and μ = c′√n log n. They also improve the previously best known upper bound O(n log n log log n) by Dang and Lehre (Proceedings of GECCO ’15, ACM Press, New York, pp 513–518, 2015) that was established for μ = c log n and λ = (1 + Θ(1))μ.
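For readers unfamiliar with the algorithm, the following minimal Python sketch illustrates the sampling-and-selection loop described above (sample λ solutions from a product distribution, keep the best μ under OneMax, set each marginal to the observed one-frequency, and clamp it to [1/n, 1 − 1/n]). Function names, the generation budget, and the demo parameter values are illustrative assumptions, not taken from the paper.

```python
import random


def one_max(x):
    """OneMax fitness: number of ones in the bit string."""
    return sum(x)


def umda_onemax(n, mu, lam, max_generations=10_000):
    """Minimal UMDA sketch on OneMax with marginal borders [1/n, 1 - 1/n].

    Each generation samples lam solutions, selects the best mu,
    and re-estimates every marginal as the frequency of ones
    among the selected samples (then clamps it to the borders).
    """
    border_low, border_high = 1.0 / n, 1.0 - 1.0 / n
    p = [0.5] * n  # marginal probabilities, initialized uniformly

    for generation in range(max_generations):
        # Sample lam candidate bit strings from the product distribution p.
        population = [[1 if random.random() < p[i] else 0 for i in range(n)]
                      for _ in range(lam)]
        # Select the mu best samples according to OneMax.
        population.sort(key=one_max, reverse=True)
        selected = population[:mu]
        if one_max(selected[0]) == n:
            return (generation + 1) * lam  # fitness evaluations used so far
        # Update each marginal to the empirical frequency, then clamp.
        for i in range(n):
            freq = sum(x[i] for x in selected) / mu
            p[i] = min(max(freq, border_low), border_high)
    return None  # optimum not sampled within the budget


if __name__ == "__main__":
    random.seed(0)
    # Illustrative parameters only; the paper's regimes are μ ≥ c log n
    # and μ ≥ c′ √n log n, each with λ = (1 + Θ(1))μ.
    print(umda_onemax(n=50, mu=30, lam=60))
```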

Citation (APA)

Witt, C. (2019). Upper Bounds on the Running Time of the Univariate Marginal Distribution Algorithm on OneMax. Algorithmica, 81(2), 632–667. https://doi.org/10.1007/s00453-018-0463-0
