EDAs cannot be Balanced and Stable

Abstract

Estimation of Distribution Algorithms (EDAs) work by iteratively updating a distribution over the search space using samples from each iteration. Theoretical analyses of EDAs are so far scarce and present run time results only for specific EDAs. We propose a new framework for EDAs that captures the ideas of several known optimizers, including PBIL, UMDA, λ-MMASIB, cGA, and (1, λ)-EA. Our focus is on analyzing two core features of EDAs: a balanced EDA is sensitive to signals in the fitness; a stable EDA remains uncommitted under a biasless fitness function. We prove that no EDA can be both balanced and stable. The LEADINGONES function is a prime example where, at the beginning of the optimization, the fitness function shows no bias for many bits. Since many well-known EDAs are balanced and thus not stable, they are not well suited to optimize LEADINGONES. We give a stable EDA that optimizes LEADINGONES within a time of O(n log n).
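To make the abstract's notions concrete, the following is a minimal sketch (not the paper's own framework or its stable EDA) of the LEADINGONES fitness function together with a simple UMDA-style loop, one of the balanced EDAs the abstract mentions: a frequency vector is sampled, the fittest samples are selected, and the frequencies are updated toward them. The parameter choices (population size, selection size, frequency borders) are illustrative assumptions.

```python
import random

def leading_ones(x):
    """LEADINGONES: number of leading 1-bits in the bit string x."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def umda(n, lam=50, mu=25, iters=200, seed=0):
    """Minimal UMDA-style sketch (illustrative parameters, not from the paper):
    keep a frequency vector p, sample lam bit strings from it, select the mu
    fittest, and set p to their bit-wise mean, clamped away from 0 and 1."""
    rng = random.Random(seed)
    p = [0.5] * n          # start uncommitted: every bit is 1 with prob. 1/2
    best = None
    for _ in range(iters):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n)]
               for _ in range(lam)]
        pop.sort(key=leading_ones, reverse=True)
        sel = pop[:mu]
        p = [sum(x[i] for x in sel) / mu for i in range(n)]
        # Clamp frequencies to [1/n, 1 - 1/n] so no bit is ever fixed for good
        p = [min(max(q, 1.0 / n), 1.0 - 1.0 / n) for q in p]
        if best is None or leading_ones(pop[0]) > leading_ones(best):
            best = pop[0]
    return best
```

Note how the selection step reacts to any fitness signal (a balanced behavior), while for bits behind the first 0 the fitness carries no information, which is exactly the biasless situation under which a stable EDA should keep its frequencies uncommitted.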

Citation (APA)

Friedrich, T., Kötzing, T., & Krejca, M. S. (2016). EDAs cannot be Balanced and Stable. In GECCO 2016 - Proceedings of the 2016 Genetic and Evolutionary Computation Conference (pp. 1139–1146). Association for Computing Machinery, Inc. https://doi.org/10.1145/2908812.2908895
