Preventing premature convergence in a simple EDA via global step size setting

Abstract

When a simple real-valued estimation of distribution algorithm (EDA) with a Gaussian model and maximum-likelihood estimation of parameters is used, it converges prematurely even on the slope of the fitness function. This paper studies the simplest way of preventing premature convergence: multiplying the variance estimate by a constant factor k in each generation. Recent work has shown that as the dimensionality of the search space increases, such an algorithm very quickly becomes unable to traverse the slope and focus on the optimum at the same time. It is shown that when isotropic distributions with Gaussian- or Cauchy-distributed norms are used, a simple constant setting of k is able to ensure reasonable behaviour of the EDA both on the slope and in the valley of the fitness function. © 2008 Springer-Verlag Berlin Heidelberg.
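
For illustration, the following is a minimal Python sketch of the kind of algorithm the abstract describes: a simple Gaussian EDA with maximum-likelihood parameter estimation whose variance estimate is multiplied by a constant factor k each generation. The diagonal (per-dimension) Gaussian model, the truncation-selection scheme, and all names and parameter values (simple_eda, pop_size, n_select, k, etc.) are illustrative assumptions, not the exact setup studied in the paper.

```python
import numpy as np

def simple_eda(fitness, dim, pop_size=100, n_select=30, k=1.1,
               n_generations=200, seed=0):
    """Sketch of a simple Gaussian EDA with ML estimation and constant variance scaling k."""
    rng = np.random.default_rng(seed)
    mean = np.zeros(dim)   # initial model: standard normal
    var = np.ones(dim)
    for _ in range(n_generations):
        # Sample the population from the current (diagonal) Gaussian model.
        pop = rng.normal(mean, np.sqrt(var), size=(pop_size, dim))
        # Truncation selection: keep the n_select best individuals.
        order = np.argsort([fitness(x) for x in pop])
        selected = pop[order[:n_select]]
        # Maximum-likelihood estimates of the model parameters.
        mean = selected.mean(axis=0)
        var = selected.var(axis=0)
        # Global step-size setting: multiply the variance estimate by a
        # constant factor k to counteract premature convergence.
        var *= k
    return mean

if __name__ == "__main__":
    # Example usage on the sphere function (minimisation).
    sphere = lambda x: float(np.dot(x, x))
    print(simple_eda(sphere, dim=10))
```

The isotropic variants mentioned in the abstract would instead sample a direction uniformly on the unit sphere and scale it by a Gaussian- or Cauchy-distributed norm; the sketch above shows only the constant-k variance scaling on a diagonal Gaussian model.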

Citation (APA)

Pošík, P. (2008). Preventing premature convergence in a simple EDA via global step size setting. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5199 LNCS, pp. 549–558). https://doi.org/10.1007/978-3-540-87700-4_55
