Preventing premature convergence to local optima in genetic algorithms via random offspring generation

Abstract

The Genetic Algorithms (GAs) paradigm is being used increasingly in search and optimization problems. The method has proven to be efficient and robust in a considerable number of scientific domains, where the complexity and cardinality of the problems considered are key factors to be taken into account. However, some shortcomings remain; indeed, one of the major problems usually associated with the use of GAs is premature convergence to solutions coding local optima of the objective function. This problem is tightly related to the loss of genetic diversity in the GA's population, which causes a decrease in the quality of the solutions found. Unsurprisingly, this fact has led to the development of different techniques aiming to solve, or at least minimize, the problem; traditional methods usually work to maintain a certain degree of genetic diversity in the target populations without affecting the convergence process of the GA. In this work, some of these techniques are compared and an innovative one, Random Offspring Generation, is presented and evaluated on its merits. The Traveling Salesman Problem is used as a benchmark.
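The abstract does not spell out the mechanics of Random Offspring Generation, so the following is only a minimal sketch of one common interpretation: when the two selected parents are genetically identical, crossover would merely clone them, so a randomly generated offspring is produced instead to reintroduce diversity. The helper names (`crossover_with_rog`, `order_crossover`, `random_tour`) and the permutation encoding for TSP tours are illustrative assumptions, not taken from the paper.

```python
import random

def crossover_with_rog(parent_a, parent_b, crossover, random_individual):
    """Sketch of Random Offspring Generation (assumed interpretation):
    if both parents are identical, return a random individual instead
    of a crossover product, to counter loss of genetic diversity."""
    if parent_a == parent_b:
        return random_individual()
    return crossover(parent_a, parent_b)

# Hypothetical TSP helpers: a tour is encoded as a permutation of city indices.
def random_tour(n_cities):
    tour = list(range(n_cities))
    random.shuffle(tour)
    return tour

def order_crossover(p1, p2):
    """Standard order crossover (OX) for permutations, one offspring."""
    n = len(p1)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = p1[i:j]                      # copy a slice from the first parent
    remaining = [c for c in p2 if c not in child]
    k = 0
    for idx in range(n):                      # fill the gaps in parent-2 order
        if child[idx] is None:
            child[idx] = remaining[k]
            k += 1
    return child

# Example usage on two tours of 10 cities:
# t1, t2 = random_tour(10), random_tour(10)
# offspring = crossover_with_rog(t1, t2, order_crossover, lambda: random_tour(10))
```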

Citation (APA)

Rocha, M., & Neves, J. (1999). Preventing premature convergence to local optima in genetic algorithms via random offspring generation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1611, pp. 127–136). Springer Verlag. https://doi.org/10.1007/978-3-540-48765-4_16
