Evolutionary network minimization: Adaptive implicit pruning of successful agents

Abstract

Neurocontroller minimization is beneficial for constructing small, parsimonious networks that permit a better understanding of their workings. This paper presents a novel Evolutionary Network Minimization (ENM) algorithm, which is applied to fully recurrent neurocontrollers. ENM is a simple, standard genetic algorithm with an additional step in which small weights are irreversibly eliminated. ENM has a unique combination of features that distinguishes it from previous evolutionary minimization algorithms: 1. No explicit penalty term is added to the fitness function. 2. Minimization begins only after functional neurocontrollers have been successfully evolved. 3. Successful minimization relies solely on a drift process that removes unimportant weights and, importantly, on continuing adaptive modification of the magnitudes of the remaining weights. Our results show that ENM extensively minimizes evolved recurrent neurocontrollers while keeping their fitness intact and preserving their principal functional characteristics.
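
The abstract only outlines ENM at a high level. As a rough illustration, the sketch below shows, in Python/NumPy, what a genetic-algorithm generation with an extra irreversible small-weight elimination step could look like. All names (prune_small_weights, enm_generation, threshold, etc.) and parameter values are hypothetical and not taken from the paper; in particular, the paper starts pruning only after functional controllers have already been evolved, whereas this toy example skips that first phase. Irreversibility is modeled with a boolean mask, so a weight that has been eliminated stays at zero in all later generations.

```python
import numpy as np

# Hypothetical sketch of a GA step with irreversible small-weight pruning,
# loosely following the ENM description in the abstract. Names and values
# are illustrative only, not taken from the paper.

rng = np.random.default_rng(0)


def mutate(weights, sigma=0.05):
    """Standard GA-style Gaussian mutation of a weight matrix."""
    return weights + rng.normal(0.0, sigma, size=weights.shape)


def prune_small_weights(weights, alive_mask, threshold=0.01):
    """Irreversibly zero out weights whose magnitude falls below `threshold`.

    `alive_mask` records which weights may still be nonzero; once an entry
    becomes False it never returns to True, so later mutations cannot
    resurrect a pruned weight.
    """
    alive_mask &= np.abs(weights) >= threshold
    return weights * alive_mask, alive_mask


def enm_generation(population, masks, fitness_fn, threshold=0.01):
    """One generation: evaluate, select the better half, mutate, then prune."""
    scores = np.array([fitness_fn(w * m) for w, m in zip(population, masks)])
    elite = np.argsort(scores)[::-1][: len(population) // 2]

    next_pop, next_masks = [], []
    for idx in elite:
        for _ in range(2):  # each surviving parent yields two offspring
            child = mutate(population[idx]) * masks[idx]
            child, child_mask = prune_small_weights(
                child, masks[idx].copy(), threshold
            )
            next_pop.append(child)
            next_masks.append(child_mask)
    return next_pop, next_masks


if __name__ == "__main__":
    # Toy example: "fitness" rewards matching a sparse target weight matrix.
    target = np.zeros((8, 8))
    target[0, 1] = 1.0
    target[3, 4] = -0.8

    def fitness_fn(w):
        return -np.sum((w - target) ** 2)

    population = [rng.normal(0.0, 0.5, size=(8, 8)) for _ in range(20)]
    masks = [np.ones((8, 8), dtype=bool) for _ in range(20)]

    for _ in range(200):
        population, masks = enm_generation(population, masks, fitness_fn)

    nonzero = int(np.count_nonzero(population[0]))
    print(f"surviving weights in one evolved individual: {nonzero} of {target.size}")
```

Because the remaining weights keep being mutated and selected after each pruning pass, they can adapt to compensate for the eliminated connections, which is the "continuing adaptive modification of the magnitudes of the remaining weights" that the abstract emphasizes.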

Citation (APA)

Ganon, Z., Keinan, A., & Ruppin, E. (2003). Evolutionary network minimization: Adaptive implicit pruning of successful agents. In Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 2801, pp. 319–327). Springer Verlag. https://doi.org/10.1007/978-3-540-39432-7_34
