Simultaneous evolution of neural network topologies and weights for classification and regression

16 citations · 15 Mendeley readers
Abstract

Artificial Neural Networks (ANNs) are important Data Mining (DM) techniques. Yet, the search for the optimal ANN is a challenging task: the architecture should learn the input-output mapping without overfitting the data, and training algorithms tend to become trapped in local minima. Under this scenario, the use of Evolutionary Computation (EC) is a promising alternative for ANN design and training. Moreover, since EC methods keep a pool of solutions, an ensemble can be built by combining the best ANNs. This work presents a novel algorithm for the optimization of ANNs that uses a direct representation, a structural mutation operator, and Lamarckian evolution. Sixteen real-world classification/regression tasks were used to test this strategy in both single-network and ensemble-based versions. Competitive results were achieved when compared with a heuristic model selection and with other DM algorithms. © Springer-Verlag Berlin Heidelberg 2005.
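The core idea in the abstract — evolving both the topology and the weights of an ANN, with a structural mutation operator and Lamarckian local search writing the improved weights back into the genome — can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the toy regression task, population size, mutation rates, and single-hidden-layer network are all assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (assumed for illustration): approximate y = sin(x)
X = np.linspace(-3, 3, 40).reshape(-1, 1)
y = np.sin(X)

def init_net(hidden=2):
    """Direct representation: the genome is the weight matrices themselves."""
    return {"W1": rng.normal(0, 0.5, (1, hidden)), "b1": np.zeros(hidden),
            "W2": rng.normal(0, 0.5, (hidden, 1)), "b2": np.zeros(1)}

def forward(net, X):
    h = np.tanh(X @ net["W1"] + net["b1"])
    return h @ net["W2"] + net["b2"], h

def mse(net):
    out, _ = forward(net, X)
    return float(np.mean((out - y) ** 2))

def lamarckian_step(net, lr=0.05, steps=10):
    # A few gradient-descent steps; the improved weights are written back
    # into the genome (Lamarckian, as opposed to Baldwinian, evolution).
    for _ in range(steps):
        out, h = forward(net, X)
        d_out = 2 * (out - y) / len(X)
        net["W2"] -= lr * h.T @ d_out
        net["b2"] -= lr * d_out.sum(0)
        d_h = (d_out @ net["W2"].T) * (1 - h ** 2)
        net["W1"] -= lr * X.T @ d_h
        net["b1"] -= lr * d_h.sum(0)
    return net

def mutate(net):
    child = {k: v.copy() for k, v in net.items()}
    for k in child:                      # weight mutation
        child[k] += rng.normal(0, 0.1, child[k].shape)
    if rng.random() < 0.2:               # structural mutation: add a hidden node
        child["W1"] = np.hstack([child["W1"], rng.normal(0, 0.5, (1, 1))])
        child["b1"] = np.append(child["b1"], 0.0)
        child["W2"] = np.vstack([child["W2"], rng.normal(0, 0.5, (1, 1))])
    return child

# Simple (mu + lambda)-style evolutionary loop over the pool of solutions
pop = [lamarckian_step(init_net()) for _ in range(8)]
for gen in range(30):
    pop += [lamarckian_step(mutate(p)) for p in pop]
    pop = sorted(pop, key=mse)[:8]      # keep the fittest networks

print(mse(pop[0]))
```

Because the evolutionary loop keeps a ranked pool of trained networks, the best few members of `pop` can be averaged to form the ensemble version mentioned in the abstract.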

Citation (APA)

Rocha, M., Cortez, P., & Neves, J. (2005). Simultaneous evolution of neural network topologies and weights for classification and regression. In Lecture Notes in Computer Science (Vol. 3512, pp. 59–66). Springer Verlag. https://doi.org/10.1007/11494669_8
