SEPA: Structure evolution and parameter adaptation in feed-forward neural networks


Abstract

In developing algorithms that dynamically change the structure and weights of artificial neural networks (ANNs), a proper balance must be maintained between network complexity and generalization capability. SEPA addresses these issues using an encoding scheme in which network weights and connections are encoded as matrices of real numbers. Network parameters are locally encoded and locally adapted, with fitness evaluation consisting mainly of fast feed-forward operations. Experimental results on several well-known classification problems demonstrate SEPA's highly consistent classification performance, fast convergence, and good structural optimality.
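The abstract does not give implementation details, but the kind of encoding it describes can be illustrated with a small sketch. The Python snippet below is illustrative only: the class and field names (EncodedNetwork, c1, c2, and so on) are assumptions, not taken from the paper. It keeps weights and connections as real-valued matrices of matching shape and evaluates fitness with a plain feed-forward pass.

import numpy as np

class EncodedNetwork:
    """Single-hidden-layer network; weights plus a 0/1 connection mask per layer."""

    def __init__(self, n_in, n_hidden, n_out, rng):
        # Real-valued weight matrices (the evolvable parameters).
        self.w1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.w2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
        # Connection matrices: 1 = connection present, 0 = pruned.
        self.c1 = np.ones_like(self.w1)
        self.c2 = np.ones_like(self.w2)

    def forward(self, x):
        # Effective weights are weight * connection, so structure and
        # parameters share the same matrix-shaped encoding.
        h = np.tanh(x @ (self.w1 * self.c1))
        return h @ (self.w2 * self.c2)

    def fitness(self, x, y):
        # Fitness evaluation is just a fast feed-forward pass plus accuracy.
        predictions = self.forward(x).argmax(axis=1)
        return (predictions == y).mean()

rng = np.random.default_rng(0)
net = EncodedNetwork(n_in=4, n_hidden=6, n_out=3, rng=rng)
x = rng.normal(size=(10, 4))          # toy inputs
y = rng.integers(0, 3, size=10)       # toy class labels
print(net.fitness(x, y))

Under this view, structural evolution amounts to toggling entries of the connection matrices while parameter adaptation perturbs the weight matrices; how SEPA actually performs those updates is described in the full paper.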

Citation (APA)

Palmes, P. P., Hayasaka, T., & Usui, S. (2003). SEPA: Structure evolution and parameter adaptation in feed-forward neural networks. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2724, 1600–1601. https://doi.org/10.1007/3-540-45110-2_44
