In developing algorithms that dynamically change the structure and weights of artificial neural networks (ANNs), a proper balance must be struck between network complexity and generalization capability. SEPA addresses this trade-off with an encoding scheme in which network weights and connections are encoded as matrices of real numbers. Network parameters are locally encoded and locally adapted, with fitness evaluation consisting mainly of fast feed-forward operations. Experimental results on several well-known classification problems demonstrate SEPA's consistent classification performance, fast convergence, and good structural optimality. © Springer-Verlag Berlin Heidelberg 2003.
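The abstract's encoding scheme can be illustrated with a minimal sketch: a genome holds a real-valued weight matrix paired with a connection mask per layer, and fitness is computed by a fast feed-forward pass. The class and function names (`Genome`, `fitness`), the sigmoid activation, and the mask density are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class Genome:
    """Sketch of a SEPA-style encoding: per layer, a real-valued
    weight matrix (parameters) plus a binary connection matrix
    (structure). Both are evolvable; names here are illustrative."""
    def __init__(self, n_in, n_hidden, n_out, rng):
        # real-valued weights, locally encoded per connection
        self.W1 = rng.normal(0.0, 1.0, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 1.0, (n_hidden, n_out))
        # binary masks selecting which connections exist
        # (80% initial density is an arbitrary assumption)
        self.C1 = (rng.random((n_in, n_hidden)) < 0.8).astype(float)
        self.C2 = (rng.random((n_hidden, n_out)) < 0.8).astype(float)

    def forward(self, X):
        # fitness evaluation reduces to fast feed-forward passes:
        # masked weights W * C zero out pruned connections
        h = sigmoid(X @ (self.W1 * self.C1))
        return sigmoid(h @ (self.W2 * self.C2))

def fitness(genome, X, y):
    # classification accuracy on the evaluation set
    preds = genome.forward(X).argmax(axis=1)
    return float((preds == y).mean())
```

Because evaluating a candidate only requires matrix products through the masked weights, structural mutations (flipping mask entries) and parameter mutations (perturbing weights) can be scored cheaply, which is consistent with the abstract's claim of fast convergence.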
CITATION STYLE
Palmes, P. P., Hayasaka, T., & Usui, S. (2003). SEPA: Structure evolution and parameter adaptation in feed-forward neural networks. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2724, 1600–1601. https://doi.org/10.1007/3-540-45110-2_44