Differential evolution for neural networks optimization


Abstract

In this paper, a neural network optimizer based on self-adaptive Differential Evolution (DE) is presented. This optimizer applies mutation and crossover operators in a new way, taking the structure of the network into account according to a per-layer strategy. Moreover, a new crossover operator called interm is proposed, and a new self-adaptive version of DE called MAB-ShaDE is suggested to reduce the number of parameters. The framework, called DENN, has been tested on some well-known classification problems, and a comparative study of the various combinations of self-adaptive methods, mutation operators, and crossover operators available in the literature is performed. Experimental results show that DENN achieves good accuracy, better than or at least comparable with that obtained by backpropagation.
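To make the underlying idea concrete, the sketch below shows classic DE/rand/1/bin applied to a flattened vector of network weights. This is a generic illustration, not the paper's per-layer scheme, interm crossover, or MAB-ShaDE self-adaptation; the function names and parameters are assumptions for the example.

```python
import numpy as np

def de_train(loss, shapes, pop_size=20, gens=100, F=0.5, CR=0.9, seed=0):
    """Minimal DE/rand/1/bin over flattened network weights.

    `loss` maps a flat weight vector to a scalar (e.g. training loss);
    `shapes` lists the layer shapes, used here only to size the vector.
    Illustrative sketch -- not the authors' per-layer DENN operators.
    """
    rng = np.random.default_rng(seed)
    dim = sum(int(np.prod(s)) for s in shapes)
    pop = rng.normal(0.0, 0.5, size=(pop_size, dim))   # random initial population
    fit = np.array([loss(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # mutation: combine three distinct individuals other than i
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])
            # binomial crossover: mix mutant and target genes
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True  # guarantee at least one mutant gene
            trial = np.where(mask, mutant, pop[i])
            # greedy selection: keep the trial only if it is no worse
            f_trial = loss(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

In a real setting `loss` would evaluate the network (weights reshaped per `shapes`) on the training set; the paper's contribution is precisely to replace this flat, layer-agnostic treatment with layer-aware mutation and crossover.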

Citation (APA)

Baioletti, M., Di Bari, G., Milani, A., & Poggioni, V. (2020). Differential evolution for neural networks optimization. Mathematics, 8(1). https://doi.org/10.3390/math8010069
