Optimizing Artificial Neural Network for Functions Approximation Using Particle Swarm Optimization

Abstract

Artificial neural networks (ANNs) are commonly used for function approximation as well as classification problems. This paper presents a configurable architecture for a simple feed-forward neural network trained by the particle swarm optimization (PSO) algorithm. Both PSO and the ANN have several hyperparameters that affect the quality of the approximation. The ANN hyperparameters are the number of layers, the number of neurons in each layer, and the neuron activation functions. The PSO hyperparameters are the population size, the number of informants per particle, and the acceleration coefficients. This work sheds light on how the PSO hyperparameters affect the algorithm's ability to optimize ANN weights in the function approximation task. This was examined through multiple experiments on different types of target functions, such as cubic and linear functions and the XOR problem. The results show that PSO outperforms backpropagation in terms of mean squared error (MSE).
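
The following is a minimal sketch, not the authors' code, of the general idea the abstract describes: a particle swarm whose positions encode the weights of a small feed-forward network, with MSE on a cubic target as the fitness. The network size, swarm size, inertia weight, and acceleration coefficients are illustrative assumptions, and a global-best topology is used in place of the per-particle informant neighborhoods mentioned in the abstract.

```python
# Sketch (assumed setup): PSO optimizing the weights of a 1-8-1 tanh network
# on a cubic target function. All hyperparameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Target function to approximate (cubic, one of the paper's test cases).
X = np.linspace(-1, 1, 64).reshape(-1, 1)
y = X ** 3

# Fixed architecture: 1 input -> 8 hidden (tanh) -> 1 linear output.
N_IN, N_HID, N_OUT = 1, 8, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total weight count


def unpack(w):
    """Split a flat particle position into the network's weight matrices."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2


def mse(w):
    """Fitness: mean squared error of the network encoded by w."""
    W1, b1, W2, b2 = unpack(w)
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return float(np.mean((pred - y) ** 2))


# PSO hyperparameters (illustrative values, not from the paper).
SWARM, ITERS = 30, 500
W_INERTIA, C1, C2 = 0.7, 1.5, 1.5   # inertia and acceleration coefficients

pos = rng.uniform(-1, 1, (SWARM, DIM))
vel = np.zeros((SWARM, DIM))
pbest = pos.copy()
pbest_fit = np.array([mse(p) for p in pos])
gbest = pbest[pbest_fit.argmin()].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((SWARM, DIM)), rng.random((SWARM, DIM))
    vel = W_INERTIA * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = pos + vel
    fit = np.array([mse(p) for p in pos])
    improved = fit < pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmin()].copy()

print(f"best MSE after {ITERS} iterations: {pbest_fit.min():.6f}")
```

Swapping the cubic target for the XOR truth table, or adding an informant-based (local-best) neighborhood, only changes the fitness function and the term that replaces gbest in the velocity update.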

Citation (APA)
Zaghloul, L., Zaghloul, R., & Hamdan, M. (2021). Optimizing Artificial Neural Network for Functions Approximation Using Particle Swarm Optimization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12689 LNCS, pp. 223–231). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-78743-1_20
