Training a feed-forward neural network using artificial bee colony with back-propagation algorithm


Abstract

Training a feed-forward neural network (FNN) is an optimization problem over a continuous space. The back-propagation algorithm (BP) is the conventional and most popular gradient-based local search technique for it. The major problem BP suffers from is poor generalization performance caused by getting stuck in local minima. The artificial bee colony (ABC) is a popular swarm-intelligence algorithm for global optimization and can be used to train the weights of a neural network, but it suffers from slow convergence. To address both weaknesses, a hybrid algorithm combining artificial bee colony with back-propagation (ABC-BP) is proposed to train the FNN. The results of the proposed algorithm are compared with those of a hybrid real-coded genetic algorithm with back-propagation (GA-BP) on five benchmark datasets taken from the UCI machine learning repository. The simulation results indicate that the ABC-BP hybrid algorithm gives promising results, with significantly improved convergence and classification rates. Hence, the proposed algorithm can be used efficiently for training the FNN.
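The hybrid scheme the abstract describes can be sketched as follows: ABC explores the weight space globally (employed, onlooker, and scout bee phases over a population of candidate weight vectors), and BP then refines the best solution found with gradient descent. Everything concrete below is an illustrative assumption, not the authors' configuration: the toy XOR dataset, the 2-2-1 sigmoid network, the colony size, cycle counts, and the simple hand-off where BP runs only on the best ABC food source.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset (XOR) and a 2-2-1 sigmoid network; weights live in one flat vector.
# All sizes here are illustrative assumptions, not the paper's actual setup.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)
D = 2 * 2 + 2 + 2 * 1 + 1  # W1, b1, W2, b2 -> 9 parameters

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(w):
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8:9]
    return W1, b1, W2, b2

def mse(w):
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return float(np.mean((out - y) ** 2))

def backprop_step(w, lr=0.5):
    # One gradient-descent update computed via back-propagation.
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out) * (2 / len(X))
    d_h = (d_out @ W2.T) * h * (1 - h)
    g = np.concatenate([(X.T @ d_h).ravel(), d_h.sum(0),
                        (h.T @ d_out).ravel(), d_out.sum(0)])
    return w - lr * g

def abc_bp(colony=20, cycles=200, limit=30, bp_steps=500):
    food = rng.uniform(-1, 1, (colony, D))  # food sources = weight vectors
    cost = np.array([mse(w) for w in food])
    trials = np.zeros(colony, int)
    for _ in range(cycles):
        fitness = 1.0 / (1.0 + cost)
        probs = fitness / fitness.sum()
        # Employed phase visits every source; onlookers pick sources
        # with probability proportional to fitness. Both use the same move.
        for phase in range(2):
            idx = range(colony) if phase == 0 else rng.choice(colony, colony, p=probs)
            for i in idx:
                k = rng.integers(colony)
                while k == i:
                    k = rng.integers(colony)
                j = rng.integers(D)
                cand = food[i].copy()
                cand[j] += rng.uniform(-1, 1) * (food[i, j] - food[k, j])
                c = mse(cand)
                if c < cost[i]:
                    food[i], cost[i], trials[i] = cand, c, 0
                else:
                    trials[i] += 1
        # Scout phase: abandon sources that failed to improve too long.
        for i in np.where(trials > limit)[0]:
            food[i] = rng.uniform(-1, 1, D)
            cost[i], trials[i] = mse(food[i]), 0
    # Hand the best ABC solution to BP for local refinement,
    # keeping a step only when it actually lowers the error.
    best = food[np.argmin(cost)]
    for _ in range(bp_steps):
        nxt = backprop_step(best)
        if mse(nxt) < mse(best):
            best = nxt
    return best

w = abc_bp()
print(f"final MSE: {mse(w):.4f}")
```

Because both the ABC selection rule and the guarded BP refinement are greedy, the training error never increases across the hand-off; the division of labor (ABC for escaping local minima, BP for fast local convergence) is the point of the hybrid the abstract argues for.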


CITATION STYLE

APA

Sarangi, P. P., Sahu, A., & Panda, M. (2014). Training a feed-forward neural network using artificial bee colony with back-propagation algorithm. In Advances in Intelligent Systems and Computing (Vol. 243, pp. 511–519). Springer Verlag. https://doi.org/10.1007/978-81-322-1665-0_49
