Improved back propagation algorithm to avoid local minima in multiplicative neuron model

Abstract

The back propagation algorithm calculates the weight changes of artificial neural networks; a common approach uses a training rule consisting of a learning rate and a momentum factor. The major drawbacks of this learning algorithm are its susceptibility to local minima and its slow convergence. The addition of an extra term, called the proportional factor, speeds up the convergence of the back propagation algorithm. We apply three-term back propagation to the learning of a multiplicative neural network. The algorithm is tested on the XOR and parity problems and compared with the standard back propagation training algorithm. © 2011 Springer-Verlag.
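
To make the idea concrete, below is a minimal sketch of three-term back propagation on a single multiplicative neuron of the form y = sigmoid(prod_i(w_i*x_i + b_i)), trained on XOR. It assumes the common formulation in which each weight change combines a gradient term (learning rate alpha), a momentum term (factor beta), and a proportional term (factor gamma times the output error). The hyper-parameter values, the exact form of the proportional term, and the training loop are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

# XOR truth table: inputs and targets
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([0., 1., 1., 0.])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
w = rng.uniform(-1.0, 1.0, 2)   # multiplicative weights w_i
b = rng.uniform(-1.0, 1.0, 2)   # per-input biases b_i
dw_prev = np.zeros(2)           # previous weight change, for the momentum term
db_prev = np.zeros(2)

# Illustrative hyper-parameters (not taken from the paper):
alpha, beta, gamma = 0.5, 0.3, 0.05  # learning rate, momentum factor, proportional factor

for epoch in range(20000):
    for x, t in zip(X, T):
        u = w * x + b            # per-input linear terms u_i = w_i*x_i + b_i
        net = np.prod(u)         # multiplicative aggregation
        y = sigmoid(net)
        e = t - y                # output error, reused by the proportional term

        # Gradient of E = 0.5*e^2: dE/dnet = -e * y * (1 - y);
        # d(net)/dw_i = x_i * prod_{j != i} u_j = x_i * net / u_i (assumes u_i != 0)
        d_net = -e * y * (1.0 - y)
        grad_w = d_net * x * net / u
        grad_b = d_net * net / u

        # Three-term update: gradient term + momentum term + proportional term
        dw = -alpha * grad_w + beta * dw_prev + gamma * e
        db = -alpha * grad_b + beta * db_prev + gamma * e
        w, b = w + dw, b + db
        dw_prev, db_prev = dw, db

# Inspect the learned mapping
for x, t in zip(X, T):
    print(x, t, float(sigmoid(np.prod(w * x + b))))
```

Setting beta = gamma = 0 recovers plain gradient-descent back propagation, which is the baseline the abstract compares against; the momentum and proportional terms are what the paper credits with avoiding local minima and reducing convergence time.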

Citation (APA)

Burse, K., Manoria, M., & Kirar, V. P. S. (2011). Improved back propagation algorithm to avoid local minima in multiplicative neuron model. In Communications in Computer and Information Science (Vol. 147 CCIS, pp. 67–73). https://doi.org/10.1007/978-3-642-20573-6_11
