Neural networks training with optimal bounded ellipsoid algorithm

Abstract

Compared with standard learning algorithms such as backpropagation, the optimal bounded ellipsoid (OBE) algorithm has several favorable properties, for example faster convergence, because its structure is similar to that of the Kalman filter. OBE also has an advantage over Kalman-filter training: the noise is not required to be Gaussian. In this paper the OBE algorithm is applied to train the weights of recurrent neural networks for nonlinear system identification. Both the hidden layer and the output layer can be updated. From a dynamic-systems point of view, such training is useful for all neural network applications that require real-time updating of the weights. A simple simulation demonstrates the effectiveness of the suggested algorithm. © Springer-Verlag Berlin Heidelberg 2007.
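Since the abstract describes the OBE update as structurally similar to a Kalman-filter (recursive least-squares) recursion for bounded, non-Gaussian noise, a minimal sketch of such an OBE-style recursion is given below. This is a generic weight update for a linear-in-parameters layer, not the exact recursion from the paper; the function name obe_update, the weighting factor lam, and the toy data are illustrative assumptions only.

import numpy as np

def obe_update(w, P, phi, y, lam=0.1):
    # One OBE-style recursive update for a model y ~ phi^T w.
    # w   : (n,) current weight estimate (ellipsoid center)
    # P   : (n, n) ellipsoid shape matrix (plays the role of the Kalman covariance)
    # phi : (n,) regressor vector, y : scalar measurement
    # lam : weighting factor in (0, 1); the paper's algorithm additionally uses an
    #       assumed noise bound to decide whether a sample is informative (omitted here).
    e = y - phi @ w                                   # prediction error
    g = float(phi @ P @ phi)                          # phi^T P phi
    denom = 1.0 - lam + lam * g
    K = lam * (P @ phi) / denom                       # gain vector
    w_new = w + K * e                                 # move the ellipsoid center
    P_new = (P - lam * np.outer(P @ phi, P @ phi) / denom) / (1.0 - lam)
    return w_new, P_new

# Toy usage: identify the weights of a single linear output layer
# driven by persistently exciting regressors and bounded (uniform) noise.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -1.2, 0.8])
w, P = np.zeros(3), 10.0 * np.eye(3)
for _ in range(200):
    phi = rng.uniform(-1.0, 1.0, size=3)
    y = phi @ w_true + rng.uniform(-0.05, 0.05)       # bounded, non-Gaussian noise
    w, P = obe_update(w, P, phi, y, lam=0.1)
print(w)  # should be close to w_true

The point of the sketch is the Kalman-like structure mentioned in the abstract: a gain built from the shape matrix P replaces a fixed learning rate, which is what gives the faster convergence relative to plain backpropagation.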

Citation (APA)

De Jesus Rubio, J., & Yu, W. (2007). Neural networks training with optimal bounded ellipsoid algorithm. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4491 LNCS, pp. 1173–1182). Springer Verlag. https://doi.org/10.1007/978-3-540-72383-7_137
