Elastic learning rate on error backpropagation of online update

Abstract

The error-backpropagation (EBP) algorithm for training multilayer perceptrons (MLPs) is valued for its robustness and computational economy. However, it requires a constant learning rate to be chosen in advance, and a poorly chosen rate yields slow learning and inflexible behavior on working data. This paper addresses that non-optimality by introducing into the original EBP algorithm an elastic learning rate that guarantees convergence of learning, realized locally through online updates of the MLP parameters. Experiments on a speaker verification system with a Korean speech database demonstrate that the proposed method improves on the original EBP algorithm in learning speed and in flexibility for working data. © Springer-Verlag Berlin Heidelberg 2004.
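
The abstract does not give the elastic-rate formula, so the following Python sketch only illustrates the general idea: online (per-sample) EBP weight updates whose learning rate adapts to the observed error trend. The network sizes, the grow/shrink factors, and the bold-driver-style adaptation rule are all assumptions for illustration, not the authors' method.

# Hypothetical sketch of online EBP with an adaptive ("elastic") learning
# rate. The paper's exact rule is not given in the abstract; a
# bold-driver-style heuristic stands in: grow the rate while the error
# falls, shrink it when the error rises.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyMLP:
    """One-hidden-layer perceptron trained by online error backpropagation."""
    def __init__(self, n_in, n_hid, n_out):
        self.W1 = rng.normal(0, 0.5, (n_hid, n_in))
        self.W2 = rng.normal(0, 0.5, (n_out, n_hid))

    def forward(self, x):
        self.h = sigmoid(self.W1 @ x)
        self.y = sigmoid(self.W2 @ self.h)
        return self.y

    def backward(self, x, t, lr):
        # Standard EBP deltas for sigmoid units and squared-error loss.
        d_out = (self.y - t) * self.y * (1 - self.y)
        d_hid = (self.W2.T @ d_out) * self.h * (1 - self.h)
        self.W2 -= lr * np.outer(d_out, self.h)
        self.W1 -= lr * np.outer(d_hid, x)

def train_online(net, data, lr=0.5, grow=1.05, shrink=0.5, epochs=200):
    """Online update: weights change after every sample; the rate is
    adapted from the epoch-level error trend (assumed rule)."""
    prev_err = np.inf
    for _ in range(epochs):
        err = 0.0
        for x, t in data:
            y = net.forward(x)
            err += 0.5 * np.sum((y - t) ** 2)
            net.backward(x, t, lr)
        # Elastic adjustment (heuristic, not the paper's formula):
        lr = lr * grow if err < prev_err else lr * shrink
        prev_err = err
    return lr, prev_err

# Usage on XOR, a standard sanity check for MLP learning.
xor = [(np.array([a, b], float), np.array([a ^ b], float))
       for a in (0, 1) for b in (0, 1)]
net = TinyMLP(2, 4, 1)
final_lr, final_err = train_online(net, xor)
print(f"final lr={final_lr:.4f}, epoch error={final_err:.4f}")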

Citation (APA)

Lee, T. S., & Choi, H. J. (2004). Elastic learning rate on error backpropagation of online update. In Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 3157, pp. 272–281). Springer Verlag. https://doi.org/10.1007/978-3-540-28633-2_30
