A novel learning algorithm for feedforward neural networks

Abstract

A novel learning algorithm for feedforward neural networks, called BPWA, is presented; it adjusts the weights during both the forward phase and the backward phase. In the forward pass it computes the minimum norm square solution for the weights between the hidden layer and the output layer, while the backward pass adjusts the weights connecting the input layer to the hidden layer by error gradient descent. The algorithm is compared with the Extreme Learning Machine, the BP algorithm, and the LMBP algorithm on function approximation and classification tasks. The experimental results demonstrate that the proposed algorithm performs well. © Springer-Verlag Berlin Heidelberg 2006.
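The sketch below illustrates the two-phase idea described in the abstract, under stated assumptions: a single sigmoid hidden layer, squared-error loss, and a pseudoinverse-based least-squares fit as the "minimum norm square solution". The function name `bpwa_epoch`, the learning rate, and the activation choice are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def bpwa_epoch(X, T, W_in, b_in, lr=0.01):
    """One hypothetical BPWA-style epoch (sketch): the forward pass solves the
    hidden-to-output weights as a minimum-norm least-squares fit; the backward
    pass updates the input-to-hidden weights by error gradient descent."""
    # Forward pass: hidden-layer activations (sigmoid assumed for illustration)
    Z = X @ W_in + b_in
    H = 1.0 / (1.0 + np.exp(-Z))

    # Minimum norm square solution: W_out = pinv(H) @ T is the least-squares
    # solution of H @ W_out = T with the smallest norm.
    W_out = np.linalg.pinv(H) @ T

    # Backward pass: gradient of the squared error with respect to the
    # input-to-hidden weights, propagated through the sigmoid.
    Y = H @ W_out
    E = Y - T                              # output error
    dH = (E @ W_out.T) * H * (1.0 - H)     # backprop through sigmoid
    W_in = W_in - lr * (X.T @ dH)
    b_in = b_in - lr * dH.sum(axis=0)
    return W_in, b_in, W_out
```

As a usage sketch, one would initialize `W_in` and `b_in` randomly and call `bpwa_epoch` repeatedly on the training inputs `X` and targets `T`, keeping the final `W_out` from the last forward pass for prediction.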

Citation (APA)

Chen, H., & Jin, F. (2006). A novel learning algorithm for feedforward neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3971 LNCS, pp. 509–514). Springer Verlag. https://doi.org/10.1007/11759966_76
