LS-Backpropagation Algorithm for Training Multilayer Perceptrons

  • Di Claudio E
  • Parisi R
  • Orlandi G

Abstract

The results in this report have also been published at ESANN '93 [Schiffmann et al., 1993]. The dataset used in this comparison is available by anonymous FTP (server: ics.uci.edu, files: pub/machine-learning-databases/thyroid-disease/ann).

Backpropagation is one of the most popular training algorithms for multilayer perceptrons. Unfortunately, it can be very slow for practical applications. Over the last years, many improvement strategies have been developed to speed up backpropagation. It is difficult to compare these techniques because most of them have been tested only on very specialized data sets; the reported results are based on small, artificial training sets such as XOR, encoder, or decoder problems, and it is doubtful whether these results carry over to much more complicated practical applications. This report gives an overview of many different speedup techniques. All of them are tested on a hard practical classification task consisting of a large medical data set. As will be seen, many of these optimized algorithms fail to learn the data set.
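For context on the baseline being compared, the following is a minimal sketch of plain gradient-descent backpropagation for a one-hidden-layer perceptron. It is not the LS-backpropagation method of the paper; the layer sizes, learning rate, and random data are illustrative assumptions only.

  import numpy as np

  # Minimal one-hidden-layer MLP trained with plain backpropagation.
  # Sizes, learning rate, and data are placeholders, not taken from the paper.
  rng = np.random.default_rng(0)
  n_in, n_hidden, n_out = 21, 10, 3        # e.g. 21 inputs, 3 classes
  W1 = rng.normal(0, 0.1, (n_in, n_hidden))
  b1 = np.zeros(n_hidden)
  W2 = rng.normal(0, 0.1, (n_hidden, n_out))
  b2 = np.zeros(n_out)

  def sigmoid(x):
      return 1.0 / (1.0 + np.exp(-x))

  X = rng.normal(size=(100, n_in))                 # placeholder inputs
  T = np.eye(n_out)[rng.integers(0, n_out, 100)]   # placeholder one-hot targets
  lr = 0.1

  for epoch in range(100):
      # Forward pass
      h = sigmoid(X @ W1 + b1)
      y = sigmoid(h @ W2 + b2)
      # Backward pass: squared-error loss, sigmoid derivatives
      delta_out = (y - T) * y * (1 - y)
      delta_hid = (delta_out @ W2.T) * h * (1 - h)
      # Gradient-descent weight updates
      W2 -= lr * h.T @ delta_out
      b2 -= lr * delta_out.sum(axis=0)
      W1 -= lr * X.T @ delta_hid
      b1 -= lr * delta_hid.sum(axis=0)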

Citation (APA)

Di Claudio, E. D., Parisi, R., & Orlandi, G. (1993). LS-Backpropagation Algorithm for Training Multilayer Perceptrons. In ICANN ’93 (pp. 768–771). Springer London. https://doi.org/10.1007/978-1-4471-2063-6_213
