Improved adaline networks for robust pattern classification

Abstract

The Adaline network [1] is a classic neural architecture whose learning rule is the famous least mean squares (LMS) algorithm (a.k.a. delta rule or Widrow-Hoff rule). It has been demonstrated that the LMS algorithm is optimal in the H∞ sense, since it tolerates small (in energy) disturbances such as measurement noise, parameter drifting and modelling errors [2,3]. Such optimality of the LMS algorithm, however, has been demonstrated for regression-like problems only, not for pattern classification. Bearing this in mind, we first show that the performance of the LMS algorithm and its variants (including the recent Kernel LMS algorithm) in pattern classification tasks deteriorates considerably in the presence of labelling errors, and then introduce robust extensions of the Adaline network that can deal efficiently with such errors. Comprehensive computer simulations show that the proposed extensions consistently outperform the original version. © 2014 Springer International Publishing Switzerland.
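For readers unfamiliar with the learning rule the abstract refers to, the following is a minimal sketch of the LMS (Widrow-Hoff / delta) rule for a single Adaline unit; it is not the paper's implementation, and the function name, learning rate and epoch count are illustrative assumptions:

```python
# Minimal sketch of the LMS (Widrow-Hoff / delta) rule for one Adaline unit.
# The function name and hyperparameters are illustrative, not from the paper.
def lms_train(samples, lr=0.05, epochs=50):
    """samples: list of (x, d) pairs, where x is a feature list
    and d is the desired target (e.g. +1 / -1 class labels)."""
    n = len(samples[0][0])
    w = [0.0] * n          # weight vector
    b = 0.0                # bias term
    for _ in range(epochs):
        for x, d in samples:
            # Adaline uses the *linear* output (pre-activation) for learning
            y = sum(wi * xi for wi, xi in zip(w, x)) + b
            e = d - y      # error on the linear output
            # Gradient step on the instantaneous squared error
            w = [wi + lr * e * xi for wi, xi in zip(w, x)]
            b += lr * e
    return w, b
```

For classification, the trained unit predicts the sign of the linear output; it is this squared-error criterion on mislabelled targets that the paper identifies as the source of the LMS rule's sensitivity to labelling errors.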

Citation (APA)

Mattos, C. L. C., Santos, J. D. A., & Barreto, G. A. (2014). Improved adaline networks for robust pattern classification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8681 LNCS, pp. 579–586). Springer Verlag. https://doi.org/10.1007/978-3-319-11179-7_73
