Improving the classification performance of optimal linear associative memory in the presence of outliers


Abstract

The optimal linear associative memory (OLAM) proposed by Kohonen and Ruohonen [16] is a classic neural network model widely used as a standalone pattern classifier or as a fundamental component of multilayer nonlinear classification approaches, such as the extreme learning machine (ELM) [10] and the echo-state network (ESN) [6]. In this paper, we develop an extension of OLAM that is robust to labeling errors (outliers) in the data set. The proposed model is robust to label noise not only near the class boundaries, but also far from them, where it can result from mistakes in labeling or from gross errors in measuring the input features. To deal with this problem, we propose computing the weight matrix operator with M-estimators, a parameter estimation framework widely used in robust regression, instead of the ordinary least squares solution. We show the usefulness of the proposed classification approach through simulation results on synthetic and real-world data. © 2013 Springer-Verlag Berlin Heidelberg.
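The abstract contrasts the ordinary least squares (OLS) weight matrix of OLAM with a robust M-estimator alternative. The sketch below illustrates that idea in NumPy: the OLS solution via the pseudo-inverse, and a robust variant using iteratively reweighted least squares (IRLS) with Huber weights. This is an illustrative reconstruction, not the authors' exact algorithm; the Huber function, the MAD scale estimate, and all function names are assumptions for the example.

```python
import numpy as np

def olam_ols(X, T):
    """OLAM weight matrix via ordinary least squares (Moore-Penrose pseudo-inverse).

    X: (n, d) input patterns; T: (n, c) one-hot target labels.
    """
    return np.linalg.pinv(X) @ T

def huber_weight(r, k=1.345):
    """Huber M-estimator weight: 1 for small residuals, k/|r| for large ones."""
    a = np.abs(r)
    return np.where(a <= k, 1.0, k / np.maximum(a, 1e-12))

def olam_m_estimator(X, T, k=1.345, n_iter=20):
    """Robust OLAM weights via IRLS (an illustrative sketch).

    Each sample receives a scalar weight from the Huber function of its
    standardized residual norm, so outliers (e.g. mislabeled samples)
    are down-weighted in the least squares fit.
    """
    W = olam_ols(X, T)                   # start from the OLS solution
    for _ in range(n_iter):
        R = T - X @ W                    # residual matrix, shape (n, c)
        r = np.linalg.norm(R, axis=1)    # per-sample residual magnitude
        # Robust scale estimate via the median absolute deviation (MAD)
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
        w = huber_weight(r / s, k)       # per-sample weights in (0, 1]
        sw = np.sqrt(w)[:, None]
        W = np.linalg.pinv(sw * X) @ (sw * T)  # weighted least squares step
    return W
```

Classification then proceeds as in standard OLAM: a test pattern x is assigned to the class with the largest entry of x @ W.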

Citation (APA)

De Paula Barros, A. L. B., & Barreto, G. A. (2013). Improving the classification performance of optimal linear associative memory in the presence of outliers. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7902 LNCS, pp. 622–632). https://doi.org/10.1007/978-3-642-38679-4_63
