Soft margin training for associative memories implemented by recurrent neural networks


Abstract

In this paper, the authors present a new synthesis approach for training associative memories based on recurrent neural networks (RNNs). They propose soft margin training for associative memories, which is effective when the training patterns are not all linearly separable. Building on the soft margin algorithm used to train support vector machines (SVMs), the new algorithm improves on the authors' earlier optimal training algorithm, whose results are not fully satisfactory because the training patterns are sometimes not all linearly separable. The new algorithm is applied to the synthesis of an associative memory designed as an RNN whose connection matrix has upper bounds on its diagonal elements, which reduces the total number of spurious memories. The scheme is evaluated on a full-scale simulator to diagnose the main faults occurring in fossil-fuel electric power plants. © 2007 Springer-Verlag Berlin Heidelberg.
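The abstract describes designing an associative memory by solving one soft-margin classification problem per neuron, with the diagonal of the connection matrix bounded to curb spurious memories. The sketch below is an illustration of that general idea, not the authors' actual algorithm: it trains each row of the connection matrix with a hinge-loss (SVM-style soft margin) subgradient method, then clips the diagonal entry. All function names and parameter values (`diag_bound`, `C`, `lr`, `epochs`) are hypothetical choices for the example.

```python
import numpy as np

def soft_margin_memory(patterns, diag_bound=0.0, C=10.0, lr=0.01, epochs=500):
    """Illustrative sketch: design a connection matrix W and bias b so each
    stored bipolar pattern is a fixed point of x -> sign(W x + b).
    One soft-margin (hinge-loss) problem is solved per neuron; the diagonal
    entry is clipped to `diag_bound` to limit spurious memories."""
    X = np.asarray(patterns, dtype=float)      # (m, n), entries in {-1, +1}
    m, n = X.shape
    W = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(n):                         # one classifier per neuron i
        w, bi = np.zeros(n), 0.0
        y = X[:, i]                            # target sign for neuron i
        for _ in range(epochs):
            margins = y * (X @ w + bi)
            viol = margins < 1.0               # soft-margin violations
            # subgradient of  (1/2)||w||^2 + (C/m) * sum hinge(margin)
            grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0) / m
            grad_b = -C * y[viol].sum() / m
            w -= lr * grad_w
            bi -= lr * grad_b
            w[i] = min(w[i], diag_bound)       # upper-bound the diagonal
        W[i], b[i] = w, bi
    return W, b

def recall(W, b, x, steps=10):
    """Synchronous recall: iterate the sign map from a probe pattern."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        x = np.sign(W @ x + b)
        x[x == 0] = 1.0                        # break ties toward +1
    return x
```

With a soft margin, the per-neuron problems still return usable weights even when some patterns are not linearly separable; the `C` parameter trades margin width against training errors, exactly as in SVM training.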

Citation (APA)

Ruz-Hernandez, J. A., Sanchez, E. N., & Suarez, D. A. (2007). Soft margin training for associative memories implemented by recurrent neural networks. Advances in Soft Computing, 41, 205–214. https://doi.org/10.1007/978-3-540-72432-2_21
