The kernel Hopfield memory network

Abstract

Kernel theory, drawn from work on learning machines, is applied to the Hopfield neural network. This provides new insight into the workings of the neural network as an associative memory. The kernel "trick" defines an embedding of memory patterns into (higher or infinite dimensional) memory feature vectors, and the training of the network is carried out in this feature space. Generalizing the network through kernel theory improves its performance in three respects. First, an adequate kernel selection satisfies the condition that any set of memory patterns be attractors of the network dynamics. Second, the basins of attraction of the memory patterns are enlarged, which improves recall capacity. Third, since the memory patterns are mapped into a higher dimensional feature space, the memory capacity density is effectively increased. These aspects are experimentally demonstrated on sets of random memory patterns. © Springer-Verlag 2004.
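The sketch below is a minimal illustration of the idea described in the abstract, not the authors' reference implementation: each neuron is trained as a kernel perceptron over the stored patterns, so the recall dynamics operate on kernel evaluations rather than on a Hebbian weight matrix. The RBF kernel, the perceptron-style training rule, and all parameter values (gamma, epochs, pattern sizes) are assumptions for illustration only.

```python
# Minimal sketch of a kernelized Hopfield-style associative memory (assumed
# formulation, not the paper's exact algorithm). Neuron i updates as
#   s_i <- sign( sum_mu alpha[i, mu] * K(s, pattern_mu) ),
# with alpha trained so every stored pattern is a fixed point.
import numpy as np

def rbf_kernel(x, y, gamma=0.05):
    """Gaussian (RBF) kernel between two +/-1 pattern vectors (assumed choice)."""
    d = x - y
    return np.exp(-gamma * np.dot(d, d))

def train(patterns, gamma=0.05, epochs=50):
    """Kernel-perceptron training: alpha[i, mu] weights pattern mu for neuron i."""
    p, n = patterns.shape
    # Gram matrix of the stored patterns in feature space.
    K = np.array([[rbf_kernel(patterns[a], patterns[b], gamma) for b in range(p)]
                  for a in range(p)])
    alpha = np.zeros((n, p))
    for _ in range(epochs):
        stable = True
        for mu in range(p):
            for i in range(n):
                h = np.dot(alpha[i], K[mu])
                if h == 0 or np.sign(h) != patterns[mu, i]:
                    alpha[i, mu] += patterns[mu, i]  # perceptron-style correction
                    stable = False
        if stable:  # all stored patterns are fixed points of the dynamics
            break
    return alpha

def recall(probe, patterns, alpha, gamma=0.05, steps=20):
    """Synchronous recall: iterate the kernelized update until convergence."""
    s = probe.copy()
    for _ in range(steps):
        k = np.array([rbf_kernel(s, xi, gamma) for xi in patterns])
        new = np.sign(alpha @ k)
        new[new == 0] = 1
        if np.array_equal(new, s):
            break
        s = new
    return s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 100, 20                               # 20 random +/-1 patterns of 100 bits
    patterns = rng.choice([-1, 1], size=(p, n))
    alpha = train(patterns)
    # Probe: pattern 0 with roughly 15% of its bits flipped.
    noisy = patterns[0] * rng.choice([1, -1], size=n, p=[0.85, 0.15])
    out = recall(noisy, patterns, alpha)
    print("overlap with stored pattern:", np.dot(out, patterns[0]) / n)
```

With nearly orthogonal random patterns the Gram matrix is close to the identity, so training converges in a few sweeps and a moderately corrupted probe is pulled back to the stored pattern, consistent with the enlarged basins of attraction the abstract reports.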

Cite

APA

García, C., & Moreno, J. A. (2004). The kernel Hopfield memory network. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3305, 755–764. https://doi.org/10.1007/978-3-540-30479-1_78
