MLP+H: A hybrid neural architecture formed by the interaction of Hopfield and Multi-Layer Perceptron neural networks


Abstract

This paper addresses the design and experimental characterization of a novel hybrid neural network in which two distinct classical architectures interact: the Hopfield neural network and the Multi-Layer Perceptron. This hybrid neural system, named MLP+H (from MLP + Hopfield), achieves better performance than either of the two classical architectures considered separately. In addition, it can handle classes of data different from those normally handled by the two conventional architectures. For example, while Hopfield networks deal with binary patterns and MLPs with information that always has some analog character (due to the continuous nature of the MLP nodes), the MLP+H accepts analog inputs and produces purely digital outputs. Moreover, the MLP+H allows reduced training times compared with the MLP architecture, and it offers compactness and flexibility across different applications, such as the implementation of anti-noise filters, an application currently under study. © Springer-Verlag Berlin Heidelberg 2003.
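To make the data flow concrete, the sketch below shows one plausible way such a hybrid could be wired: an MLP front-end maps an analog input vector to analog activations, and a Hopfield stage then settles those activations onto a stored bipolar pattern, yielding a purely digital (+1/-1) output. This is a minimal illustration under assumed dimensions, random (untrained) MLP weights, and a standard Hebbian, sign-update Hopfield model; the paper's actual coupling and training scheme may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    """Single-hidden-layer MLP with tanh activations (analog output in (-1, 1))."""
    h = np.tanh(x @ W1 + b1)
    return np.tanh(h @ W2 + b2)

def hopfield_store(patterns):
    """Hebbian weight matrix for bipolar (+1/-1) patterns, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, state, steps=20):
    """Synchronous sign updates until a fixed point (or step limit); output is purely binary."""
    s = np.sign(state)
    s[s == 0] = 1
    for _ in range(steps):
        nxt = np.sign(s @ W)
        nxt[nxt == 0] = 1
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

# --- Hybrid: analog input -> MLP -> Hopfield clean-up -> binary output ---
n_in, n_hidden, n_out = 8, 16, 8   # assumed sizes, for illustration only

# Hypothetical stored memories (bipolar codewords) the Hopfield stage settles on.
memories = np.sign(rng.standard_normal((3, n_out)))
W_hop = hopfield_store(memories)

# Untrained MLP weights for illustration; in the paper the MLP stage is trained
# (reportedly with shorter training times than a plain MLP).
W1 = rng.standard_normal((n_in, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, n_out)) * 0.1
b2 = np.zeros(n_out)

x = rng.standard_normal(n_in)               # analog input pattern
analog_out = mlp_forward(x, W1, b1, W2, b2)  # analog MLP activations
binary_out = hopfield_recall(W_hop, analog_out)
print(binary_out)                            # purely digital (+1/-1) output
```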

Citation (APA)

Oliveira, C. S., & Del Moral Hernandez, E. (2003). MLP+H: A hybrid neural architecture formed by the interaction of Hopfield and Multi-Layer Perceptron neural networks. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2686, 166–173. https://doi.org/10.1007/3-540-44868-3_22
