Training the Hopfield Neural Network for Classification Using a STDP-Like Rule

Abstract

The backpropagation algorithm has played a critical role in training deep neural networks. Many studies suggest that the brain may implement a similar algorithm, but most of the proposed models require symmetric weights between neurons, which makes them less biologically plausible. Inspired by recent work by Bengio et al., we show that the well-known Hopfield neural network (HNN) can be trained in a biologically plausible way. The network can adopt hierarchical architectures, and the weights between neurons need not be symmetric. The network runs in two alternating phases. The weight change is proportional to the firing rate of the presynaptic neuron and to the change in the state (or membrane potential) of the postsynaptic neuron between the two phases, which approximates a classical spike-timing-dependent plasticity (STDP) rule. Several HNNs with one or two hidden layers are trained on the MNIST dataset, and all of them converge to low training errors. These results further our understanding of possible brain mechanisms for supervised learning.
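To make the two-phase rule concrete, below is a minimal NumPy sketch of the kind of update the abstract describes, in the spirit of Bengio et al.'s two-phase (equilibrium-propagation-style) training. All names here (rho, settle, the toy dimensions, the nudging strength beta) are hypothetical illustrations, not the authors' implementation: the network settles once freely, once with the output units weakly nudged toward the target, and each weight changes in proportion to the presynaptic firing rate times the postsynaptic state change between the two phases.

```python
import numpy as np

rng = np.random.default_rng(0)

def rho(s):
    # Firing rate as a hard sigmoid: clip states to [0, 1].
    return np.clip(s, 0.0, 1.0)

def settle(x, W, b, steps=50, dt=0.5, y_target=None, beta=0.0):
    # Relax the non-input states toward a fixed point of the dynamics.
    # With beta > 0 the output units are weakly nudged toward the target,
    # giving the second ("clamped") phase; beta = 0 is the free phase.
    s = np.zeros(W.shape[0])
    for _ in range(steps):
        drive = W @ np.concatenate([x, rho(s)]) + b
        grad = drive - s                      # leaky integration
        if beta > 0.0 and y_target is not None:
            grad[-len(y_target):] += beta * (y_target - s[-len(y_target):])
        s += dt * grad
    return s

# Toy sizes (hypothetical): 4 inputs; 8 units, the last 2 being outputs.
n_in, n_units = 4, 8
W = rng.normal(scale=0.1, size=(n_units, n_in + n_units))  # not symmetric
b = np.zeros(n_units)

x = rng.random(n_in)
y = np.array([1.0, 0.0])

beta, lr = 0.5, 0.05
s_free = settle(x, W, b)                           # phase 1: free
s_clamp = settle(x, W, b, y_target=y, beta=beta)   # phase 2: nudged

# STDP-like update: dW[j, i] is proportional to the presynaptic rate
# rho(pre_i) times the postsynaptic state change (s_clamp[j] - s_free[j]).
pre = np.concatenate([x, rho(s_free)])             # presynaptic rates
W += (lr / beta) * np.outer(s_clamp - s_free, pre)
```

Note that the update uses only locally available quantities, the presynaptic rate and the postsynaptic state change, so no weight transport or symmetric feedback path is required, which is the biological-plausibility point the abstract emphasizes.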

Citation (APA)

Hu, X., & Wang, T. (2017). Training the Hopfield Neural Network for Classification Using a STDP-Like Rule. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10636 LNCS, pp. 737–744). Springer Verlag. https://doi.org/10.1007/978-3-319-70090-8_74
