Hebbian Learning Meets Deep Convolutional Neural Networks

  • Giuseppe Amato, Fabio Carrara, Fabrizio Falchi, Claudio Gennaro, and Gabriele Lagani

Abstract

Neural networks are said to be biologically inspired since they mimic the behavior of real neurons. However, several processes in state-of-the-art neural networks, including Deep Convolutional Neural Networks (DCNNs), are far from those found in animal brains. One relevant difference is the training process: in state-of-the-art artificial neural networks, training is based on backpropagation and Stochastic Gradient Descent (SGD) optimization. However, studies in neuroscience strongly suggest that this kind of process does not occur in the biological brain. Rather, according to neuroscientists, learning methods based on Spike-Timing-Dependent Plasticity (STDP) or the Hebbian learning rule are more plausible. In this paper, we investigate the use of the Hebbian learning rule when training Deep Neural Networks for image classification by proposing a novel weight update rule for shared kernels in DCNNs. We perform experiments on the CIFAR-10 dataset in which we employ Hebbian learning, along with SGD, to train parts of the model or whole networks for image classification, and we thoroughly discuss their performance in terms of both effectiveness and efficiency.
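To make the Hebbian update concrete, the sketch below applies a generic Hebbian rule to a single linear unit, using Oja's weight-decay variant to keep the weight norm bounded. This is a minimal illustration only, not the paper's method: the function name hebbian_update, the learning rate, and the random-patch input are assumptions made for the example, and the paper's actual update rule for shared convolutional kernels may differ.

import numpy as np

def hebbian_update(w, x, lr=0.01):
    # Generic Hebbian step for a single linear unit:
    # the weight change is proportional to the product of
    # input x and activation y, with Oja's decay term -y^2 * w
    # included so the weights stay bounded. This is a standard
    # textbook rule, not the paper's shared-kernel rule.
    y = np.dot(w, x)                 # unit activation
    return w + lr * y * (x - y * w)  # Oja's rule

# Hypothetical usage: adapt one 3x3x3 kernel (27 weights) on
# random stand-ins for image patches.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=27)
for _ in range(100):
    patch = rng.normal(size=27)
    w = hebbian_update(w, patch)

Under this rule, repeated co-activation of input and output strengthens the corresponding weights, while the decay term prevents unbounded growth; any rule applied to kernels shared across spatial positions in a DCNN has to address the same stability concern.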

Cite

APA

Amato, G., Carrara, F., Falchi, F., Gennaro, C., & Lagani, G. (2019). Hebbian Learning Meets Deep Convolutional Neural Networks. In E. Ricci, S. Rota Bulò, C. Snoek, O. Lanz, S. Messelodi, & N. Sebe (Eds.), Image Analysis and Processing – ICIAP 2019 (pp. 324–334). Springer International Publishing.
