Efficient dendritic learning as an alternative to synaptic plasticity hypothesis

Citations: 27
Mendeley readers: 66

This article is free to access.

Abstract

Synaptic plasticity is a long-standing core hypothesis of brain learning: it posits local adaptation between two connected neurons and underlies much of machine learning. The main difficulty with this hypothesis is that synapses and dendrites connect neurons in series, so existing experiments cannot pinpoint where the significant adaptation is imprinted. Inspired by experimental evidence for sub-dendritic adaptation and its nonlinear amplification, we demonstrate efficient backpropagation and Hebbian learning on dendritic trees. This approach achieves success rates approaching unity for handwritten digit recognition, indicating that deep learning can be realized even by a single dendrite or neuron. In addition, dendritic amplification effectively generates a number of input crosses, i.e., higher-order interactions, that grows exponentially with the number of inputs, further enhancing success rates. However, directly implementing such a large number of cross weights and manipulating them exhaustively and independently is beyond existing and anticipated computational power. Hence, a new type of nonlinear adaptive dendritic hardware will be needed to imitate dendritic learning and to estimate the computational capability of the brain.
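To make the combinatorial claim concrete: the number of distinct multiplicative input crosses (products of non-empty input subsets) among N inputs is 2^N - 1, which grows exponentially with N. The following Python sketch only illustrates this counting argument; it is not the authors' dendritic model, and the function name cross_features and the choice of N = 10 are assumptions made here.

from itertools import combinations
import numpy as np

def cross_features(x, max_order):
    # Enumerate the product of every non-empty subset of inputs with up to
    # max_order factors. With max_order = len(x) this yields 2**len(x) - 1 terms.
    feats = []
    for order in range(1, max_order + 1):
        for idx in combinations(range(len(x)), order):
            feats.append(np.prod(x[list(idx)]))
    return np.array(feats)

x = np.random.rand(10)                  # 10 inputs, an arbitrary example size
print(len(cross_features(x, len(x))))   # 1023 = 2**10 - 1 cross terms

Already at a few dozen inputs, the count of independent cross weights exceeds what can be stored and updated exhaustively, which is the abstract's argument for dedicated nonlinear dendritic hardware rather than direct software implementation.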

Citation (APA)

Hodassman, S., Vardi, R., Tugendhaft, Y., Goldental, A., & Kanter, I. (2022). Efficient dendritic learning as an alternative to synaptic plasticity hypothesis. Scientific Reports, 12(1). https://doi.org/10.1038/s41598-022-10466-8

Readers' Seniority

PhD / Post grad / Masters / Doc: 20 (51%)
Researcher: 12 (31%)
Professor / Associate Prof.: 5 (13%)
Lecturer / Post doc: 2 (5%)

Readers' Discipline

Neuroscience: 9 (32%)
Engineering: 9 (32%)
Computer Science: 5 (18%)
Physics and Astronomy: 5 (18%)

Article Metrics

Blog Mentions: 5
News Mentions: 19
References: 2
Social Media Shares, Likes & Comments: 173
