Synergies between Intrinsic and Synaptic Plasticity Based on Information Theoretic Learning

10 citations · 29 Mendeley readers

Abstract

In experimental and theoretical neuroscience, synaptic plasticity has dominated the study of neural plasticity for a very long time. Recently, neuronal intrinsic plasticity (IP) has attracted growing attention in this area. IP is often interpreted as an information-maximization mechanism; however, it remains unclear how IP affects the performance of artificial neural networks in supervised learning tasks. From an information-theoretic perspective, the minimum error-entropy (MEE) algorithm has recently been proposed as an efficient training method. In this study, we propose a synergistic learning algorithm that combines the MEE algorithm as the synaptic plasticity rule with an information-maximization algorithm as the intrinsic plasticity rule. We consider both feedforward and recurrent neural networks and study the interactions between intrinsic and synaptic plasticity. Simulations indicate that the intrinsic plasticity rule can improve the performance of artificial neural networks trained with the MEE algorithm. © 2013 Li, Li.
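
To make the combination concrete, here is a minimal Python sketch of the two rules acting on a single sigmoid neuron; it is an illustration under stated assumptions, not the authors' implementation (which targets full feedforward and recurrent networks). The synaptic rule performs gradient ascent on the Parzen-window information potential, which is equivalent to minimizing Renyi's quadratic error entropy (the MEE criterion); the intrinsic rule is Triesch's (2005) gain/bias adaptation, a standard information-maximization IP rule driving the output distribution toward an exponential. The kernel width SIGMA, target mean MU, learning rates, and the toy data are all assumed for the example.

```python
# Minimal sketch (not the authors' code): one sigmoid neuron trained with an
# MEE-style synaptic rule combined with Triesch's intrinsic plasticity rule.
import numpy as np

rng = np.random.default_rng(0)
SIGMA, MU = 1.0, 0.2          # Parzen kernel width; IP target mean (assumed)
ETA_W, ETA_IP = 0.05, 0.001   # synaptic / intrinsic learning rates (assumed)

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

# Toy regression data: map 2-D inputs to a scalar target in (0, 1).
X = rng.normal(size=(200, 2))
d = sigmoid(0.8 * X[:, 0] - 0.5 * X[:, 1])

w = rng.normal(scale=0.1, size=2)  # synaptic weights
a, b = 1.0, 0.0                    # intrinsic gain and bias

for epoch in range(200):
    v = X @ w                      # net synaptic input
    y = sigmoid(a * v + b)         # intrinsically scaled output
    e = d - y                      # batch errors

    # --- Synaptic rule: gradient ascent on the information potential
    # V = (1/N^2) sum_ij G_sigma(e_i - e_j); maximizing V minimizes
    # Renyi's quadratic error entropy (the MEE criterion).
    diff = e[:, None] - e[None, :]                 # e_i - e_j
    G = np.exp(-diff**2 / (2 * SIGMA**2))          # Gaussian kernel values
    dV_de = (G * -diff / SIGMA**2).sum(axis=1)     # dV/de_i (constants absorbed)
    de_dw = -(a * y * (1 - y))[:, None] * X        # de_i/dw by the chain rule
    w += ETA_W * (dV_de[:, None] * de_dw).mean(axis=0)

    # --- Intrinsic rule (Triesch 2005): adapt gain a and bias b so the
    # output distribution approaches an exponential with mean MU, which
    # maximizes output entropy under a fixed-mean constraint.
    b += ETA_IP * np.mean(1 - (2 + 1/MU) * y + y**2 / MU)
    a += ETA_IP * np.mean(1/a + v - (2 + 1/MU) * v * y + v * y**2 / MU)

# MEE is shift-invariant (entropy ignores the error mean), so the error
# spread, not raw MSE, is the natural diagnostic for this criterion.
err = d - sigmoid(a * (X @ w) + b)
print(f"error mean: {err.mean():+.4f}  error std: {err.std():.4f}")
```

Note the division of labor in the sketch: the MEE update shapes the weights from pairwise error interactions, while the IP update only touches the neuron's own gain and bias, which is why the two rules can run side by side without interfering with each other's parameters.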

Citation (APA)

Li, Y., & Li, C. (2013). Synergies between Intrinsic and Synaptic Plasticity Based on Information Theoretic Learning. PLoS ONE, 8(5). https://doi.org/10.1371/journal.pone.0062894
