Competitive maximization of neuronal activity in convolutional recurrent spiking neural networks

Abstract

Spiking neural networks (SNNs) are a promising algorithmic approach for real-time solutions on specialized neurochip hardware, and they are believed to be highly energy and computationally efficient. We focus on developing local learning rules capable of providing both supervised and unsupervised learning. We assume that each neuron in a biological neural network tends to maximize its activity in competition with other neurons; this principle forms the basis of the SNN learning algorithm called FEELING. Here we introduce an efficient convolutional recurrent spiking neural network architecture that uses the FEELING rules and achieves better results than a fully connected SNN on the MNIST benchmark while having 55 times fewer learnable weight parameters.
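
To make the idea of "activity maximization under competition" in a convolutional recurrent spiking layer more concrete, the following NumPy sketch simulates a toy leaky integrate-and-fire convolutional layer with fixed lateral inhibition between output channels and a simple local Hebbian-style weight update that pushes each firing neuron toward higher future activity. This is only an illustrative sketch under stated assumptions, not the authors' FEELING rule or architecture: the update form, the lateral-inhibition scheme, and all sizes and constants are assumptions chosen for brevity.

```python
# Illustrative sketch only (assumptions throughout): a toy convolutional recurrent
# spiking layer with a local, competition-style Hebbian update. NOT the FEELING rule.
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy sizes: 8x8 binary input, 3x3 kernels, 4 output channels.
H = W = 8
K = 3
C_OUT = 4
H_OUT, W_OUT = H - K + 1, W - K + 1

kernels = rng.normal(0.0, 0.1, size=(C_OUT, K, K))          # feed-forward weights
lateral = -0.5 * (np.ones((C_OUT, C_OUT)) - np.eye(C_OUT))  # inhibitory recurrence

V_THRESH, LEAK, LR = 1.0, 0.9, 0.01

def conv2d(x, k):
    """Valid 2-D correlation of a single-channel input with one kernel."""
    out = np.zeros((H_OUT, W_OUT))
    for i in range(H_OUT):
        for j in range(W_OUT):
            out[i, j] = np.sum(x[i:i + K, j:j + K] * k)
    return out

def step(x_spikes, v, prev_spikes):
    """One simulation step: integrate feed-forward and lateral input, emit spikes."""
    ff = np.stack([conv2d(x_spikes, kernels[c]) for c in range(C_OUT)])
    # Recurrent term: each channel is inhibited by the other channels' spike
    # counts from the previous step, broadcast over spatial positions.
    rec = np.tensordot(lateral, prev_spikes.reshape(C_OUT, -1).sum(axis=1),
                       axes=(1, 0))[:, None, None]
    v = LEAK * v + ff + rec
    spikes = (v >= V_THRESH).astype(float)
    v = np.where(spikes > 0, 0.0, v)  # reset neurons that fired
    return v, spikes

def local_update(x_spikes, spikes):
    """Toy local rule: a neuron that fired potentiates the inputs under its
    receptive field (raising its future activity); competition comes only from
    the fixed lateral inhibition above."""
    for c in range(C_OUT):
        for i in range(H_OUT):
            for j in range(W_OUT):
                if spikes[c, i, j] > 0:
                    patch = x_spikes[i:i + K, j:j + K]
                    kernels[c] += LR * (patch - kernels[c])  # bounded Hebbian step

# Drive the layer with random Poisson-like input spikes for a few steps.
v = np.zeros((C_OUT, H_OUT, W_OUT))
spikes = np.zeros_like(v)
for t in range(20):
    x_spikes = (rng.random((H, W)) < 0.2).astype(float)
    v, spikes = step(x_spikes, v, spikes)
    local_update(x_spikes, spikes)

print("mean kernel weight per channel:", kernels.mean(axis=(1, 2)))
```

The inhibitory recurrence is what creates competition in this sketch: channels that fire suppress the others, so only the most strongly driven filters keep adapting to a given input pattern. The weight sharing of the convolutional kernels is also what reduces the parameter count relative to a fully connected SNN, as in the comparison reported in the abstract.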

Citation (APA)
Nekhaev, D., & Demin, V. (2020). Competitive maximization of neuronal activity in convolutional recurrent spiking neural networks. In Studies in Computational Intelligence (Vol. 856, pp. 255–262). Springer Verlag. https://doi.org/10.1007/978-3-030-30425-6_30
