Long Short-Term Memory Spiking Networks and Their Applications

39 citations · 57 Mendeley readers
Abstract

Recent advances in event-based neuromorphic systems have generated significant interest in the use and development of spiking neural networks (SNNs). However, the non-differentiable nature of spiking neurons makes SNNs incompatible with conventional backpropagation techniques. Despite the significant progress made in training conventional deep neural networks (DNNs), training methods for SNNs remain relatively poorly understood. In this paper, we present a novel framework for training recurrent SNNs. Analogous to the benefits of recurrent neural networks (RNNs) for learning time-series models within DNNs, we develop SNNs based on long short-term memory (LSTM) networks. We show that LSTM spiking networks learn the timing of spikes and temporal dependencies. We also develop a methodology for error backpropagation within LSTM-based SNNs. Together, the proposed architecture and backpropagation method enable LSTM-based SNNs to learn long-term dependencies with results comparable to conventional LSTMs. Code is available on GitHub: http://github.com/AliLotfi92/SNNLSTM
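The non-differentiability the abstract refers to arises because a spiking neuron emits a binary spike via a threshold (step) function, whose true derivative is zero almost everywhere. A common workaround in the SNN literature, which the paper's training framework builds on, is to use the hard threshold in the forward pass but substitute a smooth surrogate derivative in the backward pass. The sketch below illustrates this idea only; the specific surrogate (a sigmoid derivative), the `threshold` and `beta` values, and the function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    # Forward pass: Heaviside step -- emit a spike (1.0) wherever the
    # membrane potential v reaches the firing threshold, else 0.0.
    # This function's true gradient is zero almost everywhere.
    return (v >= threshold).astype(float)

def spike_surrogate_grad(v, threshold=1.0, beta=5.0):
    # Backward pass: a smooth stand-in for the step's derivative.
    # Here we use the derivative of a sigmoid centered at the threshold
    # (an illustrative choice; other surrogates are common). beta controls
    # how sharply the surrogate is peaked around the threshold.
    s = 1.0 / (1.0 + np.exp(-beta * (v - threshold)))
    return beta * s * (1.0 - s)

# Example: potentials below and above threshold.
v = np.array([0.2, 0.9, 1.1, 2.0])
spikes = spike_forward(v)            # [0., 0., 1., 1.]
grads = spike_surrogate_grad(v)      # largest near v = threshold
print(spikes, grads)
```

With this substitution, gradients can flow through spike events during backpropagation through time, which is what makes training recurrent (LSTM-style) spiking architectures feasible.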

Citation (APA)

Lotfi Rezaabad, A., & Vishwanath, S. (2020). Long Short-Term Memory Spiking Networks and Their Applications. In ACM International Conference Proceeding Series. Association for Computing Machinery. https://doi.org/10.1145/3407197.3407211
