Spatially Arranged Sparse Recurrent Neural Networks for Energy Efficient Associative Memory

Abstract

The development of hardware neural networks, including neuromorphic hardware, has accelerated over the past few years. However, it is challenging to operate very large-scale neural networks with low-power hardware devices, partly due to signal transmission through a massive number of interconnections. Our aim is to address the issue of communication cost from an algorithmic viewpoint and to study learning algorithms for energy-efficient information processing. Here, we consider two approaches to finding spatially arranged sparse recurrent neural networks with a high cost-performance ratio for associative memory. In the first approach, following classical methods, we focus on sparse modular network structures inspired by biological brain networks and examine their storage capacity under an iterative learning rule. We show that incorporating long-range intermodule connections into purely modular networks can enhance the cost-performance ratio. In the second approach, we formulate for the first time an optimization problem in which network sparsity is maximized under constraints imposed by a pattern embedding condition. We show that there is a tradeoff between interconnection cost and computational performance in the optimized networks, and we demonstrate that the optimized networks achieve a better cost-performance ratio than those considered in the first approach. We show the effectiveness of the optimization approach mainly on binary patterns and also apply it to gray-scale image restoration. Our results suggest that the presented approaches are useful for seeking sparser and less costly connectivity of neural networks to enhance the energy efficiency of hardware neural networks.
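
To make the setting concrete, below is a minimal, self-contained sketch of associative memory on a sparse recurrent network. It assumes a standard Hopfield-style model: a fixed random sparsity mask, Hebbian (outer-product) storage, and synchronous sign-threshold recall. The mask density, learning rule, and update scheme here are all illustrative assumptions; the paper's iterative learning rule, modular topologies, and sparsity-maximizing optimization are not reproduced.

# A minimal sketch of associative memory on a sparse recurrent network.
# Hopfield-style model with a random sparsity mask and Hebbian storage;
# this illustrates the general setting only, not the paper's algorithms.
import numpy as np

rng = np.random.default_rng(0)
N, P, density = 200, 10, 0.2          # neurons, stored patterns, connection density

# Random +/-1 patterns to be embedded as fixed points of the dynamics.
patterns = rng.choice([-1, 1], size=(P, N))

# Sparse, symmetric connectivity mask with no self-connections.
mask = rng.random((N, N)) < density
mask = np.triu(mask, k=1)
mask = mask | mask.T

# Hebbian (outer-product) weights, restricted to the sparse mask.
W = (patterns.T @ patterns) / N
W = np.where(mask, W, 0.0)
np.fill_diagonal(W, 0.0)

def recall(state, steps=20):
    """Synchronous sign-threshold dynamics, iterated until convergence."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1             # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Recall a stored pattern from a corrupted cue (10% of bits flipped).
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
recovered = recall(cue)
print("overlap with stored pattern:", (recovered @ patterns[0]) / N)

In this sketch, the density parameter stands in for the interconnection cost discussed in the abstract: lowering it reduces the number of physical links at the expense of recall reliability, which is the cost-performance tradeoff the paper studies in a more principled way.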

Citation (APA)
Tanaka, G., Nakane, R., Takeuchi, T., Yamane, T., Nakano, D., Katayama, Y., & Hirose, A. (2020). Spatially Arranged Sparse Recurrent Neural Networks for Energy Efficient Associative Memory. IEEE Transactions on Neural Networks and Learning Systems, 31(1), 24–38. https://doi.org/10.1109/TNNLS.2019.2899344
