Analogue neuro-memristive convolutional dropout nets

12 citations · 11 Mendeley readers

This article is free to access.

Abstract

Randomly switching neurons ON and OFF during training and inference is an interesting characteristic of biological neural networks, one that potentially underlies the inherent adaptability and creativity of the human mind. Dropout is inspired by this random switching behaviour; in artificial neural networks it is used as a regularization technique to reduce over-fitting during training. Energy-efficient digital implementations of convolutional neural networks (CNNs) have been on the rise for edge-computing IoT applications, with pruning of larger networks and optimization for performance accuracy as the main directions of work in this field. In contrast to this approach, we propose to build a near-sensor analogue CNN with high-density memristor crossbar arrays. Since analogue designs use several active elements such as amplifiers, energy efficiency becomes a main challenge. To address this, we extend the idea of using dropout from training to the inference stage. The subsampling layer required in CNN implementations is realized as a mean pooling layer in the design to ensure lower energy consumption. Along with dropout, we also investigate the effect of the non-idealities of the memristor and of the network.
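To make the scheme concrete, below is a minimal NumPy sketch of one conv → mean-pool → dropout stage with dropout kept active at inference. In an analogue crossbar, dropping a unit corresponds to powering down its amplifier, which is the intuition behind the energy saving at inference. The layer sizes, dropout rate, and the multiplicative Gaussian model of memristor conductance variation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_pool(x, k=2):
    """Mean (average) pooling with a k x k window -- the low-energy
    subsampling layer used in place of max pooling."""
    h, w = x.shape[0] // k, x.shape[1] // k
    return x[:h * k, :w * k].reshape(h, k, w, k).mean(axis=(1, 3))

def dropout(x, p=0.3):
    """Inverted dropout, applied at inference as well as training, so a
    random subset of units (and the amplifiers driving them) is switched
    off on every pass. The rate p = 0.3 is an assumed value."""
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def noisy_conductance(w, sigma=0.05):
    """Crude memristor non-ideality model: multiplicative Gaussian
    variation on the programmed conductances (an assumption, not the
    device model from the paper)."""
    return w * (1.0 + sigma * rng.standard_normal(w.shape))

# One forward pass through a toy conv -> ReLU -> mean-pool -> dropout stage.
image = rng.random((8, 8))
kernel = noisy_conductance(rng.standard_normal((3, 3)))

conv = np.zeros((6, 6))
for i in range(6):
    for j in range(6):
        conv[i, j] = np.sum(image[i:i+3, j:j+3] * kernel)

out = dropout(mean_pool(np.maximum(conv, 0.0)))
print(out.shape)  # (3, 3)
```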

Citation (APA)

Krestinskaya, O., & James, A. P. (2020). Analogue neuro-memristive convolutional dropout nets. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 476(2242). https://doi.org/10.1098/rspa.2020.0210
