MB-CNN: Memristive binary convolutional neural networks for embedded mobile devices

Abstract

Applications of neural networks have gained significant importance in embedded mobile devices and Internet of Things (IoT) nodes. In particular, convolutional neural networks have emerged as one of the most powerful techniques in computer vision, speech recognition, and AI applications that can improve the mobile user experience. However, satisfying all power and performance requirements of such low power devices is a significant challenge. Recent work has shown that binarizing a neural network can significantly reduce the memory requirements of mobile devices at the cost of a minor loss in accuracy. This paper proposes MB-CNN, a memristive accelerator for binary convolutional neural networks that performs XNOR convolution in situ within novel 2R memristive data blocks to improve the power, performance, and memory efficiency of embedded mobile devices. The proposed accelerator achieves at least 13.26×, 5.91×, and 3.18× improvements in system energy efficiency (computed as energy × delay) over the state-of-the-art software, GPU, and PIM architectures, respectively. The proposed system architecture, which integrates the CPU, GPU, and MB-CNN, outperforms every other evaluated configuration in terms of both system energy and execution time.
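The arithmetic that makes binary CNN accelerators such as MB-CNN attractive is that, once weights and activations are constrained to {-1, +1} and encoded as single bits, a convolution's dot products reduce to an XNOR followed by a popcount. The snippet below is a minimal Python sketch of that identity only; it is not the paper's in-memory 2R implementation, and the function and variable names are illustrative assumptions.

```python
# Minimal sketch of XNOR/popcount binary dot products (not MB-CNN's in-situ design):
# for {-1, +1} operands encoded as bits {0, 1},
#   dot(a, w) = 2 * popcount(XNOR(a_bits, w_bits)) - N.
import numpy as np

def binary_dot(a_bits: np.ndarray, w_bits: np.ndarray) -> int:
    """Dot product of two {-1, +1} vectors given their {0, 1} bit encodings."""
    n = a_bits.size
    xnor = np.logical_not(np.logical_xor(a_bits, w_bits))  # 1 where the bits agree
    return 2 * int(np.count_nonzero(xnor)) - n              # agreements - disagreements

# Sanity check against the full-precision dot product.
rng = np.random.default_rng(0)
a_bits = rng.integers(0, 2, size=16)   # encoded activations (hypothetical example)
w_bits = rng.integers(0, 2, size=16)   # encoded weights
a = 2 * a_bits - 1                     # decode back to {-1, +1}
w = 2 * w_bits - 1
assert binary_dot(a_bits, w_bits) == int(np.dot(a, w))
```

In a memristive 2R realization, the XNOR and accumulation would be carried out inside the memory arrays rather than by a CPU loop, which is what removes the data-movement cost targeted by the paper.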

Citation (APA):

Chowdhury, A. P., Kulkarni, P., & Bojnordi, M. N. (2018). MB-CNN: Memristive binary convolutional neural networks for embedded mobile devices. Journal of Low Power Electronics and Applications, 8(4). https://doi.org/10.3390/jlpea8040038
