Reliable Binarized Neural Networks on Unreliable beyond Von-Neumann Architecture


Abstract

Specialized hardware accelerators beyond von Neumann, which offer processing capability where the data resides without moving it, are becoming inevitable in data-centric computing. Emerging non-volatile memories, such as the Ferroelectric Field-Effect Transistor (FeFET), enable compact Logic-in-Memory (LiM). In this work, we investigate the probability of error (Perror) in FeFET-based XNOR LiM, demonstrating a new trade-off between speed and reliability. Using our reliability model, we show how Binarized Neural Networks (BNNs) can be proactively trained in the presence of XNOR-induced errors to obtain robust BNNs at design time. Furthermore, leveraging the trade-off between Perror and speed, we present a run-time adaptation technique that selectively trades off Perror and XNOR speed for every BNN layer. Our results demonstrate that when a small loss (e.g., 1%) in inference accuracy can be accepted, our design-time and run-time techniques provide error-resilient BNNs that exhibit XNOR speedups of 75% and 50% (FashionMNIST) and 38% and 24% (CIFAR10), respectively.
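To illustrate the kind of error model the abstract describes, the sketch below simulates a binarized dot product (XNOR followed by popcount) in which each XNOR result is flipped independently with probability Perror. This is a minimal, hypothetical model for exposition only: the function name `noisy_xnor_popcount`, the independent-bit-flip assumption, and the parameter `p_error` are illustrative choices, not the authors' actual FeFET reliability model or training pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_xnor_popcount(a, w, p_error, rng):
    """Binarized dot product of {-1, +1} vectors encoded as booleans
    (True = +1): XNOR + popcount, where each XNOR output bit is
    flipped independently with probability p_error (a simplified
    stand-in for cell-level errors in an XNOR LiM array)."""
    xnor = (a == w)                        # exact XNOR: True where bits match
    flips = rng.random(xnor.shape) < p_error
    noisy = xnor ^ flips                   # inject random bit-flip errors
    # popcount -> signed accumulation: each match contributes +1,
    # each mismatch -1, so the sum is 2*matches - length
    return 2 * int(noisy.sum()) - noisy.size

# toy example: random 1024-bit binarized activation and weight vectors
a = rng.random(1024) < 0.5
w = rng.random(1024) < 0.5

exact = noisy_xnor_popcount(a, w, p_error=0.0, rng=rng)
noisy = noisy_xnor_popcount(a, w, p_error=0.05, rng=rng)
```

Injecting such errors into the forward pass during training (rather than only at inference) is the general idea behind training error-resilient BNNs at design time; the paper's actual Perror values come from its FeFET reliability model, not from a uniform flip rate as assumed here.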

Citation (APA)

Yayla, M., Thomann, S., Buschjager, S., Morik, K., Chen, J. J., & Amrouch, H. (2022). Reliable Binarized Neural Networks on Unreliable beyond Von-Neumann Architecture. IEEE Transactions on Circuits and Systems I: Regular Papers, 69(6), 2516–2528. https://doi.org/10.1109/TCSI.2022.3156165
