Universal Approximation Property and Equivalence of Stochastic Computing-Based Neural Networks and Binary Neural Networks

  • Wang Y
  • Zhan Z
  • Zhao L
  • et al.

Abstract

Large-scale deep neural networks are both memory- and computation-intensive, thereby posing stringent requirements on the computing platforms. Hardware acceleration of deep neural networks has been extensively investigated. Specific forms of binary neural networks (BNNs) and stochastic computing-based neural networks (SCNNs) are particularly appealing for hardware implementations since they can be implemented almost entirely with binary operations. Despite the obvious advantages in hardware implementation, these approximate computing techniques have been questioned by researchers in terms of accuracy and universal applicability. It is also important to understand the relative pros and cons of SCNNs and BNNs in theory and in actual hardware implementations. To address these concerns, in this paper we prove that the "ideal" SCNNs and BNNs satisfy the universal approximation property with probability 1 (due to the stochastic behavior), which is a new angle on the original approximation property. The proof proceeds by first establishing the property for SCNNs via the strong law of large numbers, and then using SCNNs as a "bridge" to prove it for BNNs. Besides the universal approximation property, we also derive an appropriate bound on the bit length M in order to provide insights for actual neural network implementations. Based on the universal approximation property, we further prove that SCNNs and BNNs exhibit the same energy complexity; in other words, they have the same asymptotic energy consumption with the growth of network size. We also provide a detailed analysis of the pros and cons of SCNNs and BNNs for hardware implementations and conclude that SCNNs are more suitable.
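The abstract's claim that SCNNs run "almost entirely with binary operations," and that accuracy improves with the bit length M by the strong law of large numbers, can be illustrated with a minimal sketch of bipolar stochastic computing. This toy example is ours, not the paper's implementation: a value x ∈ [−1, 1] is encoded as a random bitstream with P(bit = 1) = (x + 1)/2, and multiplication then reduces to a bitwise XNOR.

```python
import random

def to_bipolar_stream(x, m, rng):
    """Encode x in [-1, 1] as an m-bit stream with P(bit = 1) = (x + 1) / 2."""
    p = (x + 1) / 2
    return [1 if rng.random() < p else 0 for _ in range(m)]

def from_bipolar_stream(bits):
    """Decode a bipolar stream: x_hat = 2 * (fraction of ones) - 1."""
    return 2 * sum(bits) / len(bits) - 1

def sc_multiply(stream_a, stream_b):
    """Bipolar SC multiplication is bitwise XNOR of two independent streams."""
    return [1 - (a ^ b) for a, b in zip(stream_a, stream_b)]

if __name__ == "__main__":
    rng = random.Random(0)
    x, y = 0.5, -0.8  # exact product: -0.4
    # As the bit length m grows, the decoded estimate converges to x * y
    # with probability 1 (strong law of large numbers).
    for m in (64, 1024, 65536):
        product = sc_multiply(to_bipolar_stream(x, m, rng),
                              to_bipolar_stream(y, m, rng))
        print(m, from_bipolar_stream(product))
```

The estimate's error shrinks roughly as 1/√M, which is why the paper's bound on M matters for practical implementations: longer streams mean better accuracy but proportionally more clock cycles.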

Cite (APA)

Wang, Y., Zhan, Z., Zhao, L., Tang, J., Wang, S., Li, J., … Lin, X. (2019). Universal Approximation Property and Equivalence of Stochastic Computing-Based Neural Networks and Binary Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 5369–5376. https://doi.org/10.1609/aaai.v33i01.33015369
