Lipschitz Continuity Retained Binary Neural Network


Abstract

Relying on the premise that the performance of a binary neural network (BNN) can be largely restored by eliminating the quantization error between full-precision weight vectors and their corresponding binary vectors, existing works on network binarization frequently adopt the idea of model robustness to reach this objective. However, robustness remains an ill-defined concept without solid theoretical support. In this work, we introduce Lipschitz continuity, a well-defined functional property, as the rigorous criterion for defining model robustness in BNNs. We then propose retaining Lipschitz continuity as a regularization term to improve model robustness. In particular, while popular Lipschitz-based regularization methods often collapse in BNNs due to their extreme sparsity, we design Retention Matrices to approximate the spectral norms of the targeted weight matrices, which serve as an approximation of the Lipschitz constant of a BNN without requiring exact Lipschitz constant computation (an NP-hard problem). Our experiments show that our BNN-specific regularization method effectively enhances the robustness of BNNs (verified on ImageNet-C), achieving state-of-the-art results on CIFAR10 and ImageNet. Our code is available at https://github.com/42Shawn/LCR_BNN.
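To make the abstract's key idea concrete: the Lipschitz constant of a linear layer (with respect to the Euclidean norm) equals the spectral norm of its weight matrix, and a standard way to estimate it cheaply is power iteration. The sketch below illustrates that standard approach only; it is an assumption for illustration and does not reproduce the paper's Retention Matrix construction.

```python
import numpy as np

def spectral_norm(W, n_iters=50, seed=0):
    """Estimate the largest singular value (spectral norm) of W
    via power iteration on the pair (W, W^T)."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(W.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iters):
        u = W @ v
        u /= np.linalg.norm(u)
        v = W.T @ u
        v /= np.linalg.norm(v)
    # Rayleigh-quotient estimate of sigma_max(W)
    return float(u @ (W @ v))

def lipschitz_penalty(weights):
    """A generic Lipschitz-style regularizer: the product of per-layer
    spectral norms upper-bounds the Lipschitz constant of a feed-forward
    network composed of those linear layers (illustrative, not the
    paper's BNN-specific Retention Matrix method)."""
    return float(np.prod([spectral_norm(W) for W in weights]))
```

For a diagonal matrix the spectral norm is simply the largest absolute diagonal entry, which makes the estimate easy to sanity-check against `np.linalg.norm(W, 2)`.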

Citation (APA)

Shang, Y., Xu, D., Duan, B., Zong, Z., Nie, L., & Yan, Y. (2022). Lipschitz Continuity Retained Binary Neural Network. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13671 LNCS, pp. 603–619). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-20083-0_36
