The Effect of Batch Normalization on Noise Resistant Property of Deep Learning Models


Abstract

The fast execution speed and energy efficiency of analog hardware have made it a strong contender for deploying deep learning models at the edge. However, analog noise perturbs the models' weights and degrades performance, despite the inherent noise-resistant characteristics of deep learning models. This work investigates the effect of the popular batch normalization layer (BatchNorm) on the noise-resistant ability of deep learning models. The systematic study is carried out by first training different models with and without the BatchNorm layer on the CIFAR10 and CIFAR100 datasets. Analog noise is then injected into the weights of the resulting models, and their performance on the test dataset is measured and compared. The results show that the presence of the BatchNorm layer negatively impacts the noise-resistant property of deep learning models: ResNet44 and VGG16 models with BatchNorm layers trained on the CIFAR10 dataset achieve average normalized inference accuracies of 41.32% and 10.75%, respectively, compared to 91.95% and 93.80% for the same ResNet44 and VGG16 models without the BatchNorm layer. Furthermore, the impact grows with the number of BatchNorm layers.
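The evaluation protocol described in the abstract (train, inject noise into weights, re-measure test accuracy) can be illustrated with a short sketch. The PyTorch snippet below is a hypothetical illustration, not the authors' code: it models analog noise as additive Gaussian noise whose standard deviation is a fraction of each weight tensor's own standard deviation, one common proxy for analog weight noise; the paper's exact noise model and noise levels may differ.

```python
import copy
import torch

def inject_weight_noise(model, noise_scale=0.1):
    """Return a copy of `model` with Gaussian noise added to its weights.

    For each parameter tensor, the noise std is `noise_scale` times that
    tensor's own std (an assumed, commonly used analog-noise model).
    """
    noisy = copy.deepcopy(model)
    with torch.no_grad():
        for p in noisy.parameters():
            if p.numel() > 1:  # skip scalars, where std() is undefined
                p.add_(torch.randn_like(p) * p.std() * noise_scale)
    return noisy

@torch.no_grad()
def accuracy(model, loader, device="cpu"):
    """Top-1 test accuracy (%) of `model` over `loader`."""
    model.eval().to(device)
    correct = total = 0
    for x, y in loader:
        pred = model(x.to(device)).argmax(dim=1)
        correct += (pred == y.to(device)).sum().item()
        total += y.numel()
    return 100.0 * correct / total

# Usage sketch (assumes a trained `model` and a `test_loader` exist):
# base_acc  = accuracy(model, test_loader)
# noisy_acc = accuracy(inject_weight_noise(model, 0.1), test_loader)
# normalized = 100.0 * noisy_acc / base_acc  # "normalized inference accuracy"
```

In practice the noisy evaluation would be repeated over several random draws (and noise levels) and averaged, which is presumably what the "average normalized inference accuracy" figures in the abstract refer to.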

Cite (APA)

Fagbohungbe, O., & Qian, L. (2022). The Effect of Batch Normalization on Noise Resistant Property of Deep Learning Models. IEEE Access, 10, 127728–127741. https://doi.org/10.1109/ACCESS.2022.3206958
