Influence of Digital Fluctuations on Behavior of Neural Networks

  • Netay, I. V.

Abstract

This paper deals with the effect of digital noise on the numerical stability of neural networks. Digital noise arises from the inexactness of floating-point operations, and the accumulated errors eventually lead to a loss of significance. Experiments show that more redundant networks are more strongly affected by this noise. The effect is tested on both synthetic and real-world samples. As a result, network outputs should be discarded from the onset of fluctuations onward. The experimental results allow us to hypothesize that the minimal loss values that still preserve significance are achieved by networks whose size is close to the complexity of the dataset. This suggests choosing layer sizes according to the complexity of a particular dataset, rather than universally for an architecture and a general problem statement without regard to the data. In the case of fine-tuning, this implies that pruning network layers can improve accuracy and prediction reliability by reducing the influence of numerical noise. The results of this article are based on the analysis of numerical experiments in which more than 50,000 neural networks were trained for thousands of epochs each. Almost all of the networks eventually begin to fluctuate.
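As a minimal sketch (not the paper's experimental code), the following Python snippet illustrates the kind of "digital noise" the abstract refers to: repeated single-precision floating-point updates accumulate rounding error relative to a double-precision reference, and this drift grows with the number of operations. The step size and iteration count are illustrative assumptions.

```python
import numpy as np

def drift(n_steps=1_000_000, step=1e-4):
    """Accumulate the same increment in float32 and float64 and report the gap."""
    acc32 = np.float32(0.0)
    acc64 = np.float64(0.0)
    for _ in range(n_steps):
        acc32 += np.float32(step)   # single-precision accumulation (noisy)
        acc64 += np.float64(step)   # double-precision reference
    return acc32, acc64, abs(float(acc32) - float(acc64))

if __name__ == "__main__":
    a32, a64, err = drift()
    # The float32 sum drifts visibly away from the float64 reference,
    # showing how rounding errors compound over many operations.
    print(f"float32 sum: {a32:.6f}  float64 sum: {a64:.6f}  drift: {err:.6e}")
```

In a large network, analogous rounding errors accumulate across many weights, layers, and training steps, which is the mechanism the paper associates with fluctuations and loss of significance.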

Citation (APA)

Netay, I. V. (2022). Influence of Digital Fluctuations on Behavior of Neural Networks. Indian Journal of Artificial Intelligence and Neural Networking, 3(1), 1–7. https://doi.org/10.54105/ijainn.a1061.123122
