Effects of Skip-Connection in ResNet and Batch-Normalization on Fisher Information Matrix

  • Furusho, Y.
  • Ikeda, K.

Abstract

Deep neural networks such as the multi-layer perceptron (MLP) have been studied intensively, and new techniques have been introduced for better generalization and faster convergence. Two such techniques are the skip-connections between layers used in the ResNet and batch normalization (BN). To clarify the effects of these techniques, we carried out a landscape analysis of the loss function for these networks. The landscape affects the convergence properties, in which the eigenvalues of the Fisher Information Matrix (FIM) play an important role. We therefore calculated the eigenvalues of the FIMs of the MLP, the ResNet, and the ResNet with BN by applying functional analysis to networks with random weights; the MLP case was previously analyzed in the asymptotic regime using the central limit theorem. Our results show that the eigenvalues of the MLP are independent of its depth, that those of the ResNet grow exponentially with depth, and that those of the ResNet with BN grow sub-linearly with depth. This implies that BN allows the ResNet to use a larger learning rate and hence to converge faster than the vanilla ResNet.
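
The practical consequence of these eigenvalue results is that gradient descent is stable only for learning rates below roughly 2/lambda_max of the FIM, so networks whose largest FIM eigenvalue grows with depth must shrink their learning rate accordingly. Below is a minimal sketch, not from the paper, that empirically estimates lambda_max for a random-weight MLP and a ResNet-style network with identity skip-connections. The function names (mlp, resnet, fim_max_eig), the scalar network output, and the He-style initialization are illustrative assumptions; the FIM is approximated by the empirical second moment of per-sample gradients.

    import jax
    import jax.numpy as jnp

    def mlp(params, x):
        # Plain feed-forward network: x <- relu(W x) at every layer.
        for W in params:
            x = jax.nn.relu(W @ x)
        return x.sum()

    def resnet(params, x):
        # Identity skip-connection: x <- x + relu(W x).
        for W in params:
            x = x + jax.nn.relu(W @ x)
        return x.sum()

    def fim_max_eig(apply_fn, params, xs):
        # Largest eigenvalue of the empirical FIM F = E[g g^T], where g is
        # the per-sample gradient of the output w.r.t. all parameters.
        per_sample = jax.vmap(lambda x: jax.grad(apply_fn)(params, x))(xs)
        G = jnp.concatenate(
            [g.reshape(xs.shape[0], -1) for g in per_sample], axis=1)
        F = G.T @ G / xs.shape[0]
        return jnp.linalg.eigvalsh(F)[-1]

    key = jax.random.PRNGKey(0)
    dim, depth, n_samples = 16, 10, 256
    keys = jax.random.split(key, depth + 1)
    # He-style random weights, as in an untrained network.
    params = [jax.random.normal(k, (dim, dim)) * jnp.sqrt(2.0 / dim)
              for k in keys[:depth]]
    xs = jax.random.normal(keys[-1], (n_samples, dim))

    print("MLP    lambda_max:", fim_max_eig(mlp, params, xs))
    print("ResNet lambda_max:", fim_max_eig(resnet, params, xs))
    # Gradient descent is stable only for learning rates below roughly
    # 2 / lambda_max, so rapidly growing eigenvalues in the deep ResNet
    # force a correspondingly small learning rate.

With these settings, increasing depth should leave the MLP's lambda_max roughly constant while the ResNet's grows quickly with depth, mirroring the qualitative behavior the paper derives analytically.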

Citation (APA)

Furusho, Y., & Ikeda, K. (2020). Effects of Skip-Connection in ResNet and Batch-Normalization on Fisher Information Matrix (pp. 341–348). https://doi.org/10.1007/978-3-030-16841-4_35
