Batch normalization is an essential component of state-of-the-art neural network architectures. However, since it introduces many practical issues, much recent research has been devoted to designing normalization-free architectures. In this brief, we show that weight initialization is key to training ResNet-like normalization-free networks. In particular, we propose a slight modification to the summation operation that adds the block output to the skip-connection branch, so that the whole network is correctly initialized. We show that this modified architecture achieves competitive results on CIFAR-10, CIFAR-100, and ImageNet without further regularization or algorithmic modifications.
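The abstract does not spell out the exact form of the modified summation, so the sketch below is only an illustration of the general idea it describes: making each residual block behave like the identity at initialization so that signals propagate correctly without batch normalization. It uses one well-known scheme of this kind, a learnable scalar alpha initialized to zero that scales the residual branch (in the style of SkipInit/ReZero); the class name NormFreeBasicBlock and all implementation details are assumptions and may differ from the paper's actual proposal.

```python
import torch
import torch.nn as nn


class NormFreeBasicBlock(nn.Module):
    """Hypothetical normalization-free residual block.

    The residual branch is scaled by a learnable scalar `alpha`
    initialized to zero, so at initialization the block reduces to the
    skip connection and the whole network starts as a well-behaved
    (near-identity) map without any batch normalization layers.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=True)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=True)
        self.relu = nn.ReLU(inplace=False)
        # Learnable scale on the residual branch, zero at initialization.
        self.alpha = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        # Modified summation: skip-connection branch plus scaled block output.
        return self.relu(x + self.alpha * out)


if __name__ == "__main__":
    block = NormFreeBasicBlock(channels=64)
    x = torch.randn(2, 64, 32, 32)
    y = block(x)
    # With alpha == 0, the block is relu(x) at initialization.
    print(torch.allclose(y, torch.relu(x)))  # True
```

Since alpha starts at zero, gradients can still flow into the residual branch while the forward signal is initially carried entirely by the skip connection, which is one common way such schemes keep deep normalization-free ResNets trainable.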
Civitelli, E., Sortino, A., Lapucci, M., Bagattini, F., & Galvan, G. (2023). A Robust Initialization of Residual Blocks for Effective ResNet Training Without Batch Normalization. IEEE Transactions on Neural Networks and Learning Systems. https://doi.org/10.1109/TNNLS.2023.3325541