Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks


Abstract

Small neural networks with a constrained number of trainable parameters can be suitable, resource-efficient candidates for many simple tasks where excessively large models are currently used. However, such models face several problems during the learning process, mainly due to the redundancy of individual neurons, which leads to sub-optimal accuracy or the need for additional training steps. Here, we explore the diversity of neurons within the hidden layer during the learning process and analyze how neuron diversity affects the predictions of the model. We then introduce several techniques to dynamically reinforce diversity between neurons during training. These decorrelation techniques improve learning at early stages and occasionally help to overcome local minima faster. Additionally, we describe a novel weight initialization method that yields decorrelated, yet stochastic, initial weights for fast and efficient neural network training. In our experiments, decorrelated weight initialization shows about a 40% relative increase in test accuracy during the first 5 epochs.
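To make the idea of dynamically reinforcing diversity concrete, the sketch below adds a decorrelation penalty on a hidden layer's weight matrix to the task loss. This is a minimal illustration assuming a pairwise cosine-similarity penalty between neuron weight vectors; the penalty weight lambda_div and the name model.hidden are hypothetical, and the paper's exact formulation may differ.

    import torch

    def decorrelation_penalty(weight: torch.Tensor) -> torch.Tensor:
        """Mean squared pairwise cosine similarity between hidden neurons.

        Each row of `weight` is one neuron's incoming weight vector; driving
        the off-diagonal entries of the cosine Gram matrix toward zero pushes
        neurons apart, i.e. reinforces diversity between them.
        """
        w = weight / weight.norm(dim=1, keepdim=True).clamp_min(1e-8)
        gram = w @ w.t()                                    # pairwise cosine similarities
        n = gram.size(0)
        off_diag = gram - torch.eye(n, device=gram.device)  # drop self-similarity
        return off_diag.pow(2).sum() / (n * (n - 1))

    # Hypothetical usage inside a training step (names are illustrative):
    # loss = task_loss + lambda_div * decorrelation_penalty(model.hidden.weight)

Because the penalty is differentiable in the weights, it can simply be added to the task loss and minimized jointly by the optimizer, which dynamically discourages redundant neurons throughout training.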
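The decorrelated, yet stochastic, weight initialization can be sketched in a similar spirit. The version below orthogonalizes a random Gaussian matrix with a QR decomposition and then re-randomizes the row norms; the QR step and the He-style scale are our assumptions for illustration, not necessarily the authors' exact procedure.

    import torch

    def decorrelated_init(out_features: int, in_features: int) -> torch.Tensor:
        """Decorrelated yet stochastic initial weights (illustrative sketch).

        Rows (one per neuron) are made mutually orthogonal via QR, then each
        row is given the norm of an independent Gaussian draw so the overall
        scale stays random. Assumes out_features <= in_features so that full
        row orthogonality is possible.
        """
        std = (2.0 / in_features) ** 0.5                  # He-style scale (assumption)
        q, _ = torch.linalg.qr(torch.randn(in_features, out_features))
        w = q.t()                                         # orthonormal rows
        row_norms = (std * torch.randn(out_features, in_features)).norm(dim=1, keepdim=True)
        return w * row_norms

    # Hypothetical usage on a hidden layer:
    # layer = torch.nn.Linear(128, 64)
    # with torch.no_grad():
    #     layer.weight.copy_(decorrelated_init(64, 128))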

Citation (APA)

Kovalenko, A., Kordík, P., & Friedjungová, M. (2021). Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12892 LNCS, pp. 235–247). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-86340-1_19
