Knowledge sharing for population based neural network training

2 citations · 9 Mendeley readers

Abstract

Finding good hyper-parameter settings for training neural networks is challenging: the optimal settings can change during the training phase and also depend on random factors such as weight initialization or random batch sampling. Most state-of-the-art methods for adapting these settings are either static (e.g., learning rate schedules) or dynamic (e.g., the Adam optimizer), but they adjust only some of the hyper-parameters and do not address the initialization problem. In this paper, we extend population based training, an asynchronous evolutionary algorithm that modifies all given hyper-parameters during training and lets individuals inherit weights. We introduce a novel knowledge-distillation scheme in which only the best individuals of the population are allowed to share part of their knowledge about the training data with the whole population. This embraces the randomness between models rather than avoiding it, because the resulting diversity of models is important for the population's evolution. Our experiments on MNIST, fashionMNIST, and EMNIST (MNIST split) with two classic model architectures show significant improvements in convergence and model accuracy compared to the original algorithm. In addition, we conduct experiments on EMNIST (balanced split) with a ResNet and a WideResNet architecture to cover more complex architectures and data as well.
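To make the idea concrete, the following NumPy sketch combines a PBT-style exploit/explore loop with a knowledge-sharing step in which only the best member's soft predictions serve as a distillation target for the rest of the population. Everything here is an assumption for illustration: the toy logistic model, the population size, the learning-rate perturbation, and the distil_weight blending rule are not the authors' implementation, which trains real networks on MNIST-family data.

# Minimal, illustrative sketch: PBT with a knowledge-sharing (distillation)
# step. All names, the toy model, and the exploit/explore/distil rules are
# assumptions, not the paper's actual code.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data standing in for MNIST-style inputs.
X = rng.normal(size=(512, 20))
true_w = rng.normal(size=20)
y = (X @ true_w > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Member:
    """One population member: a logistic model plus its hyper-parameters."""
    def __init__(self):
        self.w = rng.normal(scale=0.1, size=20)   # random init differs per member
        self.lr = 10 ** rng.uniform(-3, -1)       # hyper-parameter PBT adapts

    def step(self, X, y, soft_targets=None, distil_weight=0.3):
        p = sigmoid(X @ self.w)
        target = y
        if soft_targets is not None:
            # Knowledge sharing: blend hard labels with the elite's soft
            # predictions instead of copying weights, preserving diversity.
            target = (1 - distil_weight) * y + distil_weight * soft_targets
        grad = X.T @ (p - target) / len(y)
        self.w -= self.lr * grad

    def accuracy(self, X, y):
        return float(((sigmoid(X @ self.w) > 0.5) == y).mean())

population = [Member() for _ in range(8)]

for generation in range(20):
    # Each member trains independently for a few steps.
    for m in population:
        for _ in range(10):
            m.step(X, y)

    population.sort(key=lambda m: m.accuracy(X, y), reverse=True)
    elite, worst = population[0], population[-1]

    # Exploit/explore as in standard PBT: the worst member inherits the
    # elite's weights and perturbs its hyper-parameters.
    worst.w = elite.w.copy()
    worst.lr = elite.lr * rng.choice([0.8, 1.2])

    # Knowledge sharing: the remaining members distil from the elite's
    # soft outputs while keeping their own weights and hyper-parameters.
    soft = sigmoid(X @ elite.w)
    for m in population[1:-1]:
        m.step(X, y, soft_targets=soft)

print("best accuracy:", population[0].accuracy(X, y))

The design point the sketch tries to capture is that sharing soft predictions, unlike copying weights, nudges every member toward the elite's knowledge of the data without collapsing the population onto a single model.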

Cite

APA

Oehmcke, S., & Kramer, O. (2018). Knowledge sharing for population based neural network training. In Lecture Notes in Computer Science (Vol. 11117, pp. 258–269). Springer. https://doi.org/10.1007/978-3-030-00111-7_22
