Convergence Analysis of PSO for Hyper-Parameter Selection in Deep Neural Networks


Abstract

Deep Neural Networks (DNNs) have gained enormous research attention since they consistently outperform other state-of-the-art methods in a plethora of machine learning tasks. However, their performance strongly depends on the DNN hyper-parameters, which are commonly tuned by experienced practitioners. Recently, we introduced Particle Swarm Optimization (PSO) and parallel PSO techniques to automate this process. In this work, we theoretically and experimentally investigate the convergence capabilities of these algorithms. The experiments were performed for several DNN architectures (both gradually augmented and hand-crafted) using two challenging multi-class benchmark datasets: MNIST and CIFAR-10.
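To illustrate the idea behind PSO-driven hyper-parameter selection, the following is a minimal, self-contained sketch of the standard PSO update rule applied to a two-dimensional hyper-parameter space (log learning rate and dropout rate). It is not the authors' implementation: the surrogate objective `surrogate_loss` is a hypothetical stand-in for validation loss (in practice each evaluation would train and validate a DNN), and all parameter names and coefficient values (`w`, `c1`, `c2`) are illustrative defaults.

```python
import random

def pso_search(objective, bounds, n_particles=10, n_iters=30,
               w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimize `objective` over box `bounds` with canonical PSO.

    bounds: list of (lo, hi) per hyper-parameter dimension.
    Returns (best_position, best_value).
    """
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize particle positions uniformly inside the box, velocities at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # per-particle best positions
    pbest_val = [objective(p) for p in pos]          # per-particle best values
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm-wide best

    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive pull (own best) + social pull (swarm best).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp the updated position to the feasible box.
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

def surrogate_loss(params):
    # Hypothetical smooth proxy for validation loss, minimized at
    # log10(learning rate) = -2 and dropout = 0.5.
    log_lr, dropout = params
    return (log_lr + 2.0) ** 2 + (dropout - 0.5) ** 2

# Search over log10(lr) in [-4, -1] and dropout in [0, 0.9].
bounds = [(-4.0, -1.0), (0.0, 0.9)]
best, best_val = pso_search(surrogate_loss, bounds)
```

On this convex surrogate the swarm converges quickly; with a real DNN objective each `objective(p)` call is a full train/validate cycle, which is why the authors' parallel PSO variants matter in practice.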

Citation (APA)

Nalepa, J., & Lorenzo, P. R. (2018). Convergence analysis of PSO for hyper-parameter selection in deep neural networks. In Lecture Notes on Data Engineering and Communications Technologies (Vol. 13, pp. 284–295). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-319-69835-9_27
