Optimizing hyperparameters in a Convolutional Neural Network (CNN) is a tedious problem for many researchers and practitioners. To obtain hyperparameters that perform well, experts must manually configure a set of hyperparameter choices. The best results of this manual configuration are then modeled and implemented in the CNN. However, different datasets require different models or combinations of hyperparameters, which can be cumbersome and tedious. To address this, several approaches have been proposed, such as grid search, which is limited to low-dimensional spaces, and random search, which selects trial configurations at random. Optimization methods such as evolutionary algorithms and Bayesian optimization have also been tested, but mainly on the MNIST dataset, which is less costly and requires fewer hyperparameters than CIFAR-10. In this paper, the authors investigate hyperparameter search methods on the CIFAR-10 dataset. Various optimization methods are evaluated and their performance in terms of accuracy is recorded. Although there is no significant difference between the proposed approach and the state of the art on CIFAR-10, the actual potency lies in the hybridization of genetic algorithms with a local search method to optimize both network structure and network training, which, to the best of the authors' knowledge, has not yet been reported.
Aszemi, N. M., & Dominic, P. D. D. (2019). Hyperparameter optimization in convolutional neural network using genetic algorithms. International Journal of Advanced Computer Science and Applications, 10(6), 269–278. https://doi.org/10.14569/ijacsa.2019.0100638
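The abstract's core idea, hybridizing a genetic algorithm with local search over CNN hyperparameters, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the search space, the hyperparameter names, and the toy fitness function (which stands in for validation accuracy after training a CNN on CIFAR-10) are all assumptions made for demonstration.

```python
import random

# Hypothetical discrete search space for illustration only; the paper's
# actual space covers both network structure and training hyperparameters.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2, 1e-1],
    "batch_size": [32, 64, 128, 256],
    "num_filters": [16, 32, 64, 128],
    "dropout": [0.0, 0.25, 0.5],
}

def fitness(ind):
    # Stand-in for validation accuracy; a real implementation would
    # train and evaluate a CNN here. This toy function rewards matching
    # an arbitrary "good" configuration.
    target = {"learning_rate": 1e-3, "batch_size": 128,
              "num_filters": 64, "dropout": 0.25}
    return sum(ind[k] == v for k, v in target.items()) / len(target)

def random_individual(rng):
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def crossover(a, b, rng):
    # Uniform crossover: each hyperparameter comes from either parent.
    return {k: (a if rng.random() < 0.5 else b)[k] for k in SEARCH_SPACE}

def mutate(ind, rng, rate=0.2):
    out = dict(ind)
    for k, options in SEARCH_SPACE.items():
        if rng.random() < rate:
            out[k] = rng.choice(options)
    return out

def local_search(ind):
    # Hill climbing on one hyperparameter at a time: the local refinement
    # step that hybridizes the GA (a "memetic" algorithm in GA literature).
    best, best_fit = ind, fitness(ind)
    for k, options in SEARCH_SPACE.items():
        for v in options:
            cand = dict(best)
            cand[k] = v
            f = fitness(cand)
            if f > best_fit:
                best, best_fit = cand, f
    return best

def ga_search(pop_size=10, generations=5, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # truncation selection
        children = [
            mutate(crossover(rng.choice(elite), rng.choice(elite), rng), rng)
            for _ in range(pop_size - len(elite))
        ]
        # Refine each child with local search before the next generation.
        pop = elite + [local_search(c) for c in children]
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = ga_search()
    print(best, fitness(best))
```

Because the toy fitness is separable across hyperparameters, the local-search step alone can solve it; with a real training-based objective, the GA's crossover and mutation supply the global exploration that hill climbing lacks, which is the motivation for the hybrid.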