In recent years, the shift from hand-crafted design of Convolutional Neural Networks (CNNs) to automated approaches (AutoML) has garnered much attention. However, most of this work has concentrated on generating state-of-the-art (SOTA) architectures that set new standards of accuracy. In this paper, we use the NSGA-II algorithm for multi-objective optimization to optimize the size/accuracy trade-off in CNNs. This approach is inspired by the need for simple, effective, mobile-sized architectures that can easily be retrained on any dataset. The optimization is carried out using a Grammatical Evolution approach which, implemented alongside NSGA-II, automatically generates valid network topologies that best optimize the size/accuracy trade-off. Furthermore, we investigate how the algorithm responds to an increase in the size of the search space, moving from strict topology optimization (number of layers, filter size, number of kernels, etc.) to an expanded search space that also includes variations in other hyper-parameters such as the type of optimizer, dropout rate, batch size, and learning rate, amongst others.
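The core selection idea in NSGA-II is non-dominated sorting over the competing objectives. A minimal sketch of that idea, assuming each evolved CNN is scored on two objectives (model size, error rate) with hypothetical values:

```python
def dominates(a, b):
    """True if candidate a is no worse than b on every objective
    (size, error) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def pareto_front(candidates):
    """Return the non-dominated candidates: the first NSGA-II front,
    i.e. the best available size/accuracy trade-offs."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]


# Hypothetical (size in MB, error in %) scores for four evolved topologies.
scores = [(4.0, 12.0), (16.0, 8.0), (8.0, 9.0), (20.0, 9.5)]
print(pareto_front(scores))
# (20.0, 9.5) is dropped: it is dominated by (16.0, 8.0) on both objectives.
```

NSGA-II proper then ranks successive fronts and breaks ties by crowding distance to keep the population spread along the trade-off curve; the sketch above only shows the dominance test that drives that ranking.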
Cetto, T., Byrne, J., Xu, X., & Moloney, D. (2020). Size/Accuracy Trade-Off in Convolutional Neural Networks: An Evolutionary Approach (pp. 17–26). https://doi.org/10.1007/978-3-030-16841-4_3