Size/Accuracy Trade-Off in Convolutional Neural Networks: An Evolutionary Approach

  • Cetto T
  • Byrne J
  • Xu X
  • et al.

Abstract

In recent years, the shift from hand-crafted design of Convolutional Neural Networks (CNNs) to automatic approaches (AutoML) has garnered much attention. However, most of this work has concentrated on generating state-of-the-art (SOTA) architectures that set new standards of accuracy. In this paper, we use the NSGA-II algorithm for multi-objective optimization to optimize the size/accuracy trade-off in CNNs. This approach is motivated by the need for simple, effective, mobile-sized architectures that can easily be retrained on any dataset. The optimization is carried out using a Grammatical Evolution approach, which, implemented alongside NSGA-II, automatically generates valid network topologies that best balance the size/accuracy trade-off. Furthermore, we investigate how the algorithm responds to an increase in the size of the search space, moving from strict topology optimization (number of layers, filter size, number of kernels, etc.) to a search space that also includes variations in other hyper-parameters such as the optimizer type, dropout rate, batch size, and learning rate, amongst others.
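The core of NSGA-II is ranking candidates into Pareto fronts over the competing objectives — here, model size and error rate, both to be minimized. The sketch below illustrates that non-dominated sorting step on hypothetical candidate networks; it is an illustration of the general technique, not the paper's implementation, and the candidate values are invented.

```python
# Minimal sketch of NSGA-II-style non-dominated sorting for a
# size/accuracy trade-off. Each hypothetical candidate is a pair of
# objectives to minimize: (parameters in millions, top-1 error %).

def dominates(a, b):
    """a dominates b if a is no worse on every objective and strictly
    better on at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated_sort(points):
    """Group points into successive Pareto fronts (front 0 = best)."""
    fronts = []
    remaining = list(points)
    while remaining:
        # A point belongs to the current front if nothing left dominates it.
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q is not p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Invented example candidates: (size, error)
candidates = [(5.0, 10.0), (1.2, 12.5), (0.8, 18.0),
              (6.0, 9.5), (1.5, 12.0), (2.0, 13.0)]
fronts = non_dominated_sort(candidates)
# (2.0, 13.0) is dominated by (1.5, 12.0), so it falls into front 1;
# the other five points are mutually non-dominated and form front 0.
```

In the full algorithm, selection then prefers candidates on earlier fronts, with a crowding-distance measure breaking ties within a front to preserve spread along the size/accuracy trade-off curve.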

APA

Cetto, T., Byrne, J., Xu, X., & Moloney, D. (2020). Size/Accuracy Trade-Off in Convolutional Neural Networks: An Evolutionary Approach (pp. 17–26). https://doi.org/10.1007/978-3-030-16841-4_3
