A fast evolutionary learning to optimize CNN

Abstract

Deep neural networks (DNNs) achieve strong performance across many applications. The convolutional neural network (CNN) is one of the classic DNN architectures, and many variants have been proposed, such as DenseNet, GoogLeNet, and ResNet. For different tasks, a particular CNN structure may offer an advantage, but designing an effective CNN model for a practical task remains difficult. In this paper, we formulate the architecture optimization of CNNs as an optimization problem and design a Genetic network programming based Fast evolutionary learning (GNP-FEL) method to optimize CNNs. GNP-FEL rests on three main ideas: first, GNP is adopted to optimize the CNN architecture and hyperparameters, which allows diverse network structures to be built and lets network parameters self-evolve; second, a multi-objective optimization is designed that balances model performance against structural compactness; third, a novel incremental training method is proposed to train offspring CNN models in GNP, which sharply reduces time complexity. Experiments validate that GNP-FEL can quickly evolve a CNN classifier with a sufficiently compact architecture, and that the classifier achieves classification performance comparable to state-of-the-art CNN models.
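The abstract's core idea of evolving architectures under two objectives (model performance and structural compactness) can be illustrated with a toy evolutionary loop. This is a minimal sketch, not the paper's GNP-FEL algorithm: the genome encoding (a list of layer widths), the `accuracy_proxy` surrogate, and all parameter values are illustrative assumptions standing in for actual CNN training and evaluation.

```python
import random

random.seed(0)

# Toy genome: a list of layer widths standing in for a CNN architecture.
def random_genome():
    return [random.choice([16, 32, 64, 128]) for _ in range(random.randint(2, 5))]

def param_count(genome):
    # Compactness objective (to minimize): a rough proxy via total units.
    return sum(genome)

def accuracy_proxy(genome):
    # Hypothetical performance surrogate (to maximize): bigger networks help,
    # with diminishing returns. A real system would train and validate here.
    return 1.0 - 1.0 / (1.0 + 0.01 * sum(genome))

def dominates(a, b):
    # Pareto dominance: a is no worse on both objectives and better on one.
    acc_a, size_a = accuracy_proxy(a), param_count(a)
    acc_b, size_b = accuracy_proxy(b), param_count(b)
    return acc_a >= acc_b and size_a <= size_b and (acc_a > acc_b or size_a < size_b)

def mutate(genome):
    # Resample one layer's width to produce an offspring architecture.
    child = genome[:]
    child[random.randrange(len(child))] = random.choice([16, 32, 64, 128])
    return child

def evolve(generations=30, pop_size=10):
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        children = [mutate(random.choice(pop)) for _ in range(pop_size)]
        combined = pop + children
        # Keep the non-dominated front first, then fill by the accuracy proxy.
        front = [g for g in combined if not any(dominates(o, g) for o in combined)]
        rest = sorted((g for g in combined if g not in front),
                      key=accuracy_proxy, reverse=True)
        pop = (front + rest)[:pop_size]
    return pop

pareto = evolve()
```

The surviving population contains architectures trading off the two objectives; GNP-FEL additionally evolves structures via genetic network programming and cuts the cost of evaluating `accuracy_proxy` through incremental training of offspring models.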

Citation (APA)

Chen, J., Lin, X., Gao, S., Xiong, H., Zhang, L., Liu, Y., & Xuan, Q. (2020). A fast evolutionary learning to optimize CNN. Chinese Journal of Electronics, 29(6), 1061–1073. https://doi.org/10.1049/cje.2020.09.007
