Evolution of activation functions for deep learning-based image classification

Abstract

Activation functions (AFs) play a pivotal role in the performance of neural networks. The Rectified Linear Unit (ReLU) is currently the most commonly used AF. Several replacements for ReLU have been suggested, but improvements have proven inconsistent. Some AFs exhibit better performance for specific tasks, but it is hard to know a priori how to select the appropriate one(s). Studying both standard fully connected neural networks (FCNs) and convolutional neural networks (CNNs), we propose a novel, three-population, co-evolutionary algorithm to evolve AFs, and compare it to four other methods, both evolutionary and non-evolutionary. Tested on four datasets (MNIST, FashionMNIST, KMNIST, and USPS), coevolution proves to be a performant algorithm for finding good AFs and AF architectures.
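The abstract outlines a three-population co-evolutionary search over AFs without giving implementation details. Below is a minimal, illustrative Python/PyTorch sketch of cooperative coevolution in that spirit: each population evolves the AF for one hidden layer of a small MLP, each candidate is evaluated alongside the current best members of the other populations, and fitness is accuracy on a toy two-blob task. The composition-of-primitives encoding, the "best collaborator" credit assignment, and all hyperparameters here are assumptions for illustration, not the paper's actual design.

```python
import random
import torch
import torch.nn as nn

# Assumed AF encoding: a genome is a short sequence of primitive functions
# applied in order. The paper's actual representation is not given here.
PRIMITIVES = {
    "relu": torch.relu,
    "tanh": torch.tanh,
    "sin": torch.sin,
    "sigmoid": torch.sigmoid,
    "identity": lambda x: x,
}

def make_af(genome):
    def af(x):
        for name in genome:
            x = PRIMITIVES[name](x)
        return x
    return af

def random_genome(length=2):
    return [random.choice(list(PRIMITIVES)) for _ in range(length)]

def fitness(genomes, X, y, epochs=30):
    # Train a fresh 3-hidden-layer MLP; hidden layer i uses the AF
    # evolved by population i, then report training accuracy.
    afs = [make_af(g) for g in genomes]
    layers = [nn.Linear(2, 16), nn.Linear(16, 16),
              nn.Linear(16, 16), nn.Linear(16, 2)]
    params = [p for l in layers for p in l.parameters()]
    opt = torch.optim.Adam(params, lr=0.01)
    for _ in range(epochs):
        h = X
        for l, af in zip(layers[:-1], afs):
            h = af(l(h))
        loss = nn.functional.cross_entropy(layers[-1](h), y)
        opt.zero_grad(); loss.backward(); opt.step()
    with torch.no_grad():
        h = X
        for l, af in zip(layers[:-1], afs):
            h = af(l(h))
        return (layers[-1](h).argmax(1) == y).float().mean().item()

# Toy data: two Gaussian blobs (stand-in for the image datasets).
torch.manual_seed(0); random.seed(0)
X = torch.cat([torch.randn(100, 2) + 2, torch.randn(100, 2) - 2])
y = torch.cat([torch.zeros(100, dtype=torch.long),
               torch.ones(100, dtype=torch.long)])

# Three populations, one per hidden layer. Candidates are scored in a
# "team" with the current best of the other two populations -- a standard
# cooperative-coevolution scheme, assumed here for illustration.
pops = [[random_genome() for _ in range(6)] for _ in range(3)]
best = [p[0] for p in pops]
for gen in range(5):
    for i in range(3):
        scored = []
        for g in pops[i]:
            team = [g if j == i else best[j] for j in range(3)]
            scored.append((fitness(team, X, y), g))
        scored.sort(key=lambda t: t[0], reverse=True)
        best[i] = scored[0][1]
        survivors = [g for _, g in scored[:3]]
        # Refill the population by point-mutating random survivors.
        pops[i] = survivors + [
            [random.choice(list(PRIMITIVES)) if random.random() < 0.3 else name
             for name in random.choice(survivors)]
            for _ in range(3)
        ]
print("best AF genome per hidden layer:", best)
```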

Citation (APA)

Lapid, R., & Sipper, M. (2022). Evolution of activation functions for deep learning-based image classification. In GECCO 2022 Companion - Proceedings of the 2022 Genetic and Evolutionary Computation Conference (pp. 2113–2121). Association for Computing Machinery, Inc. https://doi.org/10.1145/3520304.3533949
