Improved Convolutional Neural Network Based on Fast Exponentially Linear Unit Activation Function

73 citations · 60 Mendeley readers

This article is free to access.

Abstract

Activation functions play an increasingly important role in deep convolutional neural networks. Traditional activation functions suffer from problems such as vanishing gradients, neuron death, and output offset. To address these problems, we propose a new activation function, the Fast Exponentially Linear Unit (FELU), which speeds up the exponential linear calculation and reduces network running time. FELU combines the advantages of the Rectified Linear Unit (ReLU) and the Exponential Linear Unit (ELU), giving it better classification accuracy and faster computation. We compare five traditional activation functions (ReLU, ELU, SLU, MPELU, and TReLU) against our new activation function on the CIFAR-10, CIFAR-100, and GTSRB data sets. Experiments show that FELU not only speeds up the exponential calculation, reducing the running time of the convolutional neural network, but also effectively enhances the network's noise robustness and improves classification accuracy.
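For reference, the two well-known baselines the abstract builds on, ReLU and ELU, can be sketched as below. FELU's own formula is given in the full paper and is not reproduced here; this is only a minimal illustration of the standard definitions.

```python
import math

def relu(x: float) -> float:
    """Rectified Linear Unit: max(0, x)."""
    return max(0.0, x)

def elu(x: float, alpha: float = 1.0) -> float:
    """Exponential Linear Unit: x for x > 0, alpha * (e^x - 1) otherwise.

    The exp() call on the negative branch is the costly part that
    FELU aims to compute faster, per the abstract.
    """
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

print(relu(-2.0), relu(3.0))   # 0.0 3.0
print(round(elu(-2.0), 4))     # -0.8647
```

Unlike ReLU, ELU has nonzero output and gradient for negative inputs, which mitigates the neuron-death and output-offset issues the abstract mentions, at the cost of an exponential evaluation.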

Citation (APA)

Qiumei, Z., Dan, T., & Fenghua, W. (2019). Improved Convolutional Neural Network Based on Fast Exponentially Linear Unit Activation Function. IEEE Access, 7, 151359–151367. https://doi.org/10.1109/ACCESS.2019.2948112
