Improving Convolutional Neural Network Expression via Difference Exponentially Linear Units


Abstract

Convolutional Neural Networks (CNNs) have been applied to various tasks with great success. Adding an activation function is an important way to introduce nonlinearity into convolutional neural networks. Most commonly used activation functions apply some form of negative feedback to negative inputs; however, some researchers have recently proposed positive-feedback treatments of negative inputs, such as Concatenated Rectified Linear Units (CReLU) and Linearly Scaled Hyperbolic Tangent (LiSHT), and achieved better performance. To explore this idea further, we propose a new activation function called the Difference Exponentially Linear Unit (DELU). DELU can provide either positive or negative feedback depending on the value of the negative input. Our experimental results on commonly used datasets such as Fashion-MNIST, CIFAR-10, and ImageNet show that DELU outperforms six other activation functions: ReLU, Leaky ReLU, ELU, SELU, Swish, and SERLU.
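The abstract does not give DELU's closed form. As a rough illustration only, the sketch below shows one way an activation built from a difference of exponential terms can yield positive feedback for small-magnitude negative inputs and negative feedback for larger ones, which is the behavior described above. The class name, the parameters alpha and beta, and the exact formula are assumptions for illustration, not the authors' definition of DELU.

import torch
import torch.nn as nn

class DifferenceExpActivation(nn.Module):
    """Illustrative difference-of-exponentials activation (not the paper's exact DELU).

    For x >= 0 the output is the identity. For x < 0 the output is
    alpha*(exp(x) - 1) - beta*(exp(2x) - 1): near zero this is slightly
    positive (positive feedback), while for strongly negative inputs it
    tends to beta - alpha < 0 (negative feedback), so the sign of the
    feedback depends on how negative the input is.
    """

    def __init__(self, alpha: float = 1.0, beta: float = 0.6):
        super().__init__()
        self.alpha = alpha  # weight of the slower exponential term
        self.beta = beta    # weight of the faster exponential term

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        neg = self.alpha * (torch.exp(x) - 1.0) - self.beta * (torch.exp(2.0 * x) - 1.0)
        return torch.where(x >= 0, x, neg)

# Example: small negative inputs get a positive response, large negative inputs a negative one.
act = DifferenceExpActivation()
print(act(torch.tensor([-3.0, -0.1, 0.5])))  # roughly [-0.36, 0.01, 0.50]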

Citation (APA)

Hu, Z., Huang, H., Ran, Q., & Yuan, M. (2020). Improving Convolutional Neural Network Expression via Difference Exponentially Linear Units. In Journal of Physics: Conference Series (Vol. 1651). IOP Publishing Ltd. https://doi.org/10.1088/1742-6596/1651/1/012163
