Rectified Exponential Units for Convolutional Neural Networks

34 Citations · 27 Mendeley Readers

This article is free to access.

Abstract

The rectified linear unit (ReLU) plays an important role in today's convolutional neural networks (CNNs). In this paper, we propose a novel activation function called the Rectified Exponential Unit (REU). Inspired by two recently proposed activation functions, the Exponential Linear Unit (ELU) and Swish, the REU combines the advantage of a flexible exponent with a multiplicative function form. We further propose the Parametric REU (PREU) to increase the expressive power of the REU. Experiments with three classical CNN architectures, LeNet-5, Network in Network, and the Residual Network (ResNet), on benchmarks of various scales, including Fashion-MNIST, CIFAR-10, CIFAR-100, and Tiny ImageNet, demonstrate that REU and PREU improve on other activation functions. With ResNet, REU yields relative error improvements over ReLU of 7.74% on CIFAR-10 and 6.08% on CIFAR-100, while PREU yields improvements of 9.24% and 9.32%, respectively. Finally, we use different PREU variants in the residual unit to obtain more stable results.
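For readers who want to experiment with these activations, the minimal PyTorch sketch below shows one plausible implementation. The abstract does not state the exact formulas, so this sketch assumes the commonly cited form f(x) = x for x > 0 and f(x) = x·e^x for x ≤ 0, with PREU scaling the negative branch by a learnable parameter alpha; the precise parameterization should be checked against the equations in the paper.

```python
import torch
import torch.nn as nn


class REU(nn.Module):
    """Rectified Exponential Unit (sketch, assumed form).

    Assumed: f(x) = x for x > 0, and f(x) = x * exp(x) for x <= 0,
    i.e. the identity on the positive side and a multiplicative
    exponential term on the negative side.
    """

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x > 0, x, x * torch.exp(x))


class PREU(nn.Module):
    """Parametric REU (sketch): negative branch scaled by a learnable alpha.

    The parameter placement is an assumption for illustration only.
    """

    def __init__(self, alpha: float = 1.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(float(alpha)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x > 0, x, self.alpha * x * torch.exp(x))


if __name__ == "__main__":
    # Quick sanity check on a small tensor.
    x = torch.linspace(-3, 3, 7)
    print(REU()(x))
    print(PREU(alpha=0.5)(x))
```

Either module can be dropped into a CNN wherever an `nn.ReLU()` would normally appear, which is how the paper evaluates the activations in LeNet-5, Network in Network, and ResNet.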

Cite

CITATION STYLE

APA

Ying, Y., Su, J., Shan, P., Miao, L., Wang, X., & Peng, S. (2019). Rectified Exponential Units for Convolutional Neural Networks. IEEE Access, 7, 101633–101640. https://doi.org/10.1109/ACCESS.2019.2928442
