Interval type-2 fuzzy systems as deep neural network activation functions

Abstract

In this paper, we propose a novel activation function, namely, the Interval Type-2 (IT2) Fuzzy Rectifying Unit (FRU), to improve the performance of Deep Neural Networks (DNNs). The IT2-FRU can generate linear or sophisticated activation functions simply by tuning the size of the footprint of uncertainty of its IT2 Fuzzy Sets. The IT2-FRU also alleviates the vanishing gradient problem and has a fast convergence rate, since it pushes the mean activation toward zero by allowing negative outputs. To test the performance of the IT2-FRU, comparative experimental studies are performed on the CIFAR-10 dataset, where the IT2-FRU is compared with widely used conventional activation functions. Experimental results show that the IT2-FRU significantly speeds up learning and outperforms the other evaluated activation functions.
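The abstract only sketches the idea at a high level. Below is a minimal, hypothetical PyTorch sketch of how a two-rule, single-input interval type-2 TSK unit with a Nie-Tan style type reduction could serve as an element-wise activation. The membership functions, the rule consequents, the leak slope alpha, and the way the footprint of uncertainty (FOU) is parameterized are all illustrative assumptions, not the authors' exact IT2-FRU formulation.

```python
# Hypothetical illustration only: the membership functions, rule consequents, and
# FOU parameterization below are assumptions, not the paper's exact IT2-FRU.
import torch
import torch.nn as nn

class IT2FuzzyActivation(nn.Module):
    """Element-wise activation built from a two-rule interval type-2 TSK system."""

    def __init__(self, fou: float = 0.3, alpha: float = 0.1):
        super().__init__()
        # fou: size of the footprint of uncertainty (fou = 0 collapses to a type-1 unit).
        self.fou = nn.Parameter(torch.tensor(float(fou)))
        self.alpha = alpha  # slope of the "negative" rule consequent

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        fou = torch.clamp(self.fou, 0.0, 1.0)
        # Upper/lower membership of the "x is Positive" antecedent; the lower MF is a
        # height-scaled copy of the upper MF, so fou directly controls the FOU size.
        upper = torch.sigmoid(x)
        lower = (1.0 - fou) * upper
        # Nie-Tan style type reduction: average the bounds of each firing interval.
        mu_pos = 0.5 * (upper + lower)
        mu_neg = 1.0 - mu_pos  # complementary "x is Negative" firing strength
        # Two TSK rule consequents: "Positive" -> x, "Negative" -> alpha * x, so
        # negative inputs yield negative outputs, pushing the mean activation toward zero.
        return mu_pos * x + mu_neg * self.alpha * x
```

In a CIFAR-10 convolutional network, such a unit would simply replace each nn.ReLU() layer, with fou either left trainable or fixed as a hyperparameter; this is only a sketch of the plug-in pattern, not a reproduction of the paper's experiments.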

Cite

APA

Beke, A., & Kumbasar, T. (2020). Interval type-2 fuzzy systems as deep neural network activation functions. In Proceedings of the 11th Conference of the European Society for Fuzzy Logic and Technology, EUSFLAT 2019 (pp. 267–273). Atlantis Press. https://doi.org/10.2991/eusflat-19.2019.39
