A Novel Activation Function of Deep Neural Network

  • Xiangyang L
  • Xing Q
  • Han Z
  • Feng C

Abstract

In deep neural networks, the activation function is an important component. The most popular activation functions at present are Sigmoid, Sin, the rectified linear unit (ReLU), and several ReLU variants, but each has its own weaknesses. To improve network fitting and generalization ability, a new activation function, TSin, is designed. The basic design idea is to rotate the Sin function 45° counterclockwise and then fine-tune it so that it acquires several properties desirable in an activation function, such as nonlinearity, global differentiability, non-saturation, zero-centered output, monotonicity, and the quasi identity transformation property. The TSin function is first derived theoretically; three experiments are then designed to test its performance. The results show that, compared with several popular activation functions, TSin has advantages in training stability, convergence speed, and convergence precision. The study of TSin not only provides a new choice of activation function for deep learning but also suggests a new approach to activation function design.
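The abstract gives only the geometric idea behind TSin, and the fine-tuning step is not specified, so the following is a minimal numerical sketch of the raw rotation only: rotating the point (t, sin t) by 45° counterclockwise gives x(t) = (t − sin t)/√2 and y(t) = (t + sin t)/√2, and since x(t) is monotone increasing, the rotated curve is still the graph of a function that can be evaluated by root finding. The function name tsin and the bisection tolerance below are illustrative, not from the paper.

import numpy as np

def tsin(x, tol=1e-10):
    """Evaluate the 45-degree-rotated sine curve at scalar x (sketch only)."""
    # Solve x(t) = (t - sin(t))/sqrt(2) = x for t. Since |sin t| <= 1,
    # the root lies in [sqrt(2)*x - 2, sqrt(2)*x + 2], and x(t) is
    # monotone (x'(t) = (1 - cos t)/sqrt(2) >= 0), so bisection is safe.
    g = lambda t: (t - np.sin(t)) / np.sqrt(2.0) - x
    lo, hi = np.sqrt(2.0) * x - 2.0, np.sqrt(2.0) * x + 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    t = 0.5 * (lo + hi)
    # Output is the rotated y-coordinate; it stays within sqrt(2) of x,
    # consistent with the quasi identity transformation property above.
    return (t + np.sin(t)) / np.sqrt(2.0)

print(tsin(0.0))  # ~0.0: the curve passes through the origin
print(tsin(5.0))  # ~6.39: within sqrt(2) of the identity

Note that this raw rotation has vertical tangents wherever sin has slope 1 (at t = 2kπ), so it is not yet globally differentiable as a function of x; presumably the paper's fine-tuning step addresses this, and the sketch illustrates only the rotation itself.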

Citation (APA)

Xiangyang, L., Xing, Q., Han, Z., & Feng, C. (2023). A Novel Activation Function of Deep Neural Network. Scientific Programming, 2023, 1–12. https://doi.org/10.1155/2023/3873561
