Backpropagation algorithm with fractional derivatives

  • Gomolka Z

Abstract

The paper presents a neural network model with a novel backpropagation rule that uses a fractional-order derivative mechanism. Using the Grünwald-Letnikov definition of the discrete approximation of the fractional derivative, the author proposes smooth modeling of the transition functions of a single neuron. On this basis, a modified backpropagation algorithm is proposed that uses the fractional derivative mechanism both to model the dynamics of individual neurons and to minimize the error function. The signal flow through the network and the mechanism for smoothly controlling the shape of each neuron's activation function are described. A model of error-function minimization is presented that accounts for changes in the characteristics of individual neurons. For the proposed network, example courses of the learning process are presented, which demonstrate convergence for different shapes of the transition function. The algorithm allows learning to proceed with smooth modification of the transition-function shape without modifying the IT model of the designed neural network. The proposed network model is a new tool that can be used in signal classification tasks.
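To make the abstract's starting point concrete, here is a minimal sketch (my own illustration, not code from the paper) of the Grünwald-Letnikov discrete approximation of a fractional derivative of order α, using the standard recursive form of the generalized binomial weights:

```python
import numpy as np

def gl_fractional_derivative(f, alpha, h):
    """Grünwald-Letnikov discrete approximation of the order-alpha
    fractional derivative of a sampled function.

    f     : 1-D array of samples f(t_0), f(t_1), ... at step h
    alpha : derivative order (alpha = 1 recovers the backward difference)
    Returns an array whose n-th entry approximates D^alpha f(t_n)
    using all samples up to t_n.
    """
    n = len(f)
    # Generalized binomial weights w_k = (-1)^k * C(alpha, k),
    # built with the recursion w_k = w_{k-1} * (k - 1 - alpha) / k.
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    # D^alpha f(t_i) ≈ h^(-alpha) * sum_k w_k * f(t_{i-k})
    out = np.empty(n)
    for i in range(n):
        out[i] = np.dot(w[: i + 1], f[i::-1]) / h**alpha
    return out
```

For integer α the weights truncate and the formula reduces to ordinary backward differences; for non-integer α every past sample contributes, which is the "memory" property the fractional approach exploits. How the paper embeds this operator in the neuron transition functions and the weight-update rule is specified in the full text, not reproduced here.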

Citation (APA)

Gomolka, Z. (2018). Backpropagation algorithm with fractional derivatives. ITM Web of Conferences, 21, 00004. https://doi.org/10.1051/itmconf/20182100004
