On the Global Convergence of a Class of Functional Differential Equations with Applications in Neural Network Theory

Abstract

We study a system of retarded functional differential equations which generalises both the Hopfield neural network model and hybrid network models of the cellular neural network type. Our main results give sufficient conditions for the global asymptotic stability of such systems and are milder than previously known conditions for the hybrid models. When specialised to neural networks, our models allow us to consider several different types of activation functions, including piecewise linear sigmoids and unbounded activations as well as the usual C¹-smooth sigmoids. These issues are vital in applications. We also study neural network models with nonconstant delays r(t). © 1999 Academic Press.
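For orientation, a standard delayed Hopfield-type system of the kind this class of equations generalises can be sketched as follows. The explicit form below is an illustration based on the classical model with a time-varying delay r(t); it is not quoted from the paper, and the symbols (b_i, a_{ij}, c_{ij}, f_j, I_i) are assumed notation.

\dot{x}_i(t) = -b_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\big(x_j(t)\big) + \sum_{j=1}^{n} c_{ij} f_j\big(x_j(t - r(t))\big) + I_i, \qquad i = 1, \dots, n,

where b_i > 0 are decay rates, a_{ij} and c_{ij} are the instantaneous and delayed connection weights, f_j are the activation functions (for instance piecewise linear, unbounded, or C¹-smooth sigmoids), I_i are constant external inputs, and r(t) ≥ 0 is the (possibly nonconstant) delay. Sufficient conditions for global asymptotic stability in this setting are typically stated in terms of the weights, the decay rates, and Lipschitz bounds on the f_j.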

Cite (APA)

Joy, M. (1999). On the Global Convergence of a Class of Functional Differential Equations with Applications in Neural Network Theory. Journal of Mathematical Analysis and Applications, 232(1), 61–81. https://doi.org/10.1006/jmaa.1998.6240
