A comprehensive review of stability analysis of continuous-time recurrent neural networks

Abstract

Stability problems of continuous-time recurrent neural networks have been extensively studied, and a large body of results has been published in the literature. The purpose of this paper is to provide a comprehensive review of the research on stability of continuous-time recurrent neural networks, including Hopfield neural networks, Cohen-Grossberg neural networks, and related models. Since time delay is inevitable in practice, stability results for recurrent neural networks with different classes of time delays are reviewed in detail. For the case of delay-dependent stability, the results on how to deal with constant and variable delays in recurrent neural networks are summarized. The relationships among stability results in different forms, such as algebraic inequality forms, \(M\)-matrix forms, linear matrix inequality forms, and Lyapunov diagonal stability forms, are discussed and compared. Some necessary and sufficient stability conditions for recurrent neural networks without time delays are also discussed. Concluding remarks and future directions for stability analysis of recurrent neural networks are given. © 2014 IEEE.
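To illustrate the \(M\)-matrix form mentioned in the abstract, the following is a minimal sketch (not taken from the paper itself): a classical delay-independent sufficient condition for global asymptotic stability of the delayed Hopfield model \(\dot{x} = -Dx + Af(x) + Bf(x(t-\tau)) + u\) is that \(M = D - (|A| + |B|)\,\mathrm{diag}(L)\) is a nonsingular M-matrix, where \(D\) is the positive diagonal decay matrix and \(L\) collects the Lipschitz constants of the activations. The function and parameter names below are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def is_nonsingular_M_matrix(M, tol=1e-10):
    """Check a Z-matrix (nonpositive off-diagonals) for the
    nonsingular M-matrix property via positivity of all
    leading principal minors."""
    M = np.asarray(M, dtype=float)
    n = M.shape[0]
    # Off-diagonal entries must be nonpositive (Z-matrix requirement).
    if np.any(M - np.diag(np.diag(M)) > tol):
        return False
    # Nonsingular M-matrix iff every leading principal minor is positive.
    return all(np.linalg.det(M[:k, :k]) > tol for k in range(1, n + 1))

def hopfield_stability_test(D, A, B, L):
    """Sufficient (delay-independent) test for global asymptotic
    stability of  x' = -D x + A f(x) + B f(x(t - tau)) + u,
    with diagonal decay rates D > 0 and activation Lipschitz
    constants L: stability holds if
        M = diag(D) - (|A| + |B|) diag(L)
    is a nonsingular M-matrix.  (Illustrative sketch; D, A, B, L
    are assumed parameter names.)"""
    A, B = np.asarray(A, dtype=float), np.asarray(B, dtype=float)
    M = np.diag(D) - (np.abs(A) + np.abs(B)) @ np.diag(L)
    return bool(is_nonsingular_M_matrix(M))

# Example: strong decay dominates the (absolute) connection weights.
print(hopfield_stability_test(
    D=[2.0, 2.0],
    A=[[0.2, -0.1], [0.1, 0.3]],
    B=[[0.1, 0.0], [0.0, 0.1]],
    L=[1.0, 1.0],
))  # prints True
```

Note that this is only a sufficient condition: when the test returns `False` (e.g., for weak decay rates), stability may still hold and would have to be established by a sharper criterion, such as the linear matrix inequality forms the paper also surveys.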


CITATION STYLE

APA

Zhang, H., Wang, Z., & Liu, D. (2014). A comprehensive review of stability analysis of continuous-time recurrent neural networks. IEEE Transactions on Neural Networks and Learning Systems, 25(7), 1229–1262. https://doi.org/10.1109/TNNLS.2014.2317880
