New critical analysis on global convergence of recurrent neural networks with projection mappings

Abstract

In this paper, we present a general analysis of the global convergence of recurrent neural networks (RNNs) with projection mappings in the critical case that M(L, Γ), a matrix determined by the weight matrix W and the activation mapping of the network, is nonnegative for some positive diagonal matrix Γ. In contrast to existing conclusions such as those in [1], the present critical stability results do not require ΓW to be symmetric and apply to general projection mappings rather than only nearest-point projection mappings. An example is also given showing that the theoretical results obtained in this paper have explicit practical application. © Springer-Verlag Berlin Heidelberg 2007.
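The abstract does not state the network model explicitly. As a minimal sketch only, the code below simulates a commonly used projection-type RNN of the assumed form dx/dt = −x + P(Wx + q), with P taken to be the nearest-point projection onto a box; the particular W, q, box bounds, and step size are illustrative assumptions, not values from the paper, and the paper's results also cover more general projection mappings.

```python
import numpy as np

# Illustrative sketch (not the paper's exact model): a projection-type RNN
# of the assumed form  dx/dt = -x + P(W x + q),  where P is a projection
# mapping. Here P is the nearest-point projection onto the box [lo, hi]^n.

def box_projection(v, lo=-1.0, hi=1.0):
    """Nearest-point projection onto the box [lo, hi]^n."""
    return np.clip(v, lo, hi)

def simulate(W, q, x0, dt=0.01, steps=5000, proj=box_projection):
    """Forward-Euler integration of dx/dt = -x + proj(W x + q)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + proj(W @ x + q))
    return x

if __name__ == "__main__":
    # Example weight matrix chosen so that Gamma * W need not be symmetric,
    # echoing the relaxed symmetry requirement mentioned in the abstract.
    W = np.array([[0.0, 0.5],
                  [-0.5, 0.0]])
    q = np.array([0.1, -0.2])
    x_star = simulate(W, q, x0=[0.8, -0.3])
    # An equilibrium satisfies the fixed-point relation x = P(W x + q);
    # the residual below should be close to zero if the state has converged.
    print("state:", x_star)
    print("residual:", np.linalg.norm(x_star - box_projection(W @ x_star + q)))
```

At convergence, the state should satisfy the fixed-point relation x = P(Wx + q), which the residual check above verifies numerically for this hypothetical example.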

Cite

APA

Qiao, C., & Xu, Z. B. (2007). New critical analysis on global convergence of recurrent neural networks with projection mappings. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4493 LNCS, pp. 131–139). Springer Verlag. https://doi.org/10.1007/978-3-540-72395-0_18
