Three global exponential convergence results of the GPNN for solving generalized linear variational inequalities

Abstract

The general projection neural network (GPNN) is a versatile recurrent neural network model capable of solving a variety of optimization problems and variational inequalities. In a recent article [IEEE Trans. Neural Netw., 18(6), 1697-1708, 2007], the linear case of the GPNN was studied extensively from the viewpoint of stability analysis, and it was utilized to solve the generalized linear variational inequality with various types of constraints. In the present paper we supplement that work with three global exponential convergence results for the GPNN applied to these problems. The first is different from those shown in the original article, and the other two improve upon two results given there. The validity of the new results is demonstrated by numerical examples. © 2008 Springer-Verlag Berlin Heidelberg.
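To make the setting concrete, the sketch below simulates a classic projection neural network for a box-constrained linear variational inequality. This is an illustrative simplification, not the exact GPNN model or the convergence conditions analyzed by the authors: the dynamics dx/dt = λ(P_Ω(x − α(Mx + p)) − x), integrated with forward Euler, where P_Ω is the projection onto a box; the matrix M, vector p, and all parameter values here are invented for the example.

```python
import numpy as np

# Illustrative sketch only (not the paper's exact GPNN): a projection neural
# network for the box-constrained linear variational inequality
#   find x* in Omega with (x - x*)^T (M x* + p) >= 0 for all x in Omega,
# using the dynamics dx/dt = lam * (P_Omega(x - alpha*(M x + p)) - x).

def project_box(x, lo, hi):
    """Projection onto the box Omega = [lo, hi] (componentwise clipping)."""
    return np.clip(x, lo, hi)

def pnn_solve(M, p, lo, hi, x0, alpha=0.1, lam=1.0, dt=0.01, steps=20000):
    """Integrate the projection dynamics with forward Euler."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        x = x + dt * lam * (project_box(x - alpha * (M @ x + p), lo, hi) - x)
    return x

if __name__ == "__main__":
    # A small positive-definite example whose solution lies on the box boundary.
    M = np.array([[2.0, 0.5], [0.5, 1.0]])
    p = np.array([-1.0, -3.0])
    lo, hi = np.zeros(2), np.ones(2)
    x_star = pnn_solve(M, p, lo, hi, x0=np.array([0.5, 0.5]))
    # An equilibrium satisfies the fixed-point equation
    #   x* = P_Omega(x* - alpha*(M x* + p)).
    resid = np.linalg.norm(
        x_star - project_box(x_star - 0.1 * (M @ x_star + p), lo, hi))
    print(x_star, resid)
```

For this positive-definite M the trajectory converges exponentially to the equilibrium, which is exactly the fixed point of the projection equation; exponential convergence under weaker conditions on the system matrices is what the paper's three results concern.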

Citation (APA)

Hu, X., Zeng, Z., & Zhang, B. (2008). Three global exponential convergence results of the GPNN for solving generalized linear variational inequalities. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5263 LNCS, pp. 309–318). Springer Verlag. https://doi.org/10.1007/978-3-540-87732-5_35
