Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization

Abstract

Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that the generated directions are always descent directions, without requiring any line search. Under some mild conditions, global convergence results are established for both methods. Preliminary numerical results show that the proposed methods are promising and competitive with the well-known PRP method. © 2009 Jianguo Zhang et al.
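To make the idea concrete, the following is a minimal sketch of a nonlinear conjugate gradient iteration built on the PRP update, modified with a widely used three-term correction so that each direction satisfies g_kᵀd_k = −‖g_k‖² exactly, independent of the line search. This is an illustration of the sufficient-descent principle the abstract refers to, not the specific methods proposed in the article; the test function, parameters, and the Armijo backtracking rule are assumptions for the demonstration.

```python
import numpy as np

def cg_sufficient_descent(f, grad, x0, tol=1e-8, max_iter=500):
    """Nonlinear CG with a three-term PRP-style direction.

    The correction term -theta*y guarantees g.T @ d = -||g||^2
    for every k, so d is a descent direction regardless of the
    step size chosen by the line search.
    """
    x = x0.copy()
    g = grad(x)
    d = -g                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (illustrative choice)
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g           # gradient difference
        gg = g @ g
        beta = (g_new @ y) / gg         # PRP parameter
        theta = (g_new @ d) / gg        # correction coefficient
        # three-term direction: the beta*d and -theta*y terms cancel
        # in g_new.T @ d_new, leaving exactly -||g_new||^2
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Demonstration on a strictly convex quadratic (an assumed test problem)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_star = cg_sufficient_descent(f, grad, np.zeros(2))
```

One can verify algebraically that the beta and theta terms cancel when the new direction is multiplied by g_new, which is what makes the descent property hold without any assumption on alpha.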

Citation (APA)

Zhang, J., Xiao, Y., & Wei, Z. (2009). Nonlinear conjugate gradient methods with sufficient descent condition for large-Scale unconstrained optimization. Mathematical Problems in Engineering, 2009. https://doi.org/10.1155/2009/243290
