Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that the generated directions are always descent directions, without requiring any line search. Under some mild conditions, global convergence results are established for both methods. Preliminary numerical results show that the proposed methods are promising and competitive with the well-known PRP method. © 2009 Jianguo Zhang et al.
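As context for the comparison mentioned above, the sketch below shows a minimal nonlinear conjugate gradient loop using the classical PRP update, the baseline method the paper compares against. The paper's own direction formulas, descent analysis, and step-size rules are not reproduced in this abstract; the fixed step size `alpha`, the tolerance, and the function names here are illustrative assumptions only.

```python
import numpy as np

def prp_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000, alpha=0.1):
    """Illustrative sketch of nonlinear CG with the PRP parameter.
    A fixed step `alpha` stands in for a proper line search; it is an
    assumption for this example, not the paper's step-size rule."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + alpha * d        # step along the current search direction
        g_new = grad(x)
        # Polak-Ribiere-Polyak conjugacy parameter
        beta = g_new.dot(g_new - g) / g.dot(g)
        d = -g_new + beta * d    # new search direction
        g = g_new
    return x

# Usage example on a simple quadratic f(x) = 0.5 * x^T x, whose gradient is x.
x_min = prp_conjugate_gradient(lambda x: 0.5 * x.dot(x), lambda x: x,
                               x0=np.ones(5))
```

The key point of the paper is that its modified directions satisfy a sufficient descent condition independently of the line search, a property the plain PRP direction in this sketch does not guarantee in general.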
CITATION STYLE
Zhang, J., Xiao, Y., & Wei, Z. (2009). Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization. Mathematical Problems in Engineering, 2009. https://doi.org/10.1155/2009/243290