Two New Conjugate Gradient Methods for Unconstrained Optimization


Abstract

The conjugate gradient method is very effective for solving large-scale unconstrained optimization problems. In this paper, two new conjugate parameters are devised on the basis of the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search. Using the strong Wolfe line search to obtain the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, the two proposed methods are shown to satisfy the sufficient descent condition and to be globally convergent. Finally, preliminary numerical results are reported, showing that the proposed methods are promising.
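For orientation, below is a minimal Python sketch of the classical CD scheme the paper builds on, with the conjugate parameter beta_k^CD = ||g_k||^2 / (-d_{k-1}^T g_{k-1}) and step lengths from a strong Wolfe line search (here SciPy's Wolfe-condition line_search is used). The two modified parameters proposed in the paper are not reproduced, since the abstract does not give their formulas; the function names, parameter values (c1, c2), and the Rosenbrock test problem are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.optimize import line_search

def cg_cd(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient with the classical CD (conjugate
    descent) parameter and a Wolfe-condition line search.

    A sketch for illustration only, not the paper's two new methods.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # SciPy's line_search enforces the Wolfe conditions; a small
        # c2 (here 0.1) is a common choice for CG-type methods.
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:        # line search failed: restart with -g
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
            if alpha is None:
                break
        x = x + alpha * d
        g_new = grad(x)
        # CD parameter: beta = ||g_{k+1}||^2 / (-d_k^T g_k)
        beta = g_new.dot(g_new) / (-d.dot(g))
        d = -g_new + beta * d
        g = g_new
    return x

if __name__ == "__main__":
    # Rosenbrock function, a standard unconstrained test problem.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(cg_cd(f, grad, np.array([-1.2, 1.0])))
```

Since the denominator d_{k-1}^T g_{k-1} is negative for a descent direction, beta^CD stays positive; the restart to -g is a standard safeguard when the line search fails, not something prescribed by the paper.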

Citation (APA)

Liu, M., Ma, G., & Yin, J. (2020). Two New Conjugate Gradient Methods for Unconstrained Optimization. Complexity, 2020. https://doi.org/10.1155/2020/9720653
