The conjugate gradient method is very effective for solving large-scale unconstrained optimization problems. In this paper, building on the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search, two new conjugate parameters are devised. Using the strong Wolfe line search to obtain the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, the two proposed methods are proved to satisfy the sufficient descent condition and to be globally convergent. Finally, preliminary numerical results are reported, showing that the proposed methods are promising.
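The abstract does not give the two new conjugate parameters themselves, but the baseline they modify is well known: the CD method of Fletcher, which sets the conjugate parameter to beta_k = ||g_{k+1}||^2 / (-d_k^T g_k), combined with a line search enforcing the strong Wolfe conditions. The sketch below is a minimal pure-Python illustration of that baseline, not of the authors' two new methods; the bisection line search, the test function, and all tolerances are illustrative assumptions.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=60):
    """Bisection search for a step length satisfying the strong Wolfe conditions:
    f(x + a d) <= f(x) + c1 * a * g0  and  |g(x + a d)^T d| <= c2 * |g0|."""
    f0, g0 = f(x), dot(grad(x), d)      # g0 = d^T g(x) < 0 for a descent direction
    lo, hi, alpha = 0.0, None, 1.0
    for _ in range(max_iter):
        xn = [xi + alpha * di for xi, di in zip(x, d)]
        gn = dot(grad(xn), d)
        if f(xn) > f0 + c1 * alpha * g0 or gn > -c2 * g0:
            hi = alpha                   # Armijo fails or slope too positive: shrink
        elif gn < c2 * g0:
            lo = alpha                   # slope still too negative: grow
        else:
            return alpha                 # both strong Wolfe inequalities hold
        alpha = (lo + hi) / 2.0 if hi is not None else 2.0 * alpha
    return alpha

def cg_cd(f, grad, x0, tol=1e-8, max_iter=500):
    """Conjugate gradient with the CD parameter beta = ||g_{k+1}||^2 / (-d_k^T g_k)."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                # first direction: steepest descent
    for _ in range(max_iter):
        if math.sqrt(dot(g, g)) < tol:
            break
        alpha = strong_wolfe(f, grad, x, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = dot(g_new, g_new) / (-dot(d, g))   # CD formula (old g in denominator)
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# Illustrative problem (assumption): strictly convex quadratic f(x, y) = x^2 + 10 y^2
f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad = lambda x: [2.0 * x[0], 20.0 * x[1]]
x_star = cg_cd(f, grad, [2.0, 1.0])
```

With c2 < 1 in the strong Wolfe conditions, the CD direction remains a descent direction, which is the property the paper's two modified parameters are designed to strengthen into sufficient descent.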
Liu, M., Ma, G., & Yin, J. (2020). Two New Conjugate Gradient Methods for Unconstrained Optimization. Complexity, 2020. https://doi.org/10.1155/2020/9720653