A five-term hybrid conjugate gradient method with global convergence and descent properties for unconstrained optimization problems

Abstract

Background and Objective: The nonlinear conjugate gradient method is a recurrence-based iterative technique for efficiently solving large-scale unconstrained optimization problems. In this study, a new hybrid nonlinear conjugate gradient method is proposed that combines the positive features of five different non-hybrid conjugate gradient methods. Methodology: The proposed method generates descent directions independently of the line search procedure. Under standard assumptions on the objective function, the global convergence of the method was established with the standard Wolfe line search conditions. Results: Preliminary numerical experiments on selected benchmark test functions showed that the method is very competitive and promising in comparison with other non-hybrid methods. Conclusion: As a future study, the proposed method will be tested against recently proposed related methods.
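For context, the general nonlinear conjugate gradient recurrence, the standard Wolfe conditions, and the descent property referred to in the abstract take the form sketched below. This is a generic illustrative sketch, using the classical Fletcher–Reeves and Polak–Ribière–Polyak parameters as examples of the kind of formulas that hybrid methods combine; it is not the authors' five-term hybrid formula, which is given only in the full paper.

x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,

where g_k = \nabla f(x_k). Classical choices of the parameter include

\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2} \qquad \text{and} \qquad \beta_k^{PRP} = \frac{g_{k+1}^{T}(g_{k+1}-g_k)}{\|g_k\|^2},

and hybrid methods select or combine several such formulas, for instance \beta_k = \max\{0, \min\{\beta_k^{PRP}, \beta_k^{FR}\}\}. The step length \alpha_k is required to satisfy the standard Wolfe conditions

f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^{T} d_k, \qquad \nabla f(x_k + \alpha_k d_k)^{T} d_k \ge c_2 g_k^{T} d_k, \qquad 0 < c_1 < c_2 < 1,

and a direction d_k is a descent direction when g_k^{T} d_k < 0 for every k.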

Citation (APA)

Adeleke, O. J., & Osinuga, I. A. (2018). A five-term hybrid conjugate gradient method with global convergence and descent properties for unconstrained optimization problems. Asian Journal of Scientific Research, 11(2), 185–194. https://doi.org/10.3923/ajsr.2018.185.194
