A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function


Abstract

This paper employs a modified secant equation within the framework of a hybrid conjugate gradient (CG) method based on Andrei's approach to solve large-scale unconstrained optimization problems. The CG parameter in this hybrid method is a convex combination of the CG parameters of the Hestenes–Stiefel and Dai–Yuan algorithms. The main feature of such hybrid methods is that the search direction coincides with the Newton direction. The modified secant equation is derived from a fifth-order tensor model to improve the curvature information of the objective function. To achieve convergence for general functions, a revised version of the method, based on a linear combination of this secant equation and Li and Fukushima's modified secant equation, is also proposed. Under suitable conditions, the global convergence of the new hybrid CG algorithm is established even without a convexity assumption on the objective function. Numerical experiments on a set of test problems from the CUTEr collection demonstrate the practical effectiveness of the proposed hybrid conjugate gradient algorithm.
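To illustrate the hybrid construction described above, the following is a minimal sketch (not the authors' exact method): the CG parameter is taken as a convex combination of the Hestenes–Stiefel and Dai–Yuan parameters with a fixed mixing weight `theta`, whereas the paper chooses this weight adaptively via a modified secant equation. The Armijo backtracking line search and the descent-direction restart safeguard are simplifying assumptions for this sketch.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-8, max_iter=500):
    """Sketch of a hybrid CG method whose parameter is a convex
    combination of the Hestenes-Stiefel and Dai-Yuan parameters.
    `theta` is a fixed mixing weight here; the paper's method
    selects it adaptively (this is an assumption of the sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (placeholder for the Wolfe
        # conditions usually required by CG convergence theory).
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d.dot(y)
        if abs(denom) < 1e-16:
            beta = 0.0  # degenerate curvature: restart
        else:
            beta_hs = g_new.dot(y) / denom       # Hestenes-Stiefel
            beta_dy = g_new.dot(g_new) / denom   # Dai-Yuan
            beta = (1.0 - theta) * beta_hs + theta * beta_dy
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:
            d = -g_new  # safeguard: fall back to steepest descent
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic, this sketch converges to the unique minimizer; the safeguarded restart keeps every direction a descent direction even with the inexact line search.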

APA

Khoshgam, Z., & Ashrafi, A. (2019). A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function. Computational and Applied Mathematics, 38(4). https://doi.org/10.1007/s40314-019-0973-7
