A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function

Citations: 7
Mendeley readers: 7

Abstract

This paper employs a modified secant equation within the framework of the hybrid conjugate gradient (CG) method based on Andrei's approach to solve large-scale unconstrained optimization problems. The CG parameter of this hybrid method is a convex combination of the CG parameters of the Hestenes–Stiefel and Dai–Yuan algorithms, and its main feature is that the search direction coincides with the Newton direction. The modified secant equation is derived from a fifth-order tensor model to capture more accurate curvature information of the objective function. To achieve convergence for general functions, a revised version of the method is also proposed, based on a linear combination of this secant equation and Li and Fukushima's modified secant equation. Under suitable conditions, global convergence of the new hybrid CG algorithm is established even without a convexity assumption on the objective function. Numerical experiments on a set of test problems from the CUTEr collection demonstrate the practical effectiveness of the proposed hybrid conjugate gradient algorithm.
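For readers unfamiliar with the hybrid structure described above, the Python sketch below shows the generic shape of such an iteration: the CG parameter is taken as the convex combination (1 − theta) * beta_HS + theta * beta_DY of the Hestenes–Stiefel and Dai–Yuan formulas. Everything beyond that convex-combination structure is an assumption made for illustration only: here theta is a fixed user-supplied constant rather than the adaptive, secant-equation-based choice of the paper, and a simple backtracking Armijo line search stands in for the Wolfe-type line search required by the convergence analysis. The sketch is not a reproduction of the authors' algorithm or of their numerical results.

import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=1000):
    """Generic hybrid CG sketch: beta = (1 - theta)*beta_HS + theta*beta_DY,
    with 0 <= theta <= 1 fixed (the paper computes theta adaptively)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (illustrative stand-in for a Wolfe search).
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # gradient difference
        dy = d @ y
        if abs(dy) < 1e-12:                 # safeguard against breakdown
            beta = 0.0                      # restart with steepest descent
        else:
            beta_hs = (g_new @ y) / dy      # Hestenes-Stiefel parameter
            beta_dy = (g_new @ g_new) / dy  # Dai-Yuan parameter
            beta = (1.0 - theta) * beta_hs + theta * beta_dy
        d = -g_new + beta * d               # hybrid CG search direction
        x, g = x_new, g_new
    return x

# Usage on the 2-D Rosenbrock function (a standard non-convex test case).
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(hybrid_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))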

Citation (APA)

Khoshgam, Z., & Ashrafi, A. (2019). A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function. Computational and Applied Mathematics, 38(4). https://doi.org/10.1007/s40314-019-0973-7

Readers over time: chart not reproduced (2019–2024).

Readers' Seniority

PhD / Postgrad / Masters / Doc: 2 (50%)
Professor / Associate Prof.: 1 (25%)
Lecturer / Post doc: 1 (25%)

Readers' Discipline

Mathematics: 4 (100%)
