A numerical study on efficiency and robustness of some conjugate gradient algorithms for large-scale unconstrained optimization

Abstract

A numerical evaluation and comparison, using performance profiles, of some representative conjugate gradient algorithms for solving a large variety of large-scale unconstrained optimization problems is carried out. For this intensive numerical study we selected eight known conjugate gradient algorithms: Hestenes and Stiefel (HS), Polak-Ribière-Polyak (PRP), CONMIN, ASCALCG, CG-DESCENT, AHYBRIDM, THREECG and DESCON. These algorithms differ in many respects, yet they share many common concepts, which makes the numerical comparisons meaningful and the expectations well founded. In all algorithms the initial search direction is the negative gradient at the initial point, and the step length is computed by the Wolfe line search conditions. With the exception of CONMIN and CG-DESCENT, all the algorithms in this numerical study implement an acceleration scheme that modifies the step length in a multiplicative manner to improve the reduction of the function values along the iterations. The numerical study is based on a set of 800 artificial large-scale unconstrained optimization test functions of different complexity and with different structures of their Hessian matrix. A detailed numerical evaluation based on performance profiles is applied to the comparison of these algorithms, showing that all of them are able to solve difficult large-scale unconstrained optimization problems. However, comparisons using only artificial test problems are weak and depend on arbitrary choices concerning the stopping criteria of the algorithms and on the decision of whether an algorithm has found a solution or not. Expecting definitive conclusions from this sort of comparison, based only on artificial test problems, is an illusion. By also using some real unconstrained optimization applications, however, we can draw more confident conclusions about the efficiency and robustness of the optimization algorithms considered in this numerical study.
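The performance profiles referred to above are, presumably, those of the standard Dolan and Moré construction: for each problem, each solver's cost (iterations, function evaluations, or CPU time) is divided by the best cost achieved by any solver on that problem, and the profile of a solver at a factor tau is the fraction of problems it solves within tau times the best cost. A minimal sketch in Python (the function name and interface are illustrative, not taken from the paper):

```python
import numpy as np

def performance_profile(costs, taus):
    """Dolan-More performance profiles (sketch).

    costs : (n_problems, n_solvers) array of solver costs
            (e.g. CPU time); use np.inf to mark a failure.
    taus  : iterable of performance-ratio thresholds (tau >= 1).

    Returns a (len(taus), n_solvers) array whose entry [i, s] is the
    fraction of problems solver s solves within a factor taus[i] of
    the best solver on each problem.
    """
    costs = np.asarray(costs, dtype=float)
    best = costs.min(axis=1, keepdims=True)   # best cost per problem
    ratios = costs / best                     # performance ratios r_{p,s}
    n_problems = costs.shape[0]
    return np.array([(ratios <= tau).sum(axis=0) / n_problems
                     for tau in taus])
```

Plotting each column of the result against tau (typically on a log scale) gives the usual profile curves: the value at tau = 1 measures efficiency (fraction of wins), and the limiting value for large tau measures robustness (fraction of problems solved at all).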

Citation (APA)

Andrei, N. (2013). A numerical study on efficiency and robustness of some conjugate gradient algorithms for large-scale unconstrained optimization. Studies in Informatics and Control, 22(4), 259–284. https://doi.org/10.24846/v22i4y201302
