New scaled sufficient descent conjugate gradient algorithm for solving unconstraint optimization problems

Abstract

Problem statement: A scaled hybrid Conjugate Gradient (CG) algorithm, of the kind usually used for minimizing non-linear functions, was presented and compared with two standard, well-known NAG routines, yielding a new algorithm of comparable speed. Approach: We proposed a new hybrid technique based on the combination of two well-known scaled CG formulas for the quadratic model in unconstrained optimization using exact line searches. A global convergence result for the new technique was proved under the Wolfe line search conditions. Results: Computational experiments on a set of 1915 combinations of unconstrained optimization test problems and dimensions were carried out in this research, comparing the new proposed algorithm with two similar algorithms in this field. Conclusion: Our numerical results showed that the new scaled hybrid CG algorithm substantially outperforms Andrei's sufficient descent condition (CGSD) algorithm and Andrei's well-known standard sufficient descent condition (ACGA) algorithm. © 2010 Science Publications.
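To make the general setting concrete, the sketch below shows a generic nonlinear CG iteration with a hybrid β parameter (here a clipped Polak-Ribière/Fletcher-Reeves combination) and a simple Armijo backtracking line search. This is an illustration of the hybrid-CG idea only, not the authors' scaled formula, and the paper itself uses the Wolfe conditions rather than plain Armijo backtracking; the function names and test problem are hypothetical.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-8, max_iter=200):
    """Generic hybrid nonlinear CG (illustrative; not the paper's scaled formula)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:          # safeguard: restart if d is not a descent direction
            d = -g
        # Armijo backtracking line search (the paper uses Wolfe conditions)
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Hybrid beta: clip Polak-Ribiere by Fletcher-Reeves, floor at zero
        beta_pr = g_new @ (g_new - g) / (g @ g)
        beta_fr = (g_new @ g_new) / (g @ g)
        beta = max(0.0, min(beta_pr, beta_fr))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = x'Ax/2 - b'x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hybrid_cg(f, grad, np.zeros(2))
```

On this quadratic the iterates converge to the solution of A x = b; on general non-linear functions a hybrid β of this form is a common way to combine the robustness of one CG formula with the efficiency of another, which is the spirit of the hybrid approach the abstract describes.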

Citation (APA):
Al-Bayati, A. Y., & Muhammad, R. S. (2010). New scaled sufficient descent conjugate gradient algorithm for solving unconstraint optimization problems. Journal of Computer Science, 6(5), 511–518. https://doi.org/10.3844/jcssp.2010.511.518
