A new conjugate gradient method for acceleration of gradient descent algorithms

Abstract

An acceleration of the steepest descent method for solving unconstrained optimization problems is presented. We propose a fundamentally different conjugate gradient method in which the well-known parameter βk is computed by a new formula. Under common assumptions, and using a modified Wolfe line search, the descent property and global convergence of the new method are established. Experimental results provide evidence that the proposed method is in general superior to the classical steepest descent method and has the potential to significantly enhance the computational efficiency and robustness of the training process.
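The abstract does not reproduce the paper's new formula for βk or the details of its modified Wolfe line search, but the surrounding framework is the standard nonlinear conjugate gradient iteration: x_{k+1} = x_k + αk dk with search direction d_{k+1} = -g_{k+1} + βk dk. The following is a minimal sketch of that generic framework only, using the classical Fletcher–Reeves βk as a placeholder for the paper's new formula and SciPy's standard strong-Wolfe line search in place of the paper's modified one:

```python
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic nonlinear conjugate gradient loop.

    Placeholder choices (the paper's specifics are not given in the abstract):
    - beta_k: classical Fletcher-Reeves formula
    - line search: SciPy's standard strong-Wolfe implementation
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                            # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:             # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d)[0] or 1e-4
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves beta_k (placeholder)
        d = -g_new + beta * d              # conjugate gradient direction update
        g = g_new
    return x

# Example: minimize the Rosenbrock function from a standard starting point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))
```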

Citation (APA)
Rahali, N., Belloufi, M., & Benzine, R. (2021). A new conjugate gradient method for acceleration of gradient descent algorithms. Moroccan Journal of Pure and Applied Analysis, 7(1), 1–11. https://doi.org/10.2478/mjpaa-2021-0001
