Abstract
In this paper, a new variant of accelerated gradient descent is proposed. The method requires no prior information about the objective function, uses exact line search to accelerate convergence in practice, converges at the rates given by the well-known lower bounds for both convex and non-convex objective functions, and possesses primal-dual properties. We also provide a universal version of this method, which converges at the known lower-bound rates for both smooth and non-smooth problems.
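To illustrate the core idea of combining Nesterov-style momentum with exact line search, here is a minimal sketch on a quadratic objective, where the line-search step has a closed form. This is an assumption-laden illustration of the general technique, not the authors' exact scheme; the momentum coefficient `k / (k + 3)` is the standard choice for the convex case.

```python
import numpy as np

def accelerated_gd_line_search(A, b, x0, iters=100):
    # Minimize f(x) = 0.5 * x^T A x - b^T x, with A symmetric positive definite.
    # Sketch only: a Nesterov-style momentum scheme whose step size comes from
    # exact line search rather than a known Lipschitz constant.
    grad = lambda x: A @ x - b
    x = x0.copy()
    y = x0.copy()
    for k in range(iters):
        g = grad(y)
        if g @ g < 1e-16:
            break
        # Exact line search along -g: for a quadratic, the minimizing
        # step size t = argmin_t f(y - t g) is available in closed form.
        t = (g @ g) / (g @ (A @ g))
        x_new = y - t * g
        # Momentum (extrapolation) step; k/(k+3) is an assumed standard choice.
        y = x_new + (k / (k + 3)) * (x_new - x)
        x = x_new
    return x
```

Note that the exact line search removes the need to know the gradient's Lipschitz constant, which is what makes such methods attractive when curvature information is unavailable.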
Citation
Guminov, S. V., Nesterov, Yu. E., Dvurechensky, P. E., & Gasnikov, A. V. (2019). Primal-dual accelerated gradient descent with line search for convex and nonconvex optimization problems. Доклады Академии Наук, 485(1), 15–18. https://doi.org/10.31857/s0869-5652485115-18