Exact and inexact subsampled Newton methods for optimization

Abstract

The paper studies the solution of stochastic optimization problems in which approximations to the gradient and Hessian are obtained through subsampling. We first consider Newton-like methods that employ these approximations and discuss how to coordinate the accuracy in the gradient and Hessian to yield a superlinear rate of convergence in expectation. The second part of the paper analyzes an inexact Newton method that solves linear systems approximately using the conjugate gradient (CG) method, and that samples the Hessian and not the gradient (the gradient is assumed to be exact). We provide a complexity analysis for this method based on the properties of the CG iteration and the quality of the Hessian approximation, and compare it with a method that employs a stochastic gradient iteration instead of the CG method. We report preliminary numerical results that illustrate the performance of inexact subsampled Newton methods on machine learning applications based on logistic regression.
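
The abstract describes the second method only at a high level. As a rough illustration, the Python sketch below shows what one iteration of an inexact subsampled Newton-CG step could look like for L2-regularized logistic regression: the gradient is exact (computed on the full dataset), the Hessian is estimated from a random subsample, and the Newton system is solved only approximately by a truncated conjugate gradient loop. The sample size, CG tolerance, iteration cap, and unit step length are illustrative assumptions, not the settings analyzed in the paper.

import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def full_gradient(w, X, y, lam):
    """Exact gradient of the L2-regularized logistic loss over all examples."""
    p = sigmoid(X @ w)
    return X.T @ (p - y) / X.shape[0] + lam * w


def subsampled_hessian_vector_product(w, X, lam, sample_size, rng):
    """Return a function v -> H_S v, where H_S uses a random data subsample."""
    idx = rng.choice(X.shape[0], size=sample_size, replace=False)
    Xs = X[idx]
    p = sigmoid(Xs @ w)
    d = p * (1.0 - p)  # per-example curvature weights of the logistic Hessian

    def hv(v):
        return Xs.T @ (d * (Xs @ v)) / sample_size + lam * v

    return hv


def truncated_cg(hv, b, tol=1e-2, maxiter=50):
    """Conjugate gradient on H x = b, stopped early once ||r|| <= tol * ||b||."""
    x = np.zeros_like(b)
    r = b.copy()              # residual b - H x (x starts at zero)
    p = r.copy()
    rs_old = r @ r
    b_norm = np.linalg.norm(b)
    for _ in range(maxiter):
        if np.sqrt(rs_old) <= tol * b_norm:
            break
        Hp = hv(p)
        alpha = rs_old / (p @ Hp)
        x = x + alpha * p
        r = r - alpha * Hp
        rs_new = r @ r
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x


def inexact_subsampled_newton_step(w, X, y, lam, sample_size, rng):
    """One iteration: exact gradient, subsampled Hessian, inexact CG solve."""
    g = full_gradient(w, X, y, lam)
    hv = subsampled_hessian_vector_product(w, X, lam, sample_size, rng)
    step = truncated_cg(hv, -g)
    return w + step           # unit step for simplicity; a line search could be added


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 2000, 20
    X = rng.standard_normal((n, d))
    y = (X @ rng.standard_normal(d) > 0).astype(float)

    w = np.zeros(d)
    for _ in range(10):
        w = inexact_subsampled_newton_step(w, X, y, lam=1e-3, sample_size=200, rng=rng)
    print("final gradient norm:", np.linalg.norm(full_gradient(w, X, y, lam=1e-3)))

The inexactness of the linear solve is governed here by the CG residual tolerance and iteration cap, which is the same lever the paper's complexity analysis ties to the quality of the Hessian approximation.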

Citation (APA)

Bollapragada, R., Byrd, R. H., & Nocedal, J. (2019). Exact and inexact subsampled Newton methods for optimization. IMA Journal of Numerical Analysis, 39(2), 545–578. https://doi.org/10.1093/imanum/dry009
