Analysis of the Variance Reduction in SVRG and a New Acceleration Method

Abstract

Stochastic gradient descent is a popular method in large-scale optimization for machine learning but suffers from slow convergence. In recent years, the stochastic variance reduced gradient (SVRG) method was proposed to remedy this problem. Although many variants of SVRG have been studied, the analysis of variance has not been thoroughly discussed. In this paper, we propose a general framework, denoted epoch-update-identification (EUI), which is an abstraction of the existing variants of SVRG. Under this framework, i.e., EUI, we then provide a general analysis of the variance reduction technique from a new perspective. Additionally, previous variants of SVRG have to keep a snapshot of the full gradient for each epoch, which is computationally expensive. In this paper, we also propose a new variant of SVRG named sampleVR, which estimates the snapshot of the full gradient using a sampling strategy, thus significantly decreasing the gradient complexity. Both the theoretical analysis and extensive empirical studies show that sampleVR achieves a good tradeoff between convergence performance and gradient complexity, and thus makes the training loss converge faster than its counterparts.
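For intuition, the following Python sketch shows an SVRG-style update in which the per-epoch snapshot gradient is estimated from a random subsample of the data rather than a full pass, in the spirit of the sampleVR idea described above. It is a minimal illustration under assumed interfaces (the function grad_fn, the parameters sample_size, m, lr, and the uniform sampling scheme are illustrative assumptions), not the authors' exact algorithm.

    # Illustrative sketch, not the paper's exact sampleVR algorithm.
    import numpy as np

    def svrg_with_sampled_snapshot(grad_fn, w0, n, n_epochs=10, m=100,
                                   lr=0.01, sample_size=200, seed=None):
        """grad_fn(w, i) returns the gradient of the i-th component f_i at w."""
        rng = np.random.default_rng(seed)
        w = w0.copy()
        for _ in range(n_epochs):
            w_snap = w.copy()
            # Estimate the snapshot (full) gradient from a random subsample
            # instead of summing over all n components.
            idx = rng.choice(n, size=min(sample_size, n), replace=False)
            mu = np.mean([grad_fn(w_snap, i) for i in idx], axis=0)
            for _ in range(m):
                i = rng.integers(n)
                # Variance-reduced stochastic gradient:
                # g = grad f_i(w) - grad f_i(w_snap) + mu
                g = grad_fn(w, i) - grad_fn(w_snap, i) + mu
                w -= lr * g
        return w

Standard SVRG corresponds to the case sample_size = n, where mu is the exact full gradient at the snapshot; shrinking the sample trades a small amount of bias in mu for a large reduction in per-epoch gradient evaluations.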

Citation (APA)

Min, E., Long, J., & Cui, J. (2018). Analysis of the Variance Reduction in SVRG and a New Acceleration Method. IEEE Access, 6, 16165–16175. https://doi.org/10.1109/ACCESS.2018.2814212
