Accelerated variance reduced stochastic ADMM

Abstract

Recently, many variance-reduced stochastic alternating direction method of multipliers (ADMM) methods (e.g., SAG-ADMM, SDCA-ADMM, and SVRG-ADMM) have made exciting progress, such as achieving linear convergence rates for strongly convex problems. However, the best known convergence rate for general convex problems is O(1/T), as opposed to the O(1/T^2) rate of accelerated batch algorithms, where T is the number of iterations. Thus, there remains a gap in convergence rates between existing stochastic ADMM and batch algorithms. To bridge this gap, we introduce the momentum acceleration trick from batch optimization into the stochastic variance reduced gradient based ADMM (SVRG-ADMM), which leads to an accelerated SVRG-ADMM (ASVRG-ADMM) method. We then design two different momentum term update rules for the strongly convex and general convex cases. We prove that ASVRG-ADMM converges linearly for strongly convex problems. Besides having the same low per-iteration complexity as existing stochastic ADMM methods, ASVRG-ADMM improves the convergence rate on general convex problems from O(1/T) to O(1/T^2). Our experimental results show the effectiveness of ASVRG-ADMM.
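
The sketch below illustrates the general idea the abstract describes: an SVRG-style variance-reduced stochastic gradient combined with a momentum (extrapolation) step inside a linearized ADMM loop. It uses a simple lasso-type split, min_x (1/2n)||Ax - b||^2 + lam*||z||_1 subject to x - z = 0, purely as a stand-in problem. The function name momentum_svrg_admm, the helper soft_threshold, and the fixed values of eta, rho, and theta are all assumptions for illustration; this is not the exact ASVRG-ADMM update rule or parameter schedule from the paper.

```python
# Schematic sketch: variance-reduced stochastic ADMM with a momentum step.
# Problem used for illustration (an assumption, not from the paper):
#   min_x (1/2n) ||A x - b||^2 + lam ||z||_1   s.t.  x - z = 0
import numpy as np


def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (the z-update of ADMM).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def momentum_svrg_admm(A, b, lam=0.1, rho=1.0, eta=0.002, theta=0.5,
                       epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)          # primal variable for the smooth data-fit term
    z = np.zeros(d)          # primal variable for the l1 term
    u = np.zeros(d)          # scaled dual variable
    x_snap = x.copy()        # epoch snapshot used for variance reduction
    y = x.copy()             # momentum (extrapolated) point

    for _ in range(epochs):
        # Full gradient at the snapshot, computed once per epoch.
        full_grad = A.T @ (A @ x_snap - b) / n
        for _ in range(n):
            i = rng.integers(n)
            a_i = A[i]
            # Variance-reduced stochastic gradient at the extrapolated point y.
            g = (a_i * (a_i @ y - b[i])
                 - a_i * (a_i @ x_snap - b[i])
                 + full_grad)
            # Linearized stochastic x-update of the augmented Lagrangian.
            x_new = y - eta * (g + rho * (y - z + u))
            # Momentum extrapolation (Nesterov-style mixing with weight theta).
            y = x_new + theta * (x_new - x)
            x = x_new
            # Standard ADMM z-update (soft-thresholding) and dual ascent step.
            z = soft_threshold(x + u, lam / rho)
            u = u + x - z
        x_snap = x.copy()    # refresh the snapshot at the end of each epoch
    return x, z


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat, z_hat = momentum_svrg_admm(A, b)
    obj = 0.5 / len(b) * np.sum((A @ x_hat - b) ** 2) + 0.1 * np.sum(np.abs(z_hat))
    print("final objective:", obj)
```

Note that the sketch keeps theta fixed for brevity, whereas the paper designs two different momentum term update rules, one for the strongly convex case and one for the general convex case.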

Citation (APA)
Liu, Y., Shang, F., & Cheng, J. (2017). Accelerated variance reduced stochastic ADMM. In 31st AAAI Conference on Artificial Intelligence, AAAI 2017 (pp. 2287–2293). AAAI Press. https://doi.org/10.1609/aaai.v31i1.10843
