Asynchronous Stochastic Frank-Wolfe Algorithms for Non-Convex Optimization

Abstract

Asynchronous parallel stochastic optimization for non-convex problems is becoming increasingly important in machine learning, especially due to the popularity of deep learning. The Frank-Wolfe (a.k.a. conditional gradient) algorithm has regained much interest because of its projection-free property and its ability to handle structured constraints. However, our understanding of asynchronous stochastic Frank-Wolfe algorithms is extremely limited, especially in the non-convex setting. To address this challenging problem, in this paper we propose an asynchronous stochastic Frank-Wolfe algorithm (AsySFW) and its variance-reduced version (AsySVFW) for solving constrained non-convex optimization problems. More importantly, we prove fast convergence rates for AsySFW and AsySVFW in the non-convex setting. To the best of our knowledge, AsySFW and AsySVFW are the first asynchronous parallel stochastic algorithms with convergence guarantees for solving constrained non-convex optimization problems. The experimental results on real high-dimensional gray-scale images not only confirm the fast convergence of our algorithms, but also show a near-linear speedup on a shared-memory parallel system due to the lock-free implementation.
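
As a concrete illustration of the projection-free update that Frank-Wolfe-type methods rely on, the sketch below shows a minimal, sequential stochastic Frank-Wolfe loop over an l1-ball. It is not the paper's AsySFW/AsySVFW pseudocode (which is asynchronous, lock-free, and, for AsySVFW, variance-reduced); the function names, step-size rule, and l1-ball oracle are illustrative assumptions.

import numpy as np

def stochastic_frank_wolfe(grad_batch, lmo, x0, n_iters=200,
                           step=lambda t: 2.0 / (t + 2.0)):
    # Generic sequential stochastic Frank-Wolfe loop (illustrative, not AsySFW).
    # grad_batch(x, t): stochastic gradient estimate at x.
    # lmo(g): linear minimization oracle, argmin_{s in C} <g, s>.
    x = x0.copy()
    for t in range(n_iters):
        g = grad_batch(x, t)                 # noisy gradient from a mini-batch
        s = lmo(g)                           # projection-free step: linear problem over C
        gamma = step(t)
        x = (1.0 - gamma) * x + gamma * s    # convex combination keeps x feasible
    return x

# Example: sub-sampled least squares over the l1-ball {x : ||x||_1 <= tau}.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((200, 50)), rng.standard_normal(200)
tau = 5.0

def grad_batch(x, t, m=32):
    idx = rng.integers(0, A.shape[0], size=m)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / m          # gradient of 0.5*||Ai x - bi||^2 / m

def lmo_l1(g):
    # argmin over the l1-ball: put all the mass tau on the largest-|g_i| coordinate.
    s = np.zeros_like(g)
    i = np.argmax(np.abs(g))
    s[i] = -tau * np.sign(g[i])
    return s

x_hat = stochastic_frank_wolfe(grad_batch, lmo_l1, x0=np.zeros(50))

Each iteration replaces the projection step of projected gradient methods with a linear minimization oracle over the constraint set, which for an l1-ball reduces to placing all the mass on the coordinate with the largest gradient magnitude; this is the structural property that the asynchronous variants in the paper exploit.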

Cite

APA

Gu, B., Xian, W., & Huang, H. (2019). Asynchronous stochastic Frank-Wolfe algorithms for non-convex optimization. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 737–743). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/104
