In this paper, a novel stratified sampling strategy is designed to accelerate mini-batch SGD. We derive a new iteration-dependent surrogate that bounds the stochastic variance from above. To keep the strata minimizing this surrogate with high probability, a stochastic stratification algorithm is applied adaptively: in each iteration, the strata are reconstructed only if an easily verifiable condition is met. Based on this sampling strategy, we propose an accelerated mini-batch SGD algorithm named SGD-RS. Our theoretical analysis shows that the convergence rate of SGD-RS is superior to the state of the art. Numerical experiments corroborate our theory and demonstrate that SGD-RS achieves a speed-up of at least 3.48× over vanilla mini-batch SGD.
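The adaptive scheme described above can be illustrated with a minimal sketch. This is not the paper's algorithm: the surrogate bound and the verifiable reconstruction condition are replaced by simple stand-ins. Here strata are equal-frequency quantile bins over per-example losses, the reconstruction trigger is a hypothetical drift check on the within-stratum variance ratio, and the objective is plain least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem (hypothetical data, for illustration only).
n, d = 1000, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def build_strata(scores, k):
    """Equal-frequency strata over per-example scores (quantile binning)."""
    edges = np.quantile(scores, np.linspace(0, 1, k + 1)[1:-1])
    return np.searchsorted(edges, scores)

def staleness(scores, labels, k):
    """Fraction of score variance left within strata (lower = better strata)."""
    total = scores.var()
    if total == 0:
        return 0.0
    within = np.mean([scores[labels == j].var()
                      for j in range(k) if np.any(labels == j)])
    return within / total

k, batch, lr = 4, 40, 0.05
w = np.zeros(d)
scores = (X @ w - y) ** 2            # per-example loss as stratification score
labels = build_strata(scores, k)
base = staleness(scores, labels, k)

for t in range(200):
    scores = (X @ w - y) ** 2
    # Reconstruct strata only when the variance ratio has drifted well above
    # its value at construction time (a hypothetical trigger, not the paper's
    # verifiable condition).
    if staleness(scores, labels, k) > 1.5 * base:
        labels = build_strata(scores, k)
        base = staleness(scores, labels, k)
    # Stratified mini-batch: draw ~batch/k examples from each stratum.
    idx = np.concatenate([
        rng.choice(np.flatnonzero(labels == j),
                   max(1, batch // k), replace=False)
        for j in range(k) if np.any(labels == j)
    ])
    grad = 2.0 * X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
    w -= lr * grad

final_loss = np.mean((X @ w - y) ** 2)
```

Sampling within strata rather than uniformly reduces the variance of the mini-batch gradient estimate; the adaptive rebuild keeps the strata useful as the iterate, and hence the per-example losses, evolve.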
Liu, W., Qian, H., Zhang, C., Shen, Z., Xie, J., & Zheng, N. (2020). Accelerating stratified sampling SGD by reconstructing strata. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2021-January, pp. 2725–2731). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2020/378