Taming unbalanced training workloads in deep learning with partial collective operations


Abstract

Load imbalance is pervasive in distributed deep learning training systems, caused either by inherent imbalance in the learned tasks or by the system itself. Traditional synchronous Stochastic Gradient Descent (SGD) achieves good accuracy for a wide variety of tasks, but relies on global synchronization to accumulate the gradients at every training step. In this paper, we propose eager-SGD, which relaxes global synchronization in favor of decentralized accumulation. To implement eager-SGD, we propose two partial collectives: solo and majority. With solo allreduce, the faster processes contribute their gradients eagerly without waiting for the slower processes, whereas with majority allreduce, at least half of the participants must contribute gradients before continuing, all without using a central parameter server. We theoretically prove the convergence of the algorithms and describe the partial collectives in detail. Experiments are conducted on a variety of neural networks and datasets. The results on load-imbalanced environments show that eager-SGD achieves 2.64× speedup (ResNet-50 on ImageNet) over asynchronous centralized SGD, and achieves 1.29× speedup (ResNet-50 on ImageNet) and 1.27× speedup (LSTM on UCF101) over state-of-the-art synchronous decentralized SGDs, without losing accuracy.
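The solo and majority collectives described above can be modeled as a single "partial allreduce" parameterized by a quorum size. The sketch below is a toy, single-step simulation (not the paper's actual implementation): `partial_allreduce`, `arrival_order`, and `quorum` are illustrative names, and late ranks simply contribute zero here, whereas eager-SGD would buffer their gradients for a later step.

```python
def partial_allreduce(grads, arrival_order, quorum):
    """Toy model of a partial collective for one training step.

    Only the first `quorum` ranks to arrive contribute their gradients;
    late ranks contribute nothing this step (eager-SGD would instead
    buffer their stale gradients for a later accumulation, omitted here).
    Every rank still receives the same reduced value.
    """
    arrived = set(arrival_order[:quorum])
    contributed = [g if r in arrived else 0.0 for r, g in enumerate(grads)]
    return sum(contributed) / len(grads)

# With 4 ranks, rank 2 arriving first, then ranks 0, 3, 1:
grads = [1.0, 2.0, 3.0, 4.0]
order = [2, 0, 3, 1]

solo = partial_allreduce(grads, order, quorum=1)      # only rank 2 contributes
majority = partial_allreduce(grads, order, quorum=2)  # ranks 2 and 0 contribute
```

Solo allreduce corresponds to a quorum of 1 (the fastest process never waits), while majority allreduce uses a quorum of at least half the participants, bounding how stale any contribution can become.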

Citation (APA)

Li, S., Ben-Nun, T., Girolamo, S. D., Alistarh, D., & Hoefler, T. (2020). Taming unbalanced training workloads in deep learning with partial collective operations. In Proceedings of the ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming, PPoPP (pp. 45–61). Association for Computing Machinery. https://doi.org/10.1145/3332466.3374528
