Compressed Gradient Tracking for Decentralized Optimization Over General Directed Networks

Abstract

In this paper, we propose two communication-efficient decentralized optimization algorithms over a general directed multi-agent network. The first, termed Compressed Push-Pull (CPP), combines the gradient tracking Push-Pull method with communication compression. We show that CPP is applicable to a general class of unbiased compression operators and achieves a linear convergence rate for strongly convex and smooth objective functions. The second algorithm is a broadcast-like version of CPP (B-CPP), which also achieves a linear convergence rate under the same conditions on the objective functions. B-CPP can be applied in an asynchronous broadcast setting and further reduces communication costs compared with CPP. Numerical experiments complement the theoretical analysis and confirm the effectiveness of the proposed methods.
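To illustrate the class of compressors the abstract refers to, here is a minimal sketch of one common unbiased compression operator, stochastic (randomized) quantization. This is an illustrative example of an operator satisfying E[C(x)] = x, not necessarily the specific operator used in the paper's experiments:

```python
import numpy as np

def unbiased_quantize(x, levels=4, rng=None):
    """Stochastic quantization: an example of an unbiased compression
    operator C with E[C(x)] = x. Illustrative sketch only; the number of
    quantization `levels` is an assumed parameter."""
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(x)
    if norm == 0.0:
        return np.zeros_like(x)
    # Map |x_i|/norm into [0, levels] and round each entry up or down
    # at random, with probabilities chosen so the expectation is exact.
    scaled = np.abs(x) / norm * levels
    lower = np.floor(scaled)
    quantized = lower + (rng.random(x.shape) < (scaled - lower))
    return np.sign(x) * norm * quantized / levels
```

Communication savings come from transmitting only the scalar `norm`, the signs, and the small integer levels per entry instead of full-precision floats; unbiasedness is what lets gradient-tracking methods such as CPP retain exact (linear) convergence.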

Citation (APA)

Song, Z., Shi, L., Pu, S., & Yan, M. (2022). Compressed Gradient Tracking for Decentralized Optimization Over General Directed Networks. IEEE Transactions on Signal Processing, 70, 1775–1787. https://doi.org/10.1109/TSP.2022.3160238
