Vertical federated learning (VFL) is an emerging paradigm that allows different parties (e.g., organizations or enterprises) to collaboratively build machine learning models with privacy protection. In the training phase, VFL exchanges only intermediate statistics, i.e., forward activations and backward derivatives, across parties to compute model gradients. Nevertheless, due to its geo-distributed nature, VFL training usually suffers from low WAN bandwidth. In this paper, we introduce CELU-VFL, a novel and efficient VFL training framework that exploits the local update technique to reduce the number of cross-party communication rounds. CELU-VFL caches stale statistics and reuses them to estimate model gradients without exchanging up-to-date statistics in every iteration. Two techniques are proposed to improve convergence performance. First, to handle the stochastic variance problem, we propose a uniform sampling strategy that fairly chooses the stale statistics used for local updates. Second, to mitigate the errors introduced by staleness, we devise an instance weighting mechanism that measures the reliability of the estimated gradients. Theoretical analysis proves that CELU-VFL achieves a sub-linear convergence rate similar to that of vanilla VFL training while requiring far fewer communication rounds. Empirical results on both public and real-world workloads validate that CELU-VFL can be up to six times faster than existing works.
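To make the cache-enabled local-update idea described in the abstract concrete, below is a minimal Python sketch. It is an illustrative assumption of how such a scheme could be organized, not the paper's actual implementation: the names (StatCache, instance_weights, local_update), the toy linear model, and the exponential drift-based weighting rule are all hypothetical stand-ins for the techniques the abstract names (caching stale statistics, uniform sampling over the cache, and reliability-based instance weighting).

```python
# Hedged sketch of cache-enabled local updates for VFL-style training.
# All names and the weighting rule are illustrative, not the paper's API.
import numpy as np


class StatCache:
    """Cache of stale cross-party statistics (e.g., forward activations)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = []  # list of (stale_activations, labels) tuples

    def put(self, activations, labels):
        if len(self.entries) >= self.capacity:
            self.entries.pop(0)  # evict the oldest cached batch
        self.entries.append((activations, labels))

    def sample_uniform(self, rng):
        # Uniform sampling over cached batches: every stale batch is equally
        # likely to drive a local update, which limits sampling bias.
        return self.entries[rng.integers(len(self.entries))]


def instance_weights(stale_act, fresh_act_estimate, tau=1.0):
    # Down-weight instances whose stale activations drift far from a locally
    # estimated fresh value -- a simple proxy for gradient reliability.
    drift = np.linalg.norm(stale_act - fresh_act_estimate, axis=1)
    return np.exp(-drift / tau)


def local_update(params, stale_act, labels, weights, lr=0.1):
    # One weighted gradient step of a toy linear model, computed from cached
    # stale activations instead of a fresh cross-party exchange.
    residual = stale_act @ params - labels
    grad = stale_act.T @ (weights * residual) / len(labels)
    return params - lr * grad


rng = np.random.default_rng(0)
cache = StatCache(capacity=8)
params = np.zeros(4)

for _ in range(3):  # each outer round = one (expensive) cross-party exchange
    fresh_act = rng.normal(size=(16, 4))  # pretend these arrived from the other party
    labels = rng.normal(size=16)
    cache.put(fresh_act, labels)
    for _ in range(5):  # several cheap local updates per exchange
        stale_act, stale_labels = cache.sample_uniform(rng)
        # No truly fresh activations exist locally; a perturbed copy stands in
        # for the drift estimate that the weighting rule needs.
        estimate = stale_act + rng.normal(scale=0.1, size=stale_act.shape)
        weights = instance_weights(stale_act, estimate)
        params = local_update(params, stale_act, stale_labels, weights)

print(params)
```

The design point the sketch tries to capture is the communication trade-off: each cross-party exchange refreshes the cache once, while several local steps reuse cached statistics, with the weighting term damping updates that rely on badly drifted entries.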
Fu, F., Miao, X., Jiang, J., Xue, H., & Cui, B. (2022). Towards Communication-efficient Vertical Federated Learning Training via Cache-enabled Local Updates. In Proceedings of the VLDB Endowment (Vol. 15, pp. 2111–2120). VLDB Endowment. https://doi.org/10.14778/3547305.3547316