A efficient and robust privacy-preserving framework for cross-device federated learning

Abstract

To ensure that no private information is leaked during the aggregation phase of federated learning (FL), many frameworks use homomorphic encryption (HE) to mask local model updates. However, the heavy overhead of these frameworks makes them unsuitable for cross-device FL, where the clients are a large number of mobile and edge devices with limited computing resources. Worse still, some of them also fail to handle dynamic changes in the set of participating clients. To overcome these shortcomings, we propose a threshold multi-key HE scheme, tMK-CKKS, and design an efficient and robust privacy-preserving FL framework. Robustness means that our framework allows clients to join or drop out during the training process. In addition, because our tMK-CKKS scheme can pack multiple messages into a single ciphertext, our framework significantly reduces computation and communication overhead. Moreover, the threshold mechanism in tMK-CKKS ensures that our framework can resist collusion attacks between the server and no more than t (the threshold value) curious internal clients. Finally, we implement our framework in FedML and conduct extensive experiments to evaluate it. Utility evaluations on 6 benchmark datasets show that our framework protects privacy without sacrificing model accuracy. Efficiency evaluations on 4 typical deep learning models demonstrate that our framework speeds up computation by at least 1.21× over the xMK-CKKS-based framework, 15.84× over the BatchCrypt-based framework, and 20.30× over the CRT-Paillier-based framework, and that it reduces the communication burden by at least 8.61 MB over the BatchCrypt-based framework, 35.36 MB over the xMK-CKKS-based framework, and 42.58 MB over the CRT-Paillier-based framework. The advantages in both computation and communication grow with the size of the deep learning model.
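The packing idea behind the reported savings can be illustrated with a toy sketch. The actual framework packs messages into the polynomial slots of tMK-CKKS ciphertexts, which is not reproduced here; this hypothetical plain-integer version (all names and slot widths are illustrative assumptions, not the paper's implementation) only shows why encoding many quantized parameters into one ciphertext lets a single homomorphic addition aggregate all of them at once:

```python
# Toy "packing" demo: several small non-negative model parameters are
# encoded into fixed-width slots of one big integer, so one addition
# aggregates every slot simultaneously. Real HE packing (e.g. CKKS slots)
# works differently, but the per-parameter overhead saving is analogous.

SLOT_BITS = 16               # width of each slot
MASK = (1 << SLOT_BITS) - 1  # mask to extract one slot

def pack(values):
    """Encode a list of small non-negative ints into one integer."""
    acc = 0
    for v in reversed(values):
        # Leave headroom so slot-wise sums cannot carry into a neighbor.
        assert 0 <= v < (1 << (SLOT_BITS - 4))
        acc = (acc << SLOT_BITS) | v
    return acc

def unpack(packed, n):
    """Decode n slot values back out of a packed integer."""
    return [(packed >> (i * SLOT_BITS)) & MASK for i in range(n)]

# Two "clients" pack their quantized updates; adding the packed values
# sums every slot in a single operation.
a = pack([1, 2, 3])
b = pack([4, 5, 6])
print(unpack(a + b, 3))  # slot-wise sums: [5, 7, 9]
```

In the real scheme the server performs this addition on ciphertexts it cannot decrypt, and decryption of the aggregate requires cooperation from more than t clients.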

Cite

CITATION STYLE

APA

Du, W., Li, M., Wu, L., Han, Y., Zhou, T., & Yang, X. (2023). A efficient and robust privacy-preserving framework for cross-device federated learning. Complex & Intelligent Systems, 9(5), 4923–4937. https://doi.org/10.1007/s40747-023-00978-9
