Federated Learning (FL) allows clients to form a consortium that trains a global model under the orchestration of a central server while keeping data on each client, without sharing it, thus mitigating data privacy concerns. However, training a robust global model is challenging because the local data are invisible to the server: clients' local data are naturally heterogeneous, and some clients may use corrupted data or send malicious updates to deliberately interfere with training. Meanwhile, communication and computation costs are unavoidable concerns in designing a practical FL algorithm. In this paper, to improve the robustness of FL, we propose a Shapley value-inspired adaptive weighting mechanism that regards FL training as a sequence of cooperative games and adjusts clients' weights according to their contributions. We also develop a client sampling strategy based on importance sampling, which reduces the communication cost by optimizing the variance of the global update with respect to the clients' weights. Furthermore, to reduce the server's computation cost, we propose a weight calculation method that estimates the differences between clients' Shapley values. Experimental results on several real-world datasets demonstrate the effectiveness of our approaches.
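The abstract's two main ingredients can be made concrete with a small sketch. The following is a minimal, hypothetical illustration, not the paper's ShapleyFL algorithm: it estimates per-client Shapley values with a Monte Carlo permutation estimator over a stand-in utility function (here, cosine similarity of the coalition's mean update to a reference direction, invented for the demo), turns them into aggregation weights, and samples clients via importance sampling with an inverse-probability correction so the aggregated update stays unbiased. All names, the utility function, and hyperparameters are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def shapley_estimates(num_clients, eval_utility, num_perms=50):
    """Monte Carlo permutation estimate of each client's Shapley value:
    the average marginal utility gain when the client joins a random
    coalition of the other clients."""
    phi = np.zeros(num_clients)
    for _ in range(num_perms):
        perm = rng.permutation(num_clients)
        coalition, prev_u = [], eval_utility([])
        for i in perm:
            coalition.append(int(i))
            u = eval_utility(coalition)
            phi[i] += u - prev_u  # marginal contribution of client i
            prev_u = u
    return phi / num_perms

def sample_clients(weights, update_norms, m):
    """Importance sampling of m client indices (with replacement).
    Drawing client i with probability p_i proportional to w_i * ||u_i||
    reduces the variance of the unbiased aggregate
    (1/m) * sum_k (w_{i_k} / p_{i_k}) * u_{i_k}."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    p = np.clip(w * np.asarray(update_norms, dtype=float), 1e-12, None)
    p = p / p.sum()
    idx = rng.choice(len(w), size=m, replace=True, p=p)
    return idx, w[idx] / (m * p[idx])  # inverse-probability correction

# Toy demo: 5 clients' updates as 3-dim vectors; client 4 is "malicious"
# and pushes opposite to the others. Utility = cosine similarity of the
# coalition's mean update with a hypothetical clean reference direction.
updates = [rng.normal(loc=1.0, size=3) for _ in range(4)]
updates.append(-5.0 * np.ones(3))  # adversarial update
reference = np.ones(3)

def eval_utility(coalition):
    if not coalition:
        return 0.0
    agg = np.mean([updates[i] for i in coalition], axis=0)
    norm = np.linalg.norm(agg)
    if norm == 0:
        return 0.0
    return float(agg @ reference / (norm * np.linalg.norm(reference)))

phi = shapley_estimates(len(updates), eval_utility)
weights = np.clip(phi, 1e-8, None)  # low-contribution clients get ~0 weight
norms = [np.linalg.norm(u) for u in updates]
idx, scale = sample_clients(weights, norms, m=3)
global_update = sum(s * updates[i] for i, s in zip(idx, scale))
print("Shapley estimates:", np.round(phi, 3))
print("sampled clients:", idx, "aggregated update:", np.round(global_update, 3))
```

In this toy run, the adversarial client should receive a markedly lower (typically negative) Shapley estimate and thus a near-zero weight and sampling probability, which is the intuition behind contribution-based robustness; the inverse-probability scaling keeps the sampled aggregate an unbiased estimate of the full weighted average.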
Citation:
Sun, Q., Li, X., Zhang, J., Xiong, L., Liu, W., Liu, J., … Ren, K. (2023). ShapleyFL: Robust Federated Learning Based on Shapley Value. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 2096–2108). Association for Computing Machinery. https://doi.org/10.1145/3580305.3599500