Communication-efficient and Scalable Decentralized Federated Edge Learning

Abstract

Federated Edge Learning (FEL) is a distributed Machine Learning (ML) framework for collaborative training on edge devices. FEL improves data privacy over traditional centralized ML training by keeping data on the devices and sending only local model updates to a central coordinator for aggregation. However, existing FEL architectures still face challenges, notably the high communication overhead between edge devices and the coordinator. In this paper, we present a working prototype of a blockchain-empowered and communication-efficient FEL framework, which enhances security and scalability towards large-scale deployment of FEL.
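The training loop the abstract describes — devices fit models on private local data and a coordinator aggregates only the resulting updates — follows the standard federated-averaging pattern. The sketch below is a minimal, illustrative toy (a 1-D linear model with made-up data and function names), not the paper's actual implementation:

```python
# Toy sketch of federated averaging: each device trains locally on its
# private data; only model weights (never raw data) reach the coordinator.
# All names, data, and the 1-D linear model are illustrative assumptions.

def local_update(w, data, lr=0.1):
    """One local gradient-descent step on a toy model y = w * x."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def aggregate(updates):
    """Coordinator averages the devices' local model updates."""
    return sum(updates) / len(updates)

# Two devices holding private (x, y) samples drawn near y = 2x.
device_data = [[(1.0, 2.0), (2.0, 4.0)], [(1.0, 2.2), (3.0, 6.3)]]

global_w = 0.0
for _ in range(50):
    # Each round: broadcast the global model, train locally, aggregate.
    updates = [local_update(global_w, d) for d in device_data]
    global_w = aggregate(updates)

print(round(global_w, 2))  # converges near the true slope of 2
```

The communication cost per round is one model's worth of parameters per device, which is exactly the overhead the paper's framework aims to reduce at scale.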

APA

Yapp, A. Z. H., Koh, H. S. N., Lai, Y. T., Kang, J., Li, X., Ng, J. S., … Niyato, D. (2021). Communication-efficient and Scalable Decentralized Federated Edge Learning. In IJCAI International Joint Conference on Artificial Intelligence (pp. 5032–5035). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2021/720
