Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation

Abstract

Federated learning (FL) enables decentralized training of a global model on edge devices without transferring raw data samples, thus preserving privacy. With the ubiquity of wearable and mobile devices running health applications, FL has shown promise in the medical field for applications such as medical imaging, disease diagnosis, and electronic health record (EHR) analysis. However, resource-constrained edge devices can slow down the training process. To address this issue and increase efficiency, we propose Asynchronous Federated Learning with Knowledge Distillation (AsyncFedKD). AsyncFedKD asynchronously trains a lightweight global student model under the guidance of a pre-trained teacher model, so that slow edge devices do not degrade training efficiency. The knowledge distillation component of AsyncFedKD also compresses the model parameters, enabling efficient communication during training. AsyncFedKD was evaluated on a sensitive mammography cancer dataset, where the global model achieved an accuracy of 88%.
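
The abstract does not include pseudocode, so the following is only a minimal sketch of the two standard building blocks it names: a Hinton-style knowledge distillation loss (soft teacher targets blended with hard-label cross-entropy) and a staleness-weighted asynchronous server update in the style of FedAsync. All names and hyperparameters here (`temperature`, `alpha`, `base_mix`, the `1/(1+staleness)` decay) are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Hinton-style KD loss: KL divergence to the teacher's softened
    distribution, blended with ordinary cross-entropy on the hard labels.
    `temperature` and `alpha` are assumed hyperparameters."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)  # rescale the gradient of the softened targets
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def async_server_update(global_weights, client_weights, staleness,
                        base_mix=0.6):
    """Staleness-weighted asynchronous aggregation (FedAsync-style sketch):
    the server merges each client update as it arrives, shrinking the
    mixing rate for updates computed against an old global model."""
    mix = base_mix / (1.0 + staleness)  # assumed decay; other schedules exist
    return {name: (1.0 - mix) * w + mix * client_weights[name]
            for name, w in global_weights.items()}
```

Because the student model is lightweight, the weight dictionaries exchanged through an update rule like `async_server_update` stay small, which is the communication saving the abstract attributes to distillation.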

Cite (APA)

Mohammed, M. N., Zhang, X., Valero, M., & Xie, Y. (2023). Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation. In Proceedings - 2023 IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies, CHASE 2023 (pp. 207–208). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1145/3580252.3589436
