Federated learning (FL) enables decentralized training of a global model on edge devices without transferring raw data samples, thereby preserving privacy. With the ubiquity of wearable and mobile devices running health applications, FL has shown promise in the medical field for tasks such as medical imaging, disease diagnosis, and electronic health record (EHR) analysis. However, resource-constrained edge devices act as stragglers and slow down training. To address this issue and improve efficiency, we propose Asynchronous Federated Learning with Knowledge Distillation (AsyncFedKD). AsyncFedKD asynchronously trains a lightweight global student model under the guidance of a pre-trained teacher model, so that slow edge devices do not degrade training efficiency. Its knowledge distillation component compresses the model parameters for efficient communication during training. Evaluated on a sensitive mammography cancer dataset, AsyncFedKD achieves 88% accuracy with the global model.
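The poster abstract does not include implementation details; the sketch below shows the standard knowledge-distillation objective (Hinton et al., 2015) that a student update of this kind would typically minimize, where the lightweight student is trained against both the teacher's softened predictions and the ground-truth labels. PyTorch is assumed as the framework, and the temperature T and mixing weight alpha are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hypothetical student objective: weighted sum of a soft-label KL term
    (knowledge distilled from the teacher) and hard-label cross-entropy.
    T and alpha are illustrative, not taken from the paper."""
    # Soften both distributions with temperature T; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures (Hinton et al., 2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

Because the student is much smaller than the pre-trained teacher, only the student's parameters need to be exchanged during training, which is consistent with the communication-efficiency claim in the abstract.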
Citation:
Mohammed, M. N., Zhang, X., Valero, M., & Xie, Y. (2023). Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation. In Proceedings of the 2023 IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE 2023) (pp. 207–208). IEEE. https://doi.org/10.1145/3580252.3589436