Towards energy-aware federated learning on battery-powered clients

Abstract

Federated learning (FL) is an emerging branch of AI that enables edge devices to collaboratively train a global machine learning model without centralizing data, providing privacy by default. Despite this remarkable advancement, the paradigm comes with several challenges. In large-scale deployments, client heterogeneity is the norm and affects training quality, including accuracy, fairness, and training time. Moreover, energy consumption on these battery-constrained devices is largely unexplored and limits the wide adoption of FL. To address this issue, we develop EAFL, an energy-aware FL client selection method that accounts for energy consumption to maximize the participation of heterogeneous target devices. EAFL is a power-aware training algorithm that selects clients with higher battery levels while also maximizing system efficiency. Our design jointly minimizes the time-to-accuracy and maximizes the remaining on-device battery levels. EAFL improves testing model accuracy by up to 85% and decreases client drop-outs by up to 2.45X.
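To illustrate the selection idea described above, the following is a minimal sketch of battery-aware client selection. The scoring rule (a weighted mix of a statistical utility term and remaining battery level), the battery cutoff, and the names `select_clients`, `utility`, and `battery` are illustrative assumptions for this sketch, not the exact EAFL formulation from the paper.

```python
import random

def select_clients(clients, num_selected, alpha=0.5, min_battery=0.2):
    """Pick clients for the next FL round, favoring higher battery levels.

    clients:      list of dicts with keys 'id', 'utility' (0..1, e.g. a
                  loss-based importance score), and 'battery' (0..1 state
                  of charge).
    alpha:        trade-off between training utility and battery headroom.
    min_battery:  clients below this level are excluded to reduce drop-outs.
    """
    # Exclude clients likely to drop out due to low battery.
    eligible = [c for c in clients if c["battery"] >= min_battery]
    # Score = alpha * utility + (1 - alpha) * remaining battery.
    ranked = sorted(
        eligible,
        key=lambda c: alpha * c["utility"] + (1 - alpha) * c["battery"],
        reverse=True,
    )
    return [c["id"] for c in ranked[:num_selected]]

if __name__ == "__main__":
    pool = [
        {"id": i, "utility": random.random(), "battery": random.random()}
        for i in range(100)
    ]
    print(select_clients(pool, num_selected=10))
```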

Citation (APA)

Arouj, A., & Abdelmoniem, A. M. (2022). Towards energy-aware federated learning on battery-powered clients. In FedEdge 2022 - Proceedings of the 2022 1st ACM Workshop on Data Privacy and Federated Learning Technologies for Mobile Edge Network (pp. 7–12). Association for Computing Machinery, Inc. https://doi.org/10.1145/3556557.3557952
