Abstract
Non-intrusive load monitoring (NILM) is a computational technique that enables appliance-level energy disaggregation for sustainable energy management. Most NILM models require considerable training data to capture enough appliance signatures for robust model fitting, a requirement that local on-site training cannot satisfy because of limited data availability. It is therefore natural to pursue data collaboration among different stakeholders. Unfortunately, current collaborative learning approaches rely on deep learning, encryption, and differential privacy techniques that incur either expensive computation or inefficient communication. In this paper, we propose a cost-effective collaborative learning framework, Fed-GBM (Federated Gradient Boosting Machines), consisting of two-stage voting and node-level parallelism, to address the problems of co-modelling for NILM. Extensive experiments on real-world residential datasets show that Fed-GBM achieves remarkable convergence, accuracy, and computation and communication efficiency. The impact of the hyper-parameters in Fed-GBM is also studied extensively to guide practical use.
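The abstract only names the two core ideas; the following is a minimal, generic sketch of what a two-stage voting scheme for split selection in a federated gradient-boosted tree might look like, not the authors' exact Fed-GBM procedure. All function and variable names (local_top_k, global_vote, the toy data) are hypothetical and chosen purely for exposition; the first stage has each client vote for its locally best split candidates, and the second stage has the server tally votes so that only candidate indices, never raw load data, are exchanged.

# Illustrative sketch only: generic two-stage voting for split selection in a
# federated GBDT. This is an assumption-based example, not the paper's method.
import numpy as np

def local_top_k(gradients, hessians, X, thresholds, k=2):
    """Stage 1 (per client): score candidate (feature, threshold) splits on
    local data with the standard GBDT gain and return the top-k indices."""
    lam = 1.0  # L2 regularisation term in the gain formula
    G, H = gradients.sum(), hessians.sum()
    gains = []
    for f, t in thresholds:
        left = X[:, f] <= t
        G_l, H_l = gradients[left].sum(), hessians[left].sum()
        G_r, H_r = G - G_l, H - H_l
        gains.append(G_l**2 / (H_l + lam) + G_r**2 / (H_r + lam)
                     - G**2 / (H + lam))
    return np.argsort(gains)[-k:]

def global_vote(client_votes, n_candidates, k=2):
    """Stage 2 (server): tally every client's local top-k votes and keep the
    globally most-voted split candidates."""
    counts = np.zeros(n_candidates, dtype=int)
    for votes in client_votes:
        counts[votes] += 1
    return np.argsort(counts)[-k:]

# Toy demo: two simulated clients, squared-error objective.
rng = np.random.default_rng(0)
thresholds = [(0, 0.3), (0, 0.6), (1, 0.4), (1, 0.7)]  # (feature, threshold)
client_votes = []
for _ in range(2):
    X = rng.random((100, 2))
    y = (X[:, 0] > 0.5).astype(float) + 0.1 * rng.standard_normal(100)
    pred = np.zeros(100)
    g, h = pred - y, np.ones(100)  # gradients/hessians of squared error
    client_votes.append(local_top_k(g, h, X, thresholds))

print("Globally selected split candidates:",
      global_vote(client_votes, len(thresholds)))

In this sketch the communication per tree node is a handful of candidate indices per client, which is the kind of saving a voting-based federated GBDT targets compared with exchanging encrypted gradient histograms.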
Citation
Chang, X., Li, W., & Zomaya, A. Y. (2022). Fed-GBM: A Cost-effective Federated Gradient Boosting Tree for Non-Intrusive Load Monitoring. In e-Energy 2022 - Proceedings of the 2022 13th ACM International Conference on Future Energy Systems (pp. 63–75). Association for Computing Machinery, Inc. https://doi.org/10.1145/3538637.3538840