Non-intrusive load disaggregation based on a multi-scale attention residual network

Abstract

Non-intrusive load disaggregation (NILD) is of great significance to the development of smart grids. Current energy disaggregation methods extract features from the aggregate sequence, a process that easily loses load features and makes detection difficult, resulting in a low recognition rate for infrequently used electrical appliances. To address this problem, a non-intrusive sequential energy disaggregation method based on a multi-scale attention residual network is proposed. Multi-scale convolutions are used to learn features, and an attention mechanism is used to enhance the learning of load features. Residual learning further improves the performance of the algorithm, avoids network degradation, and improves the precision of load disaggregation. Experimental results on two benchmark datasets show that the proposed algorithm outperforms existing algorithms in load disaggregation accuracy and in identifying appliance on/off states, and that the attention mechanism further improves the disaggregation accuracy for infrequently used electrical appliances.
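
The architectural ingredients named in the abstract (parallel multi-scale convolutions, an attention mechanism that re-weights load features, and residual connections that avoid network degradation) can be illustrated with a short sketch. The PyTorch block below is only an illustration under assumed settings, not the authors' exact network: the channel count, kernel scales, and squeeze-and-excitation-style channel attention are hypothetical choices made for the example.

```python
# Minimal sketch of a multi-scale attention residual block for a
# 1-D aggregate-power sequence. Layer sizes and kernel scales are
# assumptions for illustration; the paper's configuration may differ.
import torch
import torch.nn as nn


class MultiScaleAttentionResidualBlock(nn.Module):
    def __init__(self, channels: int = 32, scales=(3, 5, 7)):
        super().__init__()
        # Parallel 1-D convolutions with different kernel sizes capture
        # load features at several temporal scales.
        self.branches = nn.ModuleList(
            [nn.Conv1d(channels, channels, k, padding=k // 2) for k in scales]
        )
        self.fuse = nn.Conv1d(channels * len(scales), channels, kernel_size=1)
        # Channel attention (squeeze-and-excitation style, assumed here)
        # re-weights the fused features to emphasise informative patterns.
        self.attention = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Conv1d(channels, channels // 4, kernel_size=1),
            nn.ReLU(),
            nn.Conv1d(channels // 4, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.relu = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, sequence_length) aggregate-power features
        multi_scale = torch.cat([branch(x) for branch in self.branches], dim=1)
        fused = self.fuse(multi_scale)
        weighted = fused * self.attention(fused)
        # Residual connection helps avoid degradation as depth grows.
        return self.relu(x + weighted)


if __name__ == "__main__":
    block = MultiScaleAttentionResidualBlock(channels=32)
    window = torch.randn(8, 32, 256)   # batch of aggregate-power windows
    print(block(window).shape)         # torch.Size([8, 32, 256])
```

Stacking a few such blocks between a convolutional encoder of the aggregate-power window and a per-appliance regression head would give one plausible way to assemble a disaggregator from these pieces; the exact arrangement in the paper may differ.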

Citation (APA)

Weng, L., Zhang, X., Qian, J., Xia, M., Xu, Y., & Wang, K. (2020). Non-intrusive load disaggregation based on a multi-scale attention residual network. Applied Sciences (Switzerland), 10(24), 1–17. https://doi.org/10.3390/app10249132
