Nonintrusive Load Disaggregation Based on Attention Neural Networks

Cited by: 0
Authors
Lin, Shunfu [1 ]
Yang, Jiayu [1 ]
Li, Yi [1 ]
Shen, Yunwei [1 ]
Li, Fangxing [2 ]
Bian, Xiaoyan [1 ]
Li, Dongdong [1 ]
Affiliations
[1] Shanghai Univ Elect Power, Coll Elect Engn, Shanghai, Peoples R China
[2] Univ Tennessee Knoxville, Dept Elect Engn & Comp Sci, Knoxville, TN USA
Funding
National Natural Science Foundation of China
Keywords
deep learning; dilated convolution; energy disaggregation; nonintrusive load monitoring (NILM); self-attention; sequence-to-point; two-subnetworks
DOI
10.1155/etep/3405849
CLC classification
TM [Electrical engineering]; TN [Electronic and communication technology]
Discipline codes
0808; 0809
Abstract
Nonintrusive load monitoring (NILM), also known as energy disaggregation, infers the energy consumption of individual appliances from household metered electricity data. NILM has recently garnered significant attention because it can help households reduce energy usage and improve their electricity-use behavior. In this paper, we propose a two-subnetwork model for NILM, consisting of a regression subnetwork and a seq2point-based classification subnetwork. In the regression subnetwork, stacked dilated convolutions extract multiscale features, and a self-attention mechanism is then applied to these features to obtain contextual representations. Compared with existing load disaggregation models, the proposed model has a larger receptive field and can capture the crucial information within the data. The study uses the low-frequency UK-DALE dataset, released in 2015, which contains timestamps, per-appliance power readings, and appliance state labels. House1 and House5 serve as the training set, while House2 data is reserved for testing. The proposed model achieves lower errors for all appliances than the compared algorithms. Specifically, it shows a 13.85% improvement in mean absolute error (MAE), a 21.27% improvement in signal aggregate error (SAE), and a 26.15% improvement in F1 score over existing algorithms. These results indicate that the proposed approach achieves superior disaggregation accuracy compared with existing methods.
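The regression subnetwork described in the abstract combines two standard building blocks: stacked dilated convolutions, whose dilation rates grow the receptive field over the aggregate-power window, and scaled dot-product self-attention, which turns the resulting multiscale features into contextual representations. The following is a minimal NumPy sketch of these two mechanisms only; it is not the authors' implementation, and all kernel values, dilation rates, and dimensions are illustrative assumptions.

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Causally padded 1D convolution of sequence x (shape (T,))
    with kernel w (shape (k,)) at the given dilation rate."""
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])  # left-pad so output length == T
    return np.array([
        sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

def self_attention(X):
    """Scaled dot-product self-attention over features X of shape (T, d):
    each time step is re-expressed as a weighted mix of all time steps."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                 # (T, T) pairwise similarity
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)             # row-wise softmax
    return A @ X                                  # contextual representations

# Toy aggregate-power window; three dilation rates give multiscale features.
rng = np.random.default_rng(0)
x = rng.random(16)                  # stand-in for a mains-power window
w = np.array([0.5, 0.3, 0.2])       # illustrative 3-tap kernel
feats = np.stack([dilated_conv1d(x, w, d) for d in (1, 2, 4)], axis=1)  # (16, 3)
ctx = self_attention(feats)
print(ctx.shape)  # (16, 3)
```

In the paper's setting, each dilated layer would carry learned kernels and the attention output would feed a regression head predicting one appliance's power; this sketch only shows how dilation widens the receptive field (here up to (3−1)·4 = 8 past samples) and how attention mixes information across the whole window.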
Pages: 15