Nonintrusive Load Monitoring based on Sequence-to-sequence Model With Attention Mechanism

Cited by: 0
Authors
Wang K. [1 ]
Zhong H. [1 ]
Yu N. [2 ]
Xia Q. [1 ]
Affiliations
[1] State Key Laboratory of Control and Simulation of Power Systems and Generation Equipments, Department of Electrical Engineering, Tsinghua University, Haidian District, Beijing
[2] University of California, Riverside, CA 92521
Source
Zhongguo Dianji Gongcheng Xuebao/Proceedings of the Chinese Society of Electrical Engineering | 2019 / Vol. 39 / No. 01
Funding
National Natural Science Foundation of China
Keywords
Attention mechanism; Deep learning; Nonintrusive load monitoring (NILM); Sequence-to-sequence (seq2seq)
DOI
10.13334/j.0258-8013.pcsee.181123
CLC Number
TP3 [Computing Technology, Computer Technology]
Discipline Code
0812
Abstract
Nonintrusive load monitoring (NILM) is one of the key applications of big-data analytics in smart power distribution systems for end-use customers. A successful implementation of NILM can improve knowledge of load composition and has great potential for increasing demand-side response. Traditional NILM algorithms have long suffered from high misjudgment rates and low accuracy of the disaggregated power values. To address these problems, a deep learning framework was adopted. Specifically, a NILM model based on a sequence-to-sequence (seq2seq) architecture with an attention mechanism was proposed. The model first embeds the input active-power time sequence into a high-dimensional vector, extracts information with a long short-term memory (LSTM)-based encoder, and then, using a decoder wrapped by an attention mechanism, selects the most relevant encoded information to decode into the final disaggregation results. Compared with existing models, the proposed network structure dramatically increases the model's ability to extract and utilize information. The proposed model was tested on the REFITPowerData dataset and compared with the state-of-the-art model. © 2019 Chin. Soc. for Elec. Eng.
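The architecture described in the abstract — embed the aggregate active-power sequence, encode it with an LSTM, then decode through an attention mechanism over the encoder states — can be sketched roughly as below. This is a minimal illustrative PyTorch sketch, not the authors' implementation: all layer sizes, the additive (Bahdanau-style) attention scoring, and the class name `NILMSeq2Seq` are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class NILMSeq2Seq(nn.Module):
    """Illustrative seq2seq-with-attention NILM sketch (sizes are arbitrary)."""

    def __init__(self, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Linear(1, embed_dim)              # lift scalar power readings to vectors
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTMCell(hidden_dim, hidden_dim)
        # additive attention scoring: v^T tanh(W_e h_enc + W_d h_dec)
        self.att_enc = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.att_dec = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.att_v = nn.Linear(hidden_dim, 1, bias=False)
        self.out = nn.Linear(hidden_dim, 1)               # per-step appliance power estimate

    def forward(self, x):
        # x: (batch, T) aggregate active power
        enc_out, (h, c) = self.encoder(self.embed(x.unsqueeze(-1)))
        hx, cx = h[0], c[0]                               # initial decoder state
        keys = self.att_enc(enc_out)                      # precompute encoder projections
        outputs = []
        for _ in range(x.size(1)):
            # attention weights over all encoder steps for the current decoder state
            scores = self.att_v(torch.tanh(keys + self.att_dec(hx).unsqueeze(1)))
            weights = torch.softmax(scores, dim=1)        # (batch, T, 1)
            context = (weights * enc_out).sum(dim=1)      # weighted sum of encoder states
            hx, cx = self.decoder(context, (hx, cx))
            outputs.append(self.out(hx))
        return torch.cat(outputs, dim=1)                  # (batch, T) disaggregated power
```

At each decoding step the attention weights let the decoder focus on the most relevant portions of the input window, which is the mechanism the abstract credits for the improved disaggregation accuracy.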
Pages: 75-83
Page count: 8