SAED: self-attentive energy disaggregation

Cited by: 18
Authors
Virtsionis-Gkalinikis, Nikolaos [1 ]
Nalmpantis, Christoforos [1 ]
Vrakas, Dimitris [1 ]
Affiliations
[1] Aristotle Univ Thessaloniki, Sch Informat, Thessaloniki 54124, Greece
Keywords
Energy disaggregation; Non-intrusive load monitoring; Artificial neural networks; Self-attention
DOI
10.1007/s10994-021-06106-3
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The field of energy disaggregation deals with estimating the electric consumption of individual appliances using only the aggregate consumption measured at a mains meter. Recent research developments have used deep neural networks and outperformed earlier methods based on Hidden Markov Models. On the other hand, deep learning models are computationally heavy and require huge amounts of data. The main objective of the current paper is to incorporate the attention mechanism into neural networks in order to reduce their computational complexity. Two versions of the attention mechanism are utilized, named Additive and Dot Attention. The experiments show that they perform on par, while the Dot mechanism is slightly faster. The two versions of self-attentive neural networks are compared against two state-of-the-art energy disaggregation deep learning models. The experimental results show that the proposed architecture achieves faster or equal training and inference times, with only a minor performance drop depending on the device or the dataset.
Pages: 4081-4100
Page count: 20
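
The abstract names two self-attention scoring variants, Dot (Luong-style) and Additive (Bahdanau-style). The following is a minimal NumPy sketch of those two scoring functions only, not the authors' SAED architecture; the function names, weight matrices (w1, w2, v), layer sizes, and toy inputs are illustrative assumptions.

# Minimal sketch (assumed, not the SAED implementation) of the two
# self-attention scoring variants mentioned in the abstract, applied to the
# hidden states of one window of aggregate-power features.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dot_self_attention(h):
    """Dot attention: scores are plain inner products between time steps."""
    scores = h @ h.T                    # (T, T) similarity matrix
    weights = softmax(scores, axis=-1)  # attention weights per time step
    return weights @ h                  # context vectors, shape (T, d)

def additive_self_attention(h, w1, w2, v):
    """Additive attention: scores come from a small feed-forward layer."""
    q = (h @ w1)[:, None, :]            # (T, 1, d_a) projected "queries"
    k = (h @ w2)[None, :, :]            # (1, T, d_a) projected "keys"
    scores = np.tanh(q + k) @ v         # (T, T): v^T tanh(W1 h_i + W2 h_j)
    weights = softmax(scores, axis=-1)
    return weights @ h                  # context vectors, shape (T, d)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, d, d_a = 6, 8, 16                # window length and feature sizes (assumed)
    h = rng.standard_normal((T, d))     # hidden states for one input window
    w1 = rng.standard_normal((d, d_a))
    w2 = rng.standard_normal((d, d_a))
    v = rng.standard_normal(d_a)
    print(dot_self_attention(h).shape)                   # (6, 8)
    print(additive_self_attention(h, w1, w2, v).shape)   # (6, 8)

Note that the dot variant needs only matrix products and no extra learned parameters, whereas the additive variant introduces the weights w1, w2 and v; this is consistent with the abstract's remark that the Dot mechanism is slightly faster while the two perform on par.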