SAED: self-attentive energy disaggregation

Cited by: 18
Authors
Virtsionis-Gkalinikis, Nikolaos [1 ]
Nalmpantis, Christoforos [1 ]
Vrakas, Dimitris [1 ]
Affiliations
[1] Aristotle Univ Thessaloniki, Sch Informat, Thessaloniki 54124, Greece
Keywords
Energy disaggregation; Non-intrusive load monitoring; Artificial neural networks; Self attention;
DOI
10.1007/s10994-021-06106-3
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The field of energy disaggregation deals with approximating the electric consumption of individual appliances using only the aggregate consumption measurement of a mains meter. Recent research has used deep neural networks, outperforming previous methods based on Hidden Markov Models. On the other hand, deep learning models are computationally heavy and require huge amounts of data. The main objective of this paper is to incorporate the attention mechanism into neural networks in order to reduce their computational complexity. Two versions of the attention mechanism are utilized, named Additive and Dot Attention. The experiments show that they perform on par, while the Dot mechanism is slightly faster. The two versions of self-attentive neural networks are compared against two state-of-the-art energy disaggregation deep learning models. The experimental results show that the proposed architecture achieves faster or equal training and inference times with only a minor performance drop, depending on the device and the dataset.
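The abstract contrasts two attention scoring schemes, Dot and Additive Attention. As a rough illustration only (not the paper's code), here is a minimal NumPy sketch of the two variants, assuming generic query/key/value matrices rather than the paper's actual power-sequence inputs; all names and shapes are illustrative:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_attention(q, k, v):
    """Scaled dot-product attention: weights = softmax(q k^T / sqrt(d))."""
    d = q.shape[-1]
    weights = softmax(q @ k.T / np.sqrt(d))  # (n, m); each row sums to 1
    return weights @ v                       # (n, d_v)

def additive_attention(q, k, v, Wq, Wk, u):
    """Bahdanau-style additive attention:
    score[i, j] = u . tanh(Wq q_i + Wk k_j)."""
    proj = np.tanh((q @ Wq.T)[:, None, :] + (k @ Wk.T)[None, :, :])  # (n, m, h)
    weights = softmax(proj @ u)  # (n, m)
    return weights @ v           # (n, d_v)

# Toy inputs (shapes only; not real power data)
rng = np.random.default_rng(0)
n, m, d, h = 4, 6, 8, 16
q = rng.normal(size=(n, d))
k = rng.normal(size=(m, d))
v = rng.normal(size=(m, d))
Wq = rng.normal(size=(h, d))
Wk = rng.normal(size=(h, d))
u = rng.normal(size=h)

out_dot = dot_attention(q, k, v)
out_add = additive_attention(q, k, v, Wq, Wk, u)
```

Both variants produce an output of the same shape; the dot version needs no extra learned parameters beyond the projections already in the network, which is consistent with the abstract's observation that it is slightly faster.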
Pages: 4081-4100
Page count: 20
Related papers
50 records total
  • [31] SAFE: Self-Attentive Function Embeddings for Binary Similarity
    Massarelli, Luca
    Di Luna, Giuseppe Antonio
    Petroni, Fabio
    Baldoni, Roberto
    Querzoni, Leonardo
    DETECTION OF INTRUSIONS AND MALWARE, AND VULNERABILITY ASSESSMENT (DIMVA 2019), 2019, 11543 : 309 - 329
  • [32] Self-Attentive Models for Real-Time Malware Classification
    Lu, Qikai
    Zhang, Hongwen
    Kinawi, Husam
    Niu, Di
    IEEE ACCESS, 2022, 10 : 95970 - 95985
  • [33] Global-Locally Self-Attentive Dialogue State Tracker
    Zhong, Victor
    Xiong, Caiming
    Socher, Richard
    PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1, 2018, : 1458 - 1467
  • [34] Verifiable and Energy Efficient Medical Image Analysis with Quantised Self-attentive Deep Neural Networks
    Sathish, Rakshith
    Khare, Swanand
    Sheet, Debdoot
    DISTRIBUTED, COLLABORATIVE, AND FEDERATED LEARNING, AND AFFORDABLE AI AND HEALTHCARE FOR RESOURCE DIVERSE GLOBAL HEALTH, DECAF 2022, FAIR 2022, 2022, 13573 : 178 - 189
  • [35] A Self-Attentive Model with Gate Mechanism for Spoken Language Understanding
    Li, Changliang
    Li, Liang
    Qi, Ji
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 3824 - 3833
  • [36] Self-Attentive Contrastive Learning for Conditioned Periocular and Face Biometrics
    Ng, Tiong-Sik
    Chai, Jacky Chen Long
    Low, Cheng-Yaw
    Teoh, Andrew Beng Jin
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2024, 19 : 3251 - 3264
  • [37] Self-Attentive Attributed Network Embedding Through Adversarial Learning
    Yu, Wenchao
    Cheng, Wei
    Aggarwal, Charu
    Zong, Bo
    Chen, Haifeng
    Wang, Wei
    2019 19TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2019), 2019, : 758 - 767
  • [38] Self-Attentive Recommendation for Multi-Source Review Package
    Chen, Pin-Yu
    Chen, Yu-Hsiu
    Shuai, Hong-Han
    Chang, Yung-Ju
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [39] SELF-ATTENTIVE NETWORKS FOR ONE-SHOT IMAGE RECOGNITION
    Fang, Pin
    Wang, Yisen
    Luo, Yuan
    2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2019, : 934 - 939