Attention-Based Deep Learning Approach for Nonintrusive and Simultaneous State Detection of Multiple Appliances in Smart Buildings

Times Cited: 3
Authors
Dash, Suryalok [1 ]
Sahoo, Nirod Chandra [1 ]
Affiliations
[1] Indian Inst Technol Bhubaneswar, Sch Elect Sci, Bhubaneswar 752050, India
Keywords
Computational modeling; Buildings; Task analysis; Feature extraction; Convolution; Aggregates; Convolutional neural networks; Appliance identification; attention models; deep learning (DL); dilated convolution; multilabel classification;
DOI
10.1109/JESTIE.2023.3333308
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
Nonintrusive appliance state detection techniques estimate the operating states of appliances in a building using only the building's aggregate energy consumption. Modern deep learning (DL) approaches have recently emerged as superior solutions to this task, but they deploy an individual model for each appliance whose state is to be identified. Although this lets each model learn its appliance's behavior accurately, it places an additional memory and computation burden on the computing device, e.g., the smart meter. This article addresses the problem by formulating state detection as a multilabel classification task in which a single model predicts the operating states of multiple appliances. In particular, a novel, lightweight DL model combining dilated, causal convolution with multihead attention is proposed for efficient appliance state prediction. The dilated, causal convolution layers automatically extract useful features from the aggregate data, and the attention layer uses those features selectively to learn the appliance states. The proposed model is validated in multiple scenarios using real energy data collected from different buildings. The test results demonstrate the model's feasibility and show that it outperforms various state-of-the-art multilabel classification techniques. Further, the model's benefits are highlighted through ablation studies, computational complexity analysis, and an investigation of how historical aggregate energy data affects its performance.
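The record contains no implementation details beyond the abstract; below is a minimal sketch, in a PyTorch-style idiom, of the kind of architecture it describes: dilated, causal 1-D convolutions extracting features from a window of aggregate power, a multihead self-attention layer over those features, and a sigmoid multilabel head producing one ON/OFF probability per appliance. The layer sizes, number of convolution stages, window length, and appliance count are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' implementation) of a dilated-causal-conv +
# multihead-attention model for multilabel appliance state detection.
# All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class DilatedCausalAttentionNet(nn.Module):
    def __init__(self, n_appliances=5, channels=32, heads=4):
        super().__init__()
        # Three stacked dilated convolutions (dilation 1, 2, 4); the left-only
        # padding in forward() keeps them causal, so each time step depends
        # only on current and past aggregate samples.
        self.convs = nn.ModuleList(
            [nn.Conv1d(1 if i == 0 else channels, channels,
                       kernel_size=3, dilation=2 ** i)
             for i in range(3)]
        )
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=heads,
                                          batch_first=True)
        self.head = nn.Linear(channels, n_appliances)

    def forward(self, x):                       # x: (batch, T) aggregate power window
        h = x.unsqueeze(1)                      # -> (batch, 1, T)
        for conv in self.convs:
            pad = (conv.kernel_size[0] - 1) * conv.dilation[0]
            h = torch.relu(conv(nn.functional.pad(h, (pad, 0))))  # causal left pad
        h = h.transpose(1, 2)                   # -> (batch, T, channels)
        h, _ = self.attn(h, h, h)               # self-attention over time steps
        logits = self.head(h.mean(dim=1))       # pool over time, one logit per appliance
        return torch.sigmoid(logits)            # independent ON/OFF probabilities


# Usage: a window of aggregate power readings -> per-appliance state probabilities
model = DilatedCausalAttentionNet()
probs = model(torch.randn(8, 128))              # shape (8, 5)
```

Training such a model would typically use a per-appliance binary cross-entropy loss on the sigmoid outputs, which is what allows a single network to replace the per-appliance models the abstract contrasts against.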
Pages: 1248-1258
Number of Pages: 11