Enhancing multivariate, multi-step residential load forecasting with spatiotemporal graph attention-enabled transformer

Cited: 0
Authors
Zhao, Pengfei [1 ]
Hu, Weihao [1 ]
Cao, Di [1 ,2 ]
Zhang, Zhenyuan [1 ]
Liao, Wenlong [3 ]
Chen, Zhe [4 ]
Huang, Qi [1 ,5 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Mech & Elect Engn, Chengdu, Peoples R China
[2] UESTC Guangdong, Inst Elect & Informat Engn, Dongguan, Peoples R China
[3] Ecole Polytech Fed Lausanne EPFL, Wind Engn & Renewable Energy Lab, CH-1015 Lausanne, Switzerland
[4] Aalborg Univ, Dept Energy Technol, Aalborg, Denmark
[5] Southwest Univ Sci & Technol, Sch Informat Engn, Mianyang, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Residential load forecasting; Spatiotemporal modeling; Deep neural network; TIME-SERIES; DEEP;
DOI
10.1016/j.ijepes.2024.110074
CLC classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline codes
0808; 0809;
Abstract
Short-term residential load forecasting (STRLF) holds great significance for the stable and economic operation of distributed power systems. Different households in the same region may exhibit similar consumption patterns owing to analogous environmental parameters. Incorporating the spatiotemporal correlations can enhance the load forecasting performance of individual households. To this end, a spatiotemporal graph attention (STGA)-enabled Transformer is proposed for multivariate, multi-step residential load forecasting in this paper. Specifically, the multiple residential loads are cast to a graph, and a Transformer with a graph sequence-to-sequence (Seq2Seq) structure is employed to model the multi-step load forecasting problem. Gated fusion-based STGA blocks are embedded in the encoder and decoder of the Transformer to extract dynamic spatial correlations and non-linear temporal patterns among multiple residential loads. A transform attention block is further designed to transfer historical graph observations into future graph predictions and alleviate the error propagation between the encoder and decoder. The embedding of multiple attention modules in the Seq2Seq framework allows us to capture the spatiotemporal correlations between residents and achieve confident inference of load values several steps ahead. Numerical simulations on residential data from three different regions demonstrate that the developed Transformer method improves multi-step load forecasting by at least 14.7% compared to state-of-the-art benchmarks.
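The core idea in the abstract, households cast as graph nodes whose embeddings are refined by spatial attention (across households at each time step) and temporal attention (across each household's history), then merged by a gated fusion, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the dimensions, the random gate weights `Wg`, and the single-head attention are illustrative assumptions only.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    """Scaled dot-product attention over the second-to-last axis."""
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
N, T, d = 5, 8, 16                   # households (graph nodes), time steps, feature dim
X = rng.standard_normal((N, T, d))   # embedded load histories

# Spatial attention: at each time step, every household attends to all others.
Xs = X.transpose(1, 0, 2)                              # (T, N, d)
H_spatial = attention(Xs, Xs, Xs).transpose(1, 0, 2)   # back to (N, T, d)

# Temporal attention: each household attends over its own history.
H_temporal = attention(X, X, X)                        # (N, T, d)

# Gated fusion: a sigmoid gate weights the spatial and temporal branches.
# In the paper the gate is learned; here Wg is a random placeholder.
Wg = rng.standard_normal((2 * d, d))
z = 1.0 / (1.0 + np.exp(-np.concatenate([H_spatial, H_temporal], -1) @ Wg))
H = z * H_spatial + (1 - z) * H_temporal

print(H.shape)  # (5, 8, 16)
```

In the full model, blocks like this are stacked inside the encoder and decoder, and a separate transform attention block maps the encoded history onto the future horizon so multi-step predictions are produced jointly rather than recursively.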
Pages: 14
Related papers
50 records
  • [1] Spatial and Temporal Attention-Enabled Transformer Network for Multivariate Short-Term Residential Load Forecasting
    Zhao, Hongshan
    Wu, Yuchen
    Ma, Libo
    Pan, Sichao
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72
  • [2] Interpretable LSTM Based on Mixture Attention Mechanism for Multi-Step Residential Load Forecasting
    Xu, Chongchong
    Li, Chaojie
    Zhou, Xiaojun
    ELECTRONICS, 2022, 11 (14)
  • [3] Spatiotemporal graph neural network for multivariate multi-step ahead time-series forecasting of sea temperature
    Kim, Jinah
    Kim, Taekyung
    Ryu, Joon-Gyu
    Kim, Jaeil
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 126
  • [4] Graph Transformer and LSTM Attention for VNF Multi-Step Workload Prediction in SFC
    Wu, Yu
    Liu, Jiayi
    Wang, Chen
    Xie, Xuemei
    Shi, Guangming
    IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2024, 21 (04): 4480-4493
  • [5] Enhancing Wind Power Forecast Precision via Multi-head Attention Transformer: An Investigation on Single-step and Multi-step Forecasting
    Sarkar, Md Rasel
    Anavatti, Sreenatha G.
    Dam, Tanmoy
    Pratama, Mahardhika
    Al Kindhi, Berlian
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023
  • [6] Multi-step forecasting of multivariate time series using multi-attention collaborative network
    He, Xiaoyu
    Shi, Suixiang
    Geng, Xiulin
    Yu, Jie
    Xu, Lingyu
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 211
  • [7] Probabilistic Multienergy Load Forecasting Based on Hybrid Attention-Enabled Transformer Network and Gaussian Process-Aided Residual Learning
    Zhao, Pengfei
    Hu, Weihao
    Cao, Di
    Zhang, Zhenyuan
    Huang, Yuehui
    Dai, Longcheng
    Chen, Zhe
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (06) : 8379 - 8393
  • [8] Spatiotemporal Graph Attention Network modeling for multi-step passenger demand prediction at multi-zone level
    Dong, Chengxiang
    Zhang, Kunpeng
    Wei, Xin
    Wang, Yongchao
    Yang, Yuhui
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2022, 603
  • [9] Attention-Based Models for Multivariate Time Series Forecasting: Multi-step Solar Irradiation Prediction
    Sakib, Sadman
    Mahadi, Mahin K.
    Abir, Samiur R.
    Moon, Al-Muzadded
    Shafiullah, Ahmad
    Ali, Sanjida
    Faisal, Fahim
    Nishat, Mirza M.
    HELIYON, 2024, 10 (06)
  • [10] Multi-attention Generative Adversarial Network for multi-step vegetation indices forecasting using multivariate time series
    Ferchichi, Aya
    Ben Abbes, Ali
    Barra, Vincent
    Rhif, Manel
    Farah, Imed Riadh
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 128