CRAformer: A cross-residual attention transformer for solar irradiation multistep forecasting

Cited: 0
Authors
Zhang, Zongbin [1 ,2 ]
Huang, Xiaoqiao [1 ,2 ,3 ]
Li, Chengli [1 ,2 ,3 ]
Cheng, Feiyan [1 ,2 ]
Tai, Yonghang [1 ,2 ,3 ]
Affiliations
[1] Yunnan Normal Univ, Engn Res Ctr Photoelect Detect & Percept Technol, Kunming 650500, Peoples R China
[2] Yunnan Normal Univ, Sch Phys & Elect Informat, Kunming 650500, Peoples R China
[3] Yunnan Key Lab Optoelect Informat Technol, Kunming Yunnan 650500, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-step irradiance forecasting; Cross-residual; Convolutional weighting; Dual-output mode; Photovoltaic power generation; HYBRID MODEL; RADIATION;
DOI
10.1016/j.energy.2025.135214
CLC number
O414.1 [Thermodynamics];
Subject classification code
Abstract
In recent years, solar energy has gained widespread adoption in smart grids due to its safety, environmental friendliness, abundance, and other advantages, driving the application of photovoltaic (PV) power generation technology. Accurately predicting solar irradiance is essential for ensuring the operational stability of PV power systems, making it a critical challenge for maintaining grid security and stability. Although Transformer models in deep learning have achieved significant advancements in solar irradiance forecasting, existing studies often treat cross-batch time-series data (TSD) as independent. By overlooking the complex coupling relationships between different data batches, they fail to fully capture the underlying patterns in TSD under varying conditions. Moreover, handling the long-term dependencies and short-term weather-induced fluctuations inherent in TSD remains difficult. To address these issues, this paper proposes an efficient Transformer model (CRAformer) based on Cross-Residual Attention (CRA) for multi-step solar irradiance forecasting. CRAformer effectively captures the deep coupling relationships within TSD through a residual scoring mechanism, which can dynamically adjust feature weights and balance long-term dependencies with short-term variations. Furthermore, by incorporating a dual-output mode and dual-attention strategy, the model can deconstruct complex data structures and guide the prediction process with greater accuracy. Additionally, the newly designed Convolutional Weighted Fusion Module (CWFM) enhances the model's capability to recognize diverse patterns and characteristics in TSD. By dynamically regulating the information transfer process, the CWFM improves the model's generalization, fitting accuracy, and robustness. To evaluate CRAformer's performance, four prediction tasks with varying time steps (24 h, 48 h, 72 h, 96 h) were designed using irradiance datasets from different locations: Denver, Clark, and Folsom. 
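The abstract does not give CRAformer's equations, but the core residual-scoring idea (carrying attention scores forward so a later layer refines, rather than recomputes, the attention pattern) can be illustrated with a minimal, dependency-free sketch. Everything below is an illustrative assumption, not the paper's actual formulation: the score-addition rule, function names, and shapes are hypothetical.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_scores(q, k):
    """Scaled dot-product scores: score[i][j] = q_i . k_j / sqrt(d)."""
    d = len(q[0])
    return [[sum(qi * kj for qi, kj in zip(q[i], k[j])) / math.sqrt(d)
             for j in range(len(k))] for i in range(len(q))]

def residual_attention(q, k, v, prev_scores=None):
    """One attention layer with residual scoring (sketch).

    If prev_scores is given, the previous layer's raw scores are added
    to this layer's before the softmax, so attention patterns are
    refined across layers instead of being rebuilt from scratch.
    Returns (output, raw_scores) so the scores can be passed on.
    """
    scores = attention_scores(q, k)
    if prev_scores is not None:
        scores = [[s + p for s, p in zip(row, prow)]
                  for row, prow in zip(scores, prev_scores)]
    weights = [softmax(row) for row in scores]
    out = [[sum(w * v[j][t] for j, w in enumerate(row))
            for t in range(len(v[0]))] for row in weights]
    return out, scores
```

Chaining two such layers shows the intended effect: the second layer's attention distribution is sharper than the first's for the same query/key pair, because the scores accumulate rather than reset.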
The experimental results demonstrate that, compared to the second-best model, iTransformer, CRAformer reduces the RMSE by an average of 5.6 %, 3.9 %, and 5.6 % across the four prediction steps for the datasets from Denver, Clark, and Folsom, respectively. These results indicate that CRAformer offers significant advantages in multi-step solar irradiance forecasting, providing a valuable reference for future model optimization.
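The reported improvements are relative reductions in root-mean-square error. For reference, the metric and the percentage comparison used in such claims can be computed as follows (function names are illustrative, not from the paper):

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error over a forecast horizon."""
    assert len(y_true) == len(y_pred) and y_true
    return math.sqrt(sum((t - p) ** 2
                         for t, p in zip(y_true, y_pred)) / len(y_true))

def pct_reduction(baseline_rmse, improved_rmse):
    """Relative RMSE reduction, e.g. the 5.6 % figures in the abstract."""
    return 100.0 * (baseline_rmse - improved_rmse) / baseline_rmse
```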
Pages: 22