Foreformer: an enhanced transformer-based framework for multivariate time series forecasting

Cited by: 14
Authors
Yang, Ye [1]
Lu, Jiangang [1,2]
Affiliations
[1] Zhejiang Univ, Coll Control Sci & Engn, State Key Lab Ind Control Technol, Hangzhou 310027, Peoples R China
[2] Zhejiang Lab, Hangzhou 311121, Peoples R China
Keywords
Multivariate time series forecasting; Attention mechanism; Deep learning; Multi-resolution; Static covariate; Transformer; Convolutional networks
DOI
10.1007/s10489-022-04100-3
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Multivariate time series forecasting (MTSF) has been studied extensively over the years, with ubiquitous applications in finance, traffic, the environment, and beyond. Recent investigations have demonstrated the potential of the Transformer to improve forecasting performance. The Transformer, however, has limitations that prevent it from being applied directly to MTSF, such as insufficient extraction of temporal patterns at different time scales, extraction of irrelevant information in the self-attention mechanism, and no targeted processing of static covariates. Motivated by the above, an enhanced Transformer-based framework for MTSF, named Foreformer, is proposed with three distinctive characteristics: (i) a multi-temporal resolution module that deeply captures temporal patterns at different scales, (ii) an explicit sparse attention mechanism that forces the model to prioritize the most contributive components, and (iii) a static covariates processing module for the nonlinear processing of static covariates. Extensive experiments on three real-world datasets demonstrate that Foreformer outperforms existing methods, making it a reliable approach for MTSF tasks.
Pages: 12521-12540
Page count: 20
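The abstract's second component, an explicit sparse attention mechanism that prioritizes the most contributive components, is commonly realized by keeping only the top-k attention scores per query and masking the rest before the softmax. Below is a minimal PyTorch sketch of that idea; the paper's exact formulation is not reproduced here, so the function name, tensor shapes, and the top_k parameter are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of explicit sparse attention via top-k score masking.
# Assumption: sparsity is imposed by keeping the k largest scores per query;
# the Foreformer paper may define the sparsification differently.
import torch
import torch.nn.functional as F

def explicit_sparse_attention(q, k, v, top_k=8):
    """q, k, v: (batch, heads, seq_len, d_head). Each query attends only to
    its top_k highest-scoring keys; the remaining scores are masked out."""
    d_head = q.size(-1)
    scores = torch.matmul(q, k.transpose(-2, -1)) / d_head ** 0.5  # (B, H, L, L)
    top_k = min(top_k, scores.size(-1))
    kth = scores.topk(top_k, dim=-1).values[..., -1:]     # k-th largest score per query
    masked = scores.masked_fill(scores < kth, float("-inf"))  # drop less relevant keys
    attn = F.softmax(masked, dim=-1)                       # sparse attention weights
    return torch.matmul(attn, v)

# Usage on random tensors: batch 2, 4 heads, sequence length 24, head dim 16.
q = torch.randn(2, 4, 24, 16)
out = explicit_sparse_attention(q, torch.randn_like(q), torch.randn_like(q))
print(out.shape)  # torch.Size([2, 4, 24, 16])
```

The hard top-k mask is one simple way to suppress the "irrelevant information in the self-attention" that the abstract mentions; soft alternatives (e.g., sparsity-inducing normalizers) would serve the same purpose.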