TFformer: A time-frequency domain bidirectional sequence-level attention based transformer for interpretable long-term sequence forecasting

Times Cited: 0
Authors
Zhao, Tianlong [1 ]
Fang, Lexin [1 ]
Ma, Xiang [1 ]
Li, Xuemei [1 ]
Zhang, Caiming [1 ,2 ]
Affiliations
[1] Shandong Univ, Sch Software, Jinan 250101, Peoples R China
[2] Shandong Prov Lab Future Intelligence & Financial, Yantai 264005, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Long-term forecasting; Interpretable forecasting; Frequency decomposition; Sequential attention; Periodic extension;
DOI
10.1016/j.patcog.2024.110994
CLC Number
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Transformer-based methods have shown strong predictive performance in long-term time series forecasting. However, their attention mechanism destroys temporal dependence and has quadratic complexity, which makes the prediction process difficult to interpret and limits their application in tasks requiring interpretability. To address this issue, this paper proposes TFformer, a highly interpretable long-term sequence forecasting model. TFformer decomposes a time series into a low-frequency trend component and a high-frequency periodic component via frequency decomposition and forecasts them separately. The periodic information in the high-frequency component is enhanced with sequential frequency attention, and the temporal patterns of the two components are then obtained by feature extraction. Exploiting the periodicity of the time domain, TFformer predicts future periodic patterns through periodic extension using sequential periodic matching attention. Finally, the predicted future periodic pattern and the extracted trend pattern are reconstructed into the future series. Because it retains temporal dependence with sequence-level attention, TFformer provides an interpretable forecasting process with low time complexity. TFformer achieves significant prediction performance in both univariate and multivariate forecasting across six datasets. Detailed experimental results and analyses verify the effectiveness and generalization of TFformer.
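The frequency-decomposition step described in the abstract can be illustrated with a short, self-contained sketch. The snippet below is an assumption for illustration only: it uses a plain FFT low-pass/high-pass split with a hypothetical `cutoff` parameter, and does not reproduce TFformer's actual decomposition, attention modules, or learnable components.

```python
# Minimal sketch (assumption): split a 1-D series into a low-frequency trend
# component and a high-frequency periodic component via an FFT cutoff, as a
# stand-in for the frequency decomposition described in the abstract.
import numpy as np

def frequency_decompose(x: np.ndarray, cutoff: int = 5):
    """Return (trend, period): low-/high-frequency parts of a 1-D series."""
    spec = np.fft.rfft(x)
    low = spec.copy()
    low[cutoff:] = 0            # keep only the lowest `cutoff` frequency bins
    high = spec - low           # remaining bins hold the high-frequency content
    trend = np.fft.irfft(low, n=len(x))
    period = np.fft.irfft(high, n=len(x))
    return trend, period

if __name__ == "__main__":
    t = np.arange(336)
    series = 0.01 * t + np.sin(2 * np.pi * t / 24)   # linear trend + daily cycle
    trend, period = frequency_decompose(series, cutoff=3)
    # The two components sum back to the original series (up to float error).
    assert np.allclose(trend + period, series, atol=1e-8)
```

By linearity of the inverse FFT, the two components always reconstruct the input exactly, so the split loses no information; the choice of cutoff (here hard-coded) is the modeling decision that separates trend from periodic content.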
Pages: 14
Related Papers
35 records in total
  • [1] BSAformer: bidirectional sequence splitting aggregation attention mechanism for long term series forecasting
    Zhu, Qingbo
    Han, Jialin
    Yang, Sheng
    Xie, Zhiqiang
    Tian, Bo
    Wan, Haibo
    Chai, Kai
    COMPLEX & INTELLIGENT SYSTEMS, 2025, 11 (04)
  • [2] Interpretable Long-Term Forecasting Based on Dynamic Attention in Smart City
    Ma, Changxia
    Xie, Jun
    Yang, Lisha
    Zhong, Zhaoman
    Zhao, Xuefeng
    Hu, Wenbin
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2024, 38 (07)
  • [3] DTAFORMER: Directional Time Attention Transformer For Long-Term Series Forecasting
    Chang, Jiang
    Yue, Luhui
    Liu, Qingshan
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2024, PT IV, 2025, 15034 : 162 - 180
  • [4] Periodic Attention-based Stacked Sequence to Sequence framework for long-term travel time prediction
    Huang, Yu
    Dai, Hao
    Tseng, Vincent S.
    KNOWLEDGE-BASED SYSTEMS, 2022, 258
  • [5] TFDNet: Time-Frequency enhanced Decomposed Network for long-term time series forecasting
    Luo, Yuxiao
    Zhang, Songming
    Lyu, Ziyu
    Hu, Yuhan
    PATTERN RECOGNITION, 2025, 162
  • [6] Multivariate Long Sequence Time Series Forecasting Based on Robust Spatiotemporal Attention
    Zhang, Dandan
    Zhang, Zhiqiang
    Wang, Yun
    2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024, 2024,
  • [7] Segmented Frequency-Domain Correlation Prediction Model for Long-Term Time Series Forecasting Using Transformer
    Tong, Haozhuo
    Kong, Lingyun
    Liu, Jie
    Gao, Shiyan
    Xu, Yilu
    Chen, Yuezhe
    IET SOFTWARE, 2024, 2024
  • [8] DBAFormer: A Double-Branch Attention Transformer for Long-Term Time Series Forecasting
    Huang, Ji
    Ma, Minbo
    Dai, Yongsheng
    Hu, Jie
    Du, Shengdong
    HUMAN-CENTRIC INTELLIGENT SYSTEMS, 2023, 3 (03) : 263 - 274
  • [9] Long-Term Forecasting Using MAMTF: A Matrix Attention Model Based on the Time and Frequency Domains
    Guo, Kaixin
    Yu, Xin
    APPLIED SCIENCES-BASEL, 2024, 14 (07)
  • [10] Long-term forecasting using transformer based on multiple time series
    Lee, Jaeyong
    Kim, Hyun Jun
    Lim, Changwon
    KOREAN JOURNAL OF APPLIED STATISTICS, 2024, 37 (05) : 583 - 598