Segmented Frequency-Domain Correlation Prediction Model for Long-Term Time Series Forecasting Using Transformer

Cited by: 0
Authors
Tong, Haozhuo [1 ]
Kong, Lingyun [1 ]
Liu, Jie [1 ]
Gao, Shiyan [1 ]
Xu, Yilu [1 ]
Chen, Yuezhe
Affiliations
[1] Xijing Univ, Sch Elect Informat, Xian 710000, Peoples R China
Keywords
All Open Access; Gold;
DOI
10.1049/2024/2920167
Chinese Library Classification
TP31 [Computer Software];
Discipline Codes
081202; 0835;
Abstract
Long-term time series forecasting has received significant attention from researchers in recent years. Transformer model-based approaches have emerged as promising solutions in this domain. Nevertheless, most existing methods rely on point-by-point self-attention mechanisms or employ transformations, decompositions, and reconstructions of the entire sequence to capture dependencies. The point-by-point self-attention mechanism becomes impractical for long-term time series forecasting due to its quadratic complexity with respect to the time series length. Decomposition and reconstruction methods may introduce information loss, leading to performance bottlenecks in the models. In this paper, we propose a Transformer-based forecasting model called NPformer. Our method introduces a novel multiscale segmented Fourier attention mechanism. By segmenting the long-term time series and performing discrete Fourier transforms on different segments, we aim to identify frequency-domain correlations between these segments. This allows us to capture dependencies more effectively. In addition, we incorporate a normalization module and a desmoothing factor into the model. These components address the problem of oversmoothing that arises in sequence decomposition methods. Furthermore, we introduce an isometry convolution method to enhance the prediction accuracy of the model. The experimental results demonstrate that NPformer outperforms other Transformer-based methods in long-term time series forecasting.
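The core idea described in the abstract — splitting a long series into segments, taking a discrete Fourier transform of each segment, and scoring segments against one another in the frequency domain — can be sketched roughly as follows. This is a minimal illustration, not the paper's actual NPformer implementation: the function name, the inner-product similarity, and the softmax mixing step are all assumptions chosen for clarity.

```python
import numpy as np

def segmented_fourier_attention(x, seg_len):
    """Hypothetical sketch of segmented frequency-domain correlation.

    x: 1-D time series whose length is divisible by seg_len.
    Returns the mixed series and the segment-to-segment weight matrix.
    """
    segs = x.reshape(-1, seg_len)            # (n_seg, seg_len)
    spec = np.fft.rfft(segs, axis=1)         # per-segment spectra via real FFT
    # Frequency-domain similarity between every pair of segments
    # (real part of the inner product of their spectra).
    sim = np.real(spec @ spec.conj().T)      # (n_seg, n_seg)
    # Row-wise softmax turns similarities into attention-like weights.
    w = np.exp(sim - sim.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    # Mix segments according to their frequency-domain affinity.
    mixed = w @ segs
    return mixed.reshape(-1), w

t = np.arange(64, dtype=float)
series = np.sin(2 * np.pi * t / 8.0)
mixed, weights = segmented_fourier_attention(series, seg_len=16)
```

Because the comparison happens between whole segments rather than individual time points, the attention matrix is `n_seg × n_seg` instead of `T × T`, which is the source of the complexity reduction the abstract attributes to segment-level attention.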
Pages: 16
Related Papers
50 items in total
  • [41] Analysis of series resonant converter using a frequency-domain model
    Crompton Greaves Ltd, Bombay, India
    J Inst Eng India: Electr Eng Div, (204-210)
  • [42] Long-Term Trajectory Prediction Model Based on Transformer
    Tong, Qiang
    Hu, Jinqing
    Chen, Yuli
    Guo, Dongdong
    Liu, Xiulei
    IEEE ACCESS, 2023, 11 : 143695 - 143703
  • [43] TFformer: A time-frequency domain bidirectional sequence-level attention based transformer for interpretable long-term sequence forecasting
    Zhao, Tianlong
    Fang, Lexin
    Ma, Xiang
    Li, Xuemei
    Zhang, Caiming
    PATTERN RECOGNITION, 2025, 158
  • [44] DTSFormer: Decoupled temporal-spatial diffusion transformer for enhanced long-term time series forecasting
    Zhu, Jiaming
    Liu, Dezhi
    Chen, Huayou
    Liu, Jinpei
    Tao, Zhifu
    KNOWLEDGE-BASED SYSTEMS, 2025, 309
  • [45] MEAformer: An all-MLP transformer with temporal external attention for long-term time series forecasting
    Huang, Siyuan
    Liu, Yepeng
    Cui, Haoyi
    Zhang, Fan
    Li, Jinjiang
    Zhang, Xiaofeng
    Zhang, Mingli
    Zhang, Caiming
    INFORMATION SCIENCES, 2024, 669
  • [46] Long-term time series prediction using OP-ELM
    Grigorievskiy, Alexander
    Miche, Yoan
    Ventela, Anne-Mari
    Severin, Eric
    Lendasse, Amaury
    NEURAL NETWORKS, 2014, 51 : 50 - 56
  • [47] Long-term prediction of time series using fuzzy cognitive maps
    Feng, Guoliang
    Zhang, Liyong
    Yang, Jianhua
    Lu, Wei
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2021, 102
  • [48] A modified grey forecasting model for long-term prediction
    Hsu, CC
    Chen, CY
    JOURNAL OF THE CHINESE INSTITUTE OF ENGINEERS, 2003, 26 (03) : 301 - 308
  • [49] Adapt to small-scale and long-term time series forecasting with enhanced multidimensional correlation
    Li, Xinshuai
    Luo, Senlin
    Pan, Limin
    Wu, Zhouting
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238
  • [50] Hidformer: Hierarchical dual-tower transformer using multi-scale mergence for long-term time series forecasting
    Liu, Zhaoran
    Cao, Yizhi
    Xu, Hu
    Huang, Yuxin
    He, Qunshan
    Chen, Xinjie
    Tang, Xiaoyu
    Liu, Xinggao
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 239