Boosting MLPs with a Coarsening Strategy for Long-Term Time Series Forecasting

Citations: 0
Authors
Bian, Nannan [1 ]
Zhu, Minhong [2 ]
Chen, Li [3 ]
Cai, Weiran [1 ]
Affiliations
[1] Soochow Univ, Sch Comp Sci & Technol, Suzhou 215006, Peoples R China
[2] Soochow Univ, Sch Biol & Basic Med Sci, Suzhou 215006, Peoples R China
[3] Shaanxi Normal Univ, Sch Phys & Informat Technol, Xian 710061, Peoples R China
Keywords
time series forecasting; coarsening strategy; pattern extraction;
DOI
10.1007/978-981-97-5678-0_36
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Deep learning methods have been exerting their strengths in long-term time series forecasting. However, they often struggle to strike a balance between expressive power and computational efficiency. Resorting to multi-layer perceptrons (MLPs) provides a compromise, yet they suffer from two critical problems caused by their intrinsic point-wise mapping mode: deficient contextual dependencies and an inadequate information bottleneck. Here, we propose the Coarsened Perceptron Network (CP-Net), featuring a coarsening strategy that alleviates the above problems of prototype MLPs by forming information granules in place of solitary temporal points. CP-Net primarily utilizes a two-stage framework for extracting semantic and contextual patterns, which preserves correlations over larger timespans and filters out volatile noise. This is further enhanced by a multi-scale setting, in which patterns of diverse granularities are fused towards a comprehensive prediction. Based purely on convolutions of structural simplicity, CP-Net maintains linear computational complexity and low runtime, while demonstrating an improvement of 4.1% over the SOTA method on seven forecasting benchmarks. Code is available at https://github.com/nannanbian/CPNet
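The abstract's core idea — replacing solitary temporal points with averaged "information granules" at several scales and fusing per-scale predictions — can be sketched as follows. This is a minimal illustrative toy, not the authors' implementation (which is at the linked repository): the `coarsen`, `multi_scale_forecast` names, the scale set, and the random linear heads are all assumptions made purely to show the data flow.

```python
import numpy as np

def coarsen(series, scale):
    """Average-pool a 1-D series into non-overlapping granules of `scale` points.
    Each granule replaces `scale` solitary points with their mean, keeping
    local context while smoothing out volatile noise."""
    n = len(series) // scale * scale          # drop any incomplete trailing granule
    return series[:n].reshape(-1, scale).mean(axis=1)

def multi_scale_forecast(series, horizon, scales=(1, 2, 4), rng=None):
    """Toy multi-scale predictor: one linear head per coarsening scale,
    each mapping its granules to the forecast horizon, fused by averaging.
    Weights are random here purely to demonstrate shapes and flow."""
    rng = np.random.default_rng(0) if rng is None else rng
    preds = []
    for s in scales:
        granules = coarsen(series, s)
        W = rng.normal(scale=0.1, size=(horizon, len(granules)))  # hypothetical per-scale head
        preds.append(W @ granules)
    return np.mean(preds, axis=0)             # fuse patterns of diverse granularities

x = np.sin(np.linspace(0, 8 * np.pi, 96))     # look-back window of 96 points
y_hat = multi_scale_forecast(x, horizon=24)
print(y_hat.shape)                            # (24,)
```

In the actual CP-Net the granule formation and pattern extraction are convolutional and trained end to end; the sketch only conveys why coarsening keeps the per-scale mapping linear in the input length.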
Pages: 422-433 (12 pages)