Transformers in Time Series: A Survey

Cited by: 0
Authors
Wen, Qingsong [1 ]
Zhou, Tian [2 ]
Zhang, Chaoli [2 ]
Chen, Weiqi [2 ]
Ma, Ziqing [2 ]
Yan, Junchi [3 ]
Sun, Liang [1 ]
Affiliations
[1] Alibaba Grp, DAMO Acad, Bellevue, WA 98004 USA
[2] Alibaba Grp, DAMO Acad, Hangzhou, Peoples R China
[3] Shanghai Jiao Tong Univ, Dept CSE, MoE Key Lab Artificial Intelligence, Shanghai, Peoples R China
Keywords: none listed
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract
Transformers have achieved superior performance in many tasks in natural language processing and computer vision, which has also triggered great interest in the time series community. Among the multiple advantages of Transformers, the ability to capture long-range dependencies and interactions is especially attractive for time series modeling, leading to exciting progress in various time series applications. In this paper, we systematically review Transformer schemes for time series modeling by highlighting their strengths as well as limitations. In particular, we examine the development of time series Transformers from two perspectives. From the perspective of network structure, we summarize the adaptations and modifications that have been made to Transformers in order to accommodate the challenges of time series analysis. From the perspective of applications, we categorize time series Transformers based on common tasks, including forecasting, anomaly detection, and classification. Empirically, we perform robustness analysis, model size analysis, and seasonal-trend decomposition analysis to study how Transformers perform on time series. Finally, we discuss and suggest future directions to provide useful research guidance.
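The abstract mentions a seasonal-trend decomposition analysis. As a minimal illustration of what such a decomposition does (this is a classical additive moving-average decomposition sketch, not the paper's own method; the function name `seasonal_trend_decompose` and its interface are assumptions for illustration):

```python
import numpy as np

def seasonal_trend_decompose(x, period):
    """Additive decomposition of a 1-D series: x = trend + seasonal + residual.

    trend    : centered moving average over one full period
    seasonal : per-phase mean of the detrended series, centered to zero
    residual : whatever is left over
    """
    x = np.asarray(x, dtype=float)
    # Trend: smooth with a moving-average kernel one period wide.
    kernel = np.ones(period) / period
    trend = np.convolve(x, kernel, mode="same")
    detrended = x - trend
    # Seasonal: average detrended values at each position within the period,
    # then tile the one-period pattern back to the full series length.
    pattern = np.array([detrended[i::period].mean() for i in range(period)])
    pattern -= pattern.mean()  # center so the components sum back to x
    seasonal = np.resize(pattern, x.size)
    residual = x - trend - seasonal
    return trend, seasonal, residual

# Example: a linear trend plus a yearly cycle on monthly data.
t = np.arange(48)
x = 0.1 * t + np.sin(2 * np.pi * t / 12)
trend, seasonal, residual = seasonal_trend_decompose(x, period=12)
```

By construction the three components reconstruct the input exactly; in the survey's setting, such a decomposition is used to probe how well Transformer forecasts track trend versus seasonal structure separately.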
Pages: 6778-6786 (9 pages)
Related Papers (50 total)
  • [31] Liu, Yong; Wu, Haixu; Wang, Jianmin; Long, Mingsheng. Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
  • [32] Fredriksen, Helge; Burman, Per Joel; Woldaregay, Ashenafi; Mikalsen, Karl Oyvind; Nymo, Stale. Categorization of phenotype trajectories utilizing transformers on clinical time-series. Proceedings of the 2024 9th International Conference on Machine Learning Technologies (ICMLT 2024), 2024: 311-316.
  • [33] Tay, Yi; Dehghani, Mostafa; Bahri, Dara; Metzler, Donald. Efficient Transformers: A Survey. ACM Computing Surveys, 2023, 55(6).
  • [34] Selva, Javier; Johansen, Anders S.; Escalera, Sergio; Nasrollahi, Kamal; Moeslund, Thomas B.; Clapes, Albert. Video Transformers: A Survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(11): 12922-12943.
  • [35] Khan, Salman; Naseer, Muzammal; Hayat, Munawar; Zamir, Syed Waqas; Khan, Fahad Shahbaz; Shah, Mubarak. Transformers in Vision: A Survey. ACM Computing Surveys, 2022, 54(10s).
  • [36] Liu, Yang; Zhang, Yao; Wang, Yixin; Hou, Feng; Yuan, Jin; Tian, Jiang; Zhang, Yang; Shi, Zhongchao; Fan, Jianping; He, Zhiqiang. A Survey of Visual Transformers. IEEE Transactions on Neural Networks and Learning Systems, 2024, 35(6): 7478-7498.
  • [37] Schmieg, Tobias; Lanquillon, Carsten. Time Series Representation Learning: A Survey on Deep Learning Techniques for Time Series Forecasting. Artificial Intelligence in HCI, Part I (AI-HCI 2024), 2024, 14734: 422-435.
  • [38] Yuan, Yuan; Lin, Lei. Self-Supervised Pretraining of Transformers for Satellite Image Time Series Classification. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2021, 14: 474-487.
  • [39] Wu, Binrong; Wang, Lin; Zeng, Yu-Rong. Interpretable wind speed prediction with multivariate time series and temporal fusion transformers. Energy, 2022, 252.
  • [40] Cao, Lele; Halvardsson, Gustaf; McCornack, Andrew; von Ehrenheim, Vilhelm; Herman, Pawel. Beyond Gut Feel: Using Time Series Transformers to Find Investment Gems. Artificial Neural Networks and Machine Learning (ICANN 2024), Part IX, 2024, 15024: 373-388.