Improving Multi-Step Prediction of Learned Time Series Models

Cited by: 0
Authors:
Venkatraman, Arun [1]
Hebert, Martial [1]
Bagnell, J. Andrew [1]
Affiliation:
[1] Carnegie Mellon University, The Robotics Institute, Pittsburgh, PA 15213, USA
Funding:
U.S. National Science Foundation
Keywords: (none listed)
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline classification codes: 081104; 0812; 0835; 1405
Abstract
Most typical statistical and machine learning approaches to time series modeling optimize a single-step prediction error. In multi-step simulation, the learned model is applied iteratively, feeding its previous output back in as the new input. Any such predictor, however, inevitably introduces errors, and these compounding errors change the input distribution for future prediction steps, breaking the train-test i.i.d. assumption common in supervised learning. We present an approach that reuses training data to make a no-regret learner robust to errors made during multi-step prediction. Our insight is to formulate the problem as imitation learning; the training data serves as a "demonstrator" by providing corrections for the errors made during multi-step prediction. Through this reduction of multi-step time series prediction to imitation learning, we establish a strong theoretical performance guarantee relating training error to multi-step prediction error. We present experimental results for our method, Data as Demonstrator (DaD), and show significant improvement over the traditional approach in two notably different domains: dynamic system modeling and video texture prediction.
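Below is a minimal Python sketch of the data-aggregation idea the abstract describes: roll the learned one-step model forward along each training series, use the true next states as corrective targets for the states the rollout actually reaches, and retrain on the aggregated pairs. It is an illustration only; the ridge-regression one-step model and the names rollout and dad_train are assumptions of this sketch, not the authors' implementation.

import numpy as np
from sklearn.linear_model import Ridge


def rollout(model, x0, horizon):
    # Iteratively apply the one-step model, feeding each output back as the next input.
    states = [np.asarray(x0, dtype=float)]
    for _ in range(horizon - 1):
        states.append(model.predict(states[-1].reshape(1, -1))[0])
    return np.array(states)


def dad_train(trajectories, n_iterations=10):
    # trajectories: list of arrays of shape (T, d), the training time series.
    # Initial dataset: ordinary one-step pairs (x_t -> x_{t+1}).
    X = np.vstack([traj[:-1] for traj in trajectories])
    Y = np.vstack([traj[1:] for traj in trajectories])
    model = Ridge(alpha=1e-3).fit(X, Y)

    for _ in range(n_iterations):
        new_X, new_Y = [], []
        for traj in trajectories:
            pred = rollout(model, traj[0], len(traj))
            # The training data acts as the "demonstrator": whatever state the
            # rollout reaches at time t, the corrective target is the true x_{t+1}.
            new_X.append(pred[:-1])
            new_Y.append(traj[1:])
        # Aggregate the correction pairs with the existing data and retrain.
        X = np.vstack([X] + new_X)
        Y = np.vstack([Y] + new_Y)
        model = Ridge(alpha=1e-3).fit(X, Y)
    return model

Note that the paper's guarantee is stated for no-regret learners, and the returned model would typically be chosen by held-out multi-step error over the iterations; this sketch simply returns the last model.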
Pages: 3024-3030
Number of pages: 7
Related Papers (50 records in total)
  • [21] Masum, S., Liu, Y., Chiverton, J.: Multi-step Time Series Forecasting of Electric Load Using Machine Learning Models. Artificial Intelligence and Soft Computing (ICAISC 2018), Part I, 2018, 10841: 148-159
  • [22] Galván, I. M., Isasi, P.: Multi-step Learning Rule for Recurrent Neural Models: An Application to Time Series Forecasting. Neural Processing Letters, 2001, 13(2): 115-133
  • [24] Chen, G., Guo, G., Yang, K., Yang, D.: Multi-step Prediction of Zero Series and Gap Series of Riemann Zeta Function. Results in Physics, 2021, 27
  • [25] Zhang, W. Y., Xie, J. F., Wan, G. C., Tong, M. S.: Single-step and Multi-step Time Series Prediction for Urban Temperature Based on LSTM Model of TensorFlow. 2021 Photonics & Electromagnetics Research Symposium (PIERS 2021), 2021: 1531-1535
  • [26] Fu, K., Li, H., Shi, X.: An Encoder-Decoder Architecture with Fourier Attention for Chaotic Time Series Multi-step Prediction. Applied Soft Computing, 2024, 156
  • [27] Akiyama, T., Tanaka, G.: Computational Efficiency of Multi-Step Learning Echo State Networks for Nonlinear Time Series Prediction. IEEE Access, 2022, 10: 28535-28544
  • [28] Liu, Z., Cheng, Q., Zhang, D.: Multi-step Local Prediction Model with Partial Least Squares Regression for Chaotic Time Series. Dynamics of Continuous, Discrete and Impulsive Systems, Series B: Applications & Algorithms, 2006, 13E: 1972-1975
  • [29] Akiyama, T., Tanaka, G.: Analysis on Characteristics of Multi-Step Learning Echo State Networks for Nonlinear Time Series Prediction. 2019 International Joint Conference on Neural Networks (IJCNN), 2019
  • [30] Markou, I., Rodrigues, F., Pereira, F. C.: Multi-step Ahead Prediction of Taxi Demand Using Time-Series and Textual Data. Urban Mobility - Shaping the Future Together, 2019, 41: 540-544