Multi-step learning rule for recurrent neural models: An application to time series forecasting

Cited: 17
Authors: Galván, IM [1]; Isasi, P [1]
Affiliation: [1] Univ Carlos III Madrid, Dept Comp Sci, Madrid 28911, Spain
Keywords: multi-step prediction; neural networks; time series; time series modelling
DOI: 10.1023/A:1011324221407
CLC number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Multi-step prediction is a difficult task that has attracted increasing interest in recent years. It aims to predict several steps into the future starting from current information. This work develops nonlinear neural models for building multi-step time series prediction schemes. In that context, the most popular neural models are based on traditional feedforward neural networks. However, such models can be at a disadvantage in long-term prediction problems because they are trained to predict only the next sampling time. In this paper, a neural model based on a partially recurrent neural network is proposed as a better alternative. The recurrent model is trained with a learning phase designed for long-term prediction, which yields better predictions of the time series further into the future. To validate the recurrent model's ability to predict the dynamic behaviour of a series into the future, three time series have been used as study cases: an artificial series, the logistic map, and two real series, sunspot and laser data. Models based on feedforward neural networks have also been built and compared against the proposed model. The results suggest that the recurrent model can improve prediction accuracy.
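The abstract contrasts feedforward models trained only for the next sampling time with a recurrent model trained for long-term prediction. A minimal sketch of the baseline strategy being criticised, namely fitting a one-step predictor and iterating it by feeding each prediction back as the next input, using the logistic map the paper lists as a study case (the function names, the map parameter r = 3.97, and the quadratic feature choice are illustrative assumptions, not details from the paper):

```python
# Hypothetical sketch: iterated one-step forecasting on the logistic map.
# The one-step model x_{t+1} ~ a*x_t + b*x_t^2 is fitted by least squares
# and then iterated to obtain a multi-step forecast.

def logistic_series(x0=0.5, r=3.97, n=200):
    """Generate a logistic-map time series x_{t+1} = r * x_t * (1 - x_t)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def fit_one_step(xs):
    """Least-squares fit of x_{t+1} on features (x_t, x_t^2).

    Solves the 2x2 normal equations directly; the quadratic family
    contains the true map, so the fit recovers it almost exactly.
    """
    s11 = s12 = s22 = b1 = b2 = 0.0
    for x, y in zip(xs[:-1], xs[1:]):
        f1, f2 = x, x * x
        s11 += f1 * f1; s12 += f1 * f2; s22 += f2 * f2
        b1 += f1 * y; b2 += f2 * y
    det = s11 * s22 - s12 * s12
    a = (b1 * s22 - b2 * s12) / det
    b = (s11 * b2 - s12 * b1) / det
    return a, b

def predict_multi_step(a, b, x0, horizon):
    """Iterate the one-step model, feeding each prediction back in."""
    preds, x = [], x0
    for _ in range(horizon):
        x = a * x + b * x * x
        preds.append(x)
    return preds

xs = logistic_series()
a, b = fit_one_step(xs[:150])                      # train on the first 150 points
preds = predict_multi_step(a, b, xs[149], horizon=10)
errors = [abs(p - t) for p, t in zip(preds, xs[150:160])]
```

Because the fitted family here contains the true map, the iterated forecast stays accurate; with a real network trained only on one-step errors, the small per-step errors compound over the horizon, which is the weakness the paper's multi-step learning rule targets.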
Pages: 115-133 (19 pages)
Related papers (50 records)
  • [31] Multi-step time series forecasting on the temperature of lithium-ion batteries
    Wan, Zijing
    Kang, Yilin
    Ou, Renwei
    Xue, Song
    Xu, Dongwei
    Luo, Xiaobing
    JOURNAL OF ENERGY STORAGE, 2023, 64
  • [32] Multi-scale adaptive attention-based time-variant neural networks for multi-step time series forecasting
    Gao, Changxia
    Zhang, Ning
    Li, Youru
    Lin, Yan
    Wan, Huaiyu
    APPLIED INTELLIGENCE, 2023, 53 (23) : 28974 - 28993
  • [34] Spatiotemporal graph neural network for multivariate multi-step ahead time-series forecasting of sea temperature
    Kim, Jinah
    Kim, Taekyung
    Ryu, Joon-Gyu
    Kim, Jaeil
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 126
  • [35] Adaptive Particle Swarm Optimization Learning in a Time Delayed Recurrent Neural Network for Multi-Step Prediction
    Hatalis, Kostas
    Alnajjab, Basel
    Kishore, Shalinee
    Lamadrid, Alberto
    2014 IEEE SYMPOSIUM ON FOUNDATIONS OF COMPUTATIONAL INTELLIGENCE (FOCI), 2014, : 84 - 91
  • [37] A novel multi-step forecasting strategy for enhancing deep learning models' performance
    Livieris, Ioannis E.
    Pintelas, Panagiotis
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (22): : 19453 - 19470
  • [38] Multi-step forecasting of multivariate time series using multi-attention collaborative network
    He, Xiaoyu
    Shi, Suixiang
    Geng, Xiulin
    Yu, Jie
    Xu, Lingyu
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 211
  • [39] Recurrent neural networks that learn multi-step visual routines with reinforcement learning
    Mollard, Sami
    Wacongne, Catherine
    Bohte, Sander M.
    Roelfsema, Pieter R.
    PLOS COMPUTATIONAL BIOLOGY, 2024, 20 (04)
  • [40] Multi-step Forecasting via Multi-task Learning
    Jawed, Shayan
    Rashed, Ahmed
    Schmidt-Thieme, Lars
    2019 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2019, : 790 - 799