Nonlinear Bayesian Filters for Training Recurrent Neural Networks

Cited by: 0
Authors
Arasaratnam, Ienkaran [1 ]
Haykin, Simon [1 ]
Affiliations
[1] McMaster Univ, Dept Elect & Comp Engn, Cognit Syst Lab, Hamilton, ON L8S 4K1, Canada
Keywords
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
In this paper, we present nonlinear Bayesian filters for training recurrent neural networks, with a special emphasis on a novel, more accurate, derivative-free member of the approximate Bayesian filter family called the cubature Kalman filter. We discuss the theory of Bayesian filters, which is rooted in the state-space modeling of the dynamic system in question and the linear estimation principle. For improved numerical stability and optimal performance during training, a number of techniques for tuning Bayesian filters are suggested. We compare the predictive performance of various Bayesian filter-trained recurrent neural networks on a chaotic time series. From the empirical results, we conclude that performance may be greatly improved by the new square-root cubature Kalman filter.
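The abstract describes the cubature Kalman filter as a derivative-free approximate Bayesian filter built on state-space modeling. As background, the sketch below illustrates the core idea of the third-degree cubature rule's time update: propagate 2n deterministically chosen sigma-like points through the nonlinear state transition and recompute the mean and covariance. This is a minimal illustration of the general CKF prediction step, not the paper's RNN-training algorithm; the function names and the toy transition are our own.

```python
import numpy as np

def cubature_points(mean, cov):
    """Third-degree spherical-radial rule: the 2n points
    x_i = mean + sqrt(n) * S * (+/- e_i), where S S^T = cov."""
    n = mean.size
    S = np.linalg.cholesky(cov)
    unit = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])  # shape (n, 2n)
    return mean[:, None] + S @ unit                          # shape (n, 2n)

def ckf_predict(f, mean, cov, proc_noise):
    """CKF time update: push each cubature point through the (possibly
    nonlinear) transition f, then average to get the predicted moments."""
    pts = cubature_points(mean, cov)
    prop = np.apply_along_axis(f, 0, pts)          # f applied column-wise
    m_pred = prop.mean(axis=1)
    d = prop - m_pred[:, None]
    P_pred = d @ d.T / pts.shape[1] + proc_noise   # sample cov + Q
    return m_pred, P_pred

# Toy usage with a mildly nonlinear transition
f = lambda x: np.tanh(x)
m, P = ckf_predict(f, np.zeros(2), np.eye(2), 0.01 * np.eye(2))
```

Because the rule needs no Jacobians, it remains derivative-free; the square-root variant highlighted in the paper additionally propagates Cholesky factors directly for better numerical stability.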
Pages: 12-33
Page count: 22
Related Papers
50 records in total
  • [41] Modeling Geomagnetospheric Disturbances with Sequential Bayesian Recurrent Neural Networks
    Ouarbya, Lahcen
    Mirikitani, Derrick T.
    NEURAL INFORMATION PROCESSING, PT 1, PROCEEDINGS, 2009, 5863 : 91 - 99
  • [42] Scalable Bayesian Learning of Recurrent Neural Networks for Language Modeling
    Gan, Zhe
    Li, Chunyuan
    Chen, Changyou
    Pu, Yunchen
    Su, Qinliang
    Carin, Lawrence
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 321 - 331
  • [43] Multi-input single-output nonlinear adaptive digital filters using recurrent neural networks
    Lu, JM
    Lin, H
    Wang, XQ
    Yahagi, T
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2001, E84A (08) : 1942 - 1950
  • [44] Training recurrent neural networks with Leap-frog
    Holm, JEW
    Kotze, NJH
    IEEE INTERNATIONAL SYMPOSIUM ON INDUSTRIAL ELECTRONICS (ISIE 98) - PROCEEDINGS, VOLS 1 AND 2, 1998, : 99 - 104
  • [45] Training recurrent neural networks using a hybrid algorithm
    Ben Nasr, Mounir
    Chtourou, Mohamed
    NEURAL COMPUTING & APPLICATIONS, 2012, 21 (03): : 489 - 496
  • [46] Efficient and effective training of sparse recurrent neural networks
    Liu, Shiwei
    Ni’mah, Iftitahu
    Menkovski, Vlado
    Mocanu, Decebal Constantin
    Pechenizkiy, Mykola
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 : 9625 - 9636
  • [47] Optimal Training Sequences for Locally Recurrent Neural Networks
    Patan, Krzysztof
    Patan, Maciej
    ARTIFICIAL NEURAL NETWORKS - ICANN 2009, PT I, 2009, 5768 : 80 - 89
  • [49] AN AUGMENTED LAGRANGIAN METHOD FOR TRAINING RECURRENT NEURAL NETWORKS
    Wang, Yue
    Zhang, Chao
    Chen, Xiaojun
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2025, 47 (01): : C22 - C51
  • [50] TRAINING FULLY RECURRENT NEURAL NETWORKS WITH COMPLEX WEIGHTS
    KECHRIOTIS, G
    MANOLAKOS, ES
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-ANALOG AND DIGITAL SIGNAL PROCESSING, 1994, 41 (03): : 235 - 238