Convergence of an online gradient method for BP neural networks with stochastic inputs

Cited by: 0

Authors:
Li, ZX
Wu, W [1]
Feng, GR
Lu, HF
Affiliations:
[1] Dalian Univ Technol, Dept Appl Math, Dalian 116023, Peoples R China
[2] Dalian Maritime Univ, Dept Math, Dalian 116000, Peoples R China
[3] Shanghai Jiao Tong Univ, Dept Math, Shanghai 200000, Peoples R China
Source:
ADVANCES IN NATURAL COMPUTATION, PT 1, PROCEEDINGS, 2005, Vol. 3610
Keywords:
DOI: none available
Chinese Library Classification: TP18 [Artificial intelligence theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
An online gradient method for BP (backpropagation) neural networks is presented and discussed. The input training examples are permuted stochastically in each training cycle. Monotonicity of the error and a weak convergence result of deterministic nature are proved for the method.
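The training scheme described in the abstract can be illustrated with a minimal sketch: weights are updated immediately after each example (online mode), and the presentation order is re-drawn as a fresh random permutation every cycle. The network size, learning rate, and XOR data below are hypothetical illustration choices, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy task: XOR with a 2-4-1 sigmoid BP network (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])

n_hidden = 4
W1 = rng.normal(scale=0.5, size=(2, n_hidden))   # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=n_hidden)        # hidden -> output weights
b2 = 0.0
eta = 0.5                                        # learning rate

def total_error():
    """Squared error over the whole training set."""
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    return 0.5 * np.sum((Y - T) ** 2)

errors = [total_error()]
for cycle in range(5000):
    # Online mode with stochastic input order: a fresh permutation of the
    # training examples is drawn each cycle, and the weights are updated
    # right after every single example.
    for i in rng.permutation(len(X)):
        x, t = X[i], T[i]
        h = sigmoid(x @ W1 + b1)
        y = sigmoid(h @ W2 + b2)
        delta_out = (y - t) * y * (1.0 - y)          # output-layer delta
        delta_hid = delta_out * W2 * h * (1.0 - h)   # hidden-layer deltas
        W2 -= eta * delta_out * h
        b2 -= eta * delta_out
        W1 -= eta * np.outer(x, delta_hid)
        b1 -= eta * delta_hid
    errors.append(total_error())

print(errors[0], errors[-1])  # the total error should have decreased
```

The quantity `total_error()` tracked across cycles is the one whose monotone decrease (for a suitably small learning rate) is the kind of property the paper's convergence analysis concerns.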
Pages: 720-729 (10 pages)
Related papers (50 total):
  • [41] Convergence of Batch Gradient Method Based on the Entropy Error Function for Feedforward Neural Networks
    Yan Xiong
    Xin Tong
    Neural Processing Letters, 2020, 52 : 2687 - 2695
  • [42] Convergence of Batch Gradient Method Based on the Entropy Error Function for Feedforward Neural Networks
    Xiong, Yan
    Tong, Xin
    NEURAL PROCESSING LETTERS, 2020, 52 (03) : 2687 - 2695
  • [43] A Convergence Analysis of Gradient Descent on Graph Neural Networks
    Awasthi, Pranjal
    Das, Abhimanyu
    Gollapudi, Sreenivas
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [44] Convergence of gradient descent for learning linear neural networks
    Nguegnang, Gabin Maxime
    Rauhut, Holger
    Terstiege, Ulrich
    ADVANCES IN CONTINUOUS AND DISCRETE MODELS, 2024, 2024 (01):
  • [45] Boundedness and convergence of online gradient method with penalty and momentum
    Shao, Hongmei
    Zheng, Gaofeng
    NEUROCOMPUTING, 2011, 74 (05) : 765 - 770
  • [46] Convergence of Asynchronous Distributed Gradient Methods Over Stochastic Networks
    Xu, Jinming
    Zhu, Shanying
    Soh, Yeng Chai
    Xie, Lihua
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2018, 63 (02) : 434 - 448
  • [47] Convergence Stability of Spiking Neural Networks with Stochastic Fluctuations
    Zhao, Chenhui
    He, Shan
    Li, Lin
    Guo, Donghui
    PROCEEDINGS OF 2019 IEEE 13TH INTERNATIONAL CONFERENCE ON ANTI-COUNTERFEITING, SECURITY, AND IDENTIFICATION (IEEE-ASID'2019), 2019, : 163 - 167
  • [48] Study on improving the convergence speed of BP wavelet neural networks
    Li, Jinping
    He, Miao
    Liu, Mingjun
    Yang, Bo
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2002, 15 (01):
  • [49] ON CONVERGENCE OF THE STOCHASTIC SUBGRADIENT METHOD WITH ONLINE STEPSIZE RULES
    RUSZCZYNSKI, A
    SYSKI, W
    JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS, 1986, 114 (02) : 512 - 527
  • [50] Calibrated Stochastic Gradient Descent for Convolutional Neural Networks
    Zhuo, Li'an
    Zhang, Baochang
    Chen, Chen
    Ye, Qixiang
    Liu, Jianzhuang
    Doermann, David
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 9348 - 9355