A fast neural network learning with guaranteed convergence to zero system error

Cited by: 0
Authors
Ajimura, T
Yamada, I
Sakaniwa, K
Institutions
Keywords
neural network; local minimum; dimension expansion; zero system error;
DOI
Not available
CLC number
TP3 [Computing Technology, Computer Technology];
Discipline code
0812 ;
Abstract
It is generally thought that effective learning algorithms for neural networks, such as the back-propagation algorithm, have been established. However, two major issues remain to be solved. First, learning may become trapped at a local minimum. Second, the convergence rate is too slow. Chang and Ghaffar proposed to add a new hidden node whenever training stops at a local minimum, and then to retrain the new net until the error converges to zero. Their method designs the newly generated weights so that the new net, after introducing a new hidden node, has a smaller error than that at the original local minimum. In this paper, we propose a new method that improves their convergence rate. Our proposed method is expected to give a lower system error and a larger error gradient magnitude than their method at the starting point of the new net, which leads to a faster convergence rate. It is shown through numerical examples that the proposed method gives much better performance than the conventional method of Chang and Ghaffar.
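The dimension-expansion strategy described in the abstract can be illustrated with a minimal sketch: train a one-hidden-layer network by gradient descent, and whenever a training stage stalls above the target error, append a fresh hidden node and continue. Note this sketch initializes the new node's weights randomly; Chang and Ghaffar's actual method (and the improvement proposed in this paper) designs those weights analytically so the error strictly decreases, which is not reproduced here. All names and hyperparameters below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_with_expansion(X, T, hidden=2, lr=0.5, max_hidden=10,
                         epochs_per_stage=2000, tol=1e-3, seed=0):
    """Gradient-descent training of a 1-hidden-layer sigmoid net.
    When a stage ends with error still above `tol` (taken here as a
    proxy for being stuck at a local minimum), one hidden node is
    appended and training continues on the expanded net."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1));    b2 = np.zeros(1)
    while True:
        for _ in range(epochs_per_stage):
            H = sigmoid(X @ W1 + b1)          # hidden activations
            Y = sigmoid(H @ W2 + b2)          # network output
            E = Y - T
            err = 0.5 * np.sum(E ** 2)        # sum-of-squares system error
            if err < tol:
                return err, W1.shape[1]
            # standard back-propagation updates
            dY = E * Y * (1.0 - Y)
            dH = (dY @ W2.T) * H * (1.0 - H)
            W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
            W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)
        if W1.shape[1] >= max_hidden:
            return err, W1.shape[1]
        # expansion step: append one hidden node with small random
        # weights (the paper's method would design these weights so the
        # expanded net starts below the old local-minimum error)
        W1 = np.hstack([W1, rng.normal(scale=0.5, size=(n_in, 1))])
        b1 = np.append(b1, 0.0)
        W2 = np.vstack([W2, rng.normal(scale=0.1, size=(1, 1))])

# XOR: a classic task where small nets can stall at local minima
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])
err, n_hidden = train_with_expansion(X, T)
print(f"final error {err:.4f} with {n_hidden} hidden nodes")
```

The plateau test here is deliberately crude (a fixed epoch budget per stage); a closer match to the paper would monitor the error-gradient magnitude and expand only when it falls near zero while the error is still nonzero.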
Pages: 1433 - 1439
Page count: 7
Related papers
50 records in total
  • [21] Fast learning neural network with modified neurons
    Hwang, RC
    Chen, YJ
    Chuang, SJ
    Huang, HC
    Chang, WD
    THIRD INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY AND APPLICATIONS, VOL 1, PROCEEDINGS, 2005, : 313 - 318
  • [22] NEURAL NETWORK MODEL FOR FAST LEARNING AND RETRIEVAL
    ARSENAULT, HH
    MACUKOW, B
    OPTICAL ENGINEERING, 1989, 28 (05) : 506 - 512
  • [23] Hierarchical fast learning artificial neural network
    Ping, WL
    Phuan, ATL
    Jian, X
    Proceedings of the International Joint Conference on Neural Networks (IJCNN), Vols 1-5, 2005, : 3300 - 3305
  • [24] Fast and convergence-guaranteed algorithm for linear separation
    ZHANG David
    Science China (Information Sciences), 2010, 53 (04) : 729 - 737
  • [25] Fast and convergence-guaranteed algorithm for linear separation
    ZhiYong Liu
    David Zhang
    YuGang Li
    Science China Information Sciences, 2010, 53 : 729 - 737
  • [26] Fast and convergence-guaranteed algorithm for linear separation
    Liu ZhiYong
    Zhang, David
    Li YuGang
    SCIENCE CHINA-INFORMATION SCIENCES, 2010, 53 (04) : 729 - 737
  • [27] DYNAMIC SYSTEM-IDENTIFICATION BY NEURAL-NETWORK - A NEW, FAST LEARNING-METHOD BASED ON ERROR BACK-PROPAGATION
    PAL, C
    HAGIWARA, I
    KAYABA, N
    MORISHITA, S
    JOURNAL OF INTELLIGENT MATERIAL SYSTEMS AND STRUCTURES, 1994, 5 (01) : 127 - 135
  • [28] Designing an Adaptive Neural Network Controller for TORA System by using Feedback Error Learning
    Taheri, Alireza
    Tavan, Mehdi
    Teshnehlab, Mohammad
    2010 CHINESE CONTROL AND DECISION CONFERENCE, VOLS 1-5, 2010, : 2259 - +
  • [29] A fast learning algorithm of neural networks by changing error functions
    Jiang, MH
    Deng, BX
    Wang, B
    Bo, Z
    PROCEEDINGS OF 2003 INTERNATIONAL CONFERENCE ON NEURAL NETWORKS & SIGNAL PROCESSING, PROCEEDINGS, VOLS 1 AND 2, 2003, : 249 - 252
  • [30] Adaptive Fuzzy Control With Guaranteed Convergence of Optimal Approximation Error
    Pan, Yongping
    Er, Meng Joo
    Huang, Daoping
    Wang, Qinruo
    IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2011, 19 (05) : 807 - 818