A constructive algorithm for feedforward neural networks with incremental training

Cited by: 42
Authors
Liu, DR [1]
Chang, TS
Zhang, Y
Affiliations
[1] Univ Illinois, Dept Elect & Comp Engn, Chicago, IL 60607 USA
[2] Univ Calif Davis, Dept Elect & Comp Engn, Davis, CA 95616 USA
Funding
U.S. National Science Foundation;
Keywords
constructive algorithm; feedforward neural networks; incremental training; linear programming; quadratic programming;
DOI
10.1109/TCSI.2002.805733
Chinese Library Classification (CLC)
TM [Electrical technology]; TN [Electronic technology, communication technology];
Discipline codes
0808; 0809;
Abstract
We develop, in this brief, a new constructive learning algorithm for feedforward neural networks. We employ an incremental training procedure in which training patterns are learned one by one. Our algorithm starts with a single training pattern and a single hidden-layer neuron. During the course of neural network training, when the algorithm gets stuck in a local minimum, we attempt to escape from it by using a weight scaling technique. Only after several consecutive failed attempts to escape from a local minimum do we allow the network to grow by adding a hidden-layer neuron. At this stage, we employ an optimization procedure based on quadratic/linear programming to select initial weights for the newly added neuron. This optimization procedure tends to make the network reach the error tolerance with little or no further training after a hidden-layer neuron is added. Our simulation results indicate that the present constructive algorithm can obtain neural networks very close to minimal structures (with the least possible number of hidden-layer neurons) and that convergence (to a solution) in neural network training can be guaranteed. We tested our algorithm extensively using a widely used benchmark problem, i.e., the parity problem.
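To make the constructive strategy described in the abstract concrete, the following Python sketch outlines one possible reading of the training loop: start with a small network, try to escape a local minimum by weight scaling, and add a hidden neuron only after repeated failed escapes. This is a minimal illustration, not the authors' implementation: the gradient-descent trainer, the scaling factor, the tolerance and growth limits, and the random initialization of the added neuron (which the paper instead selects via quadratic/linear programming) are all assumptions, and the pattern-by-pattern incremental schedule is collapsed into batch training over the full parity set for brevity.

# Minimal sketch (not the authors' code): constructive training on the n-bit
# parity benchmark. The QP/LP initialization of the new neuron is replaced by
# a random initialization, and patterns are trained in batch rather than
# incrementally; both are simplifying assumptions.
import numpy as np

rng = np.random.default_rng(0)

def parity_data(n_bits=3):
    """All 2^n binary inputs paired with their parity bit."""
    X = np.array([[(i >> b) & 1 for b in range(n_bits)]
                  for i in range(2 ** n_bits)], dtype=float)
    return X, (X.sum(axis=1) % 2).reshape(-1, 1)

def init_params(n_in, n_hidden=1):
    return [rng.normal(0.0, 0.5, (n_in, n_hidden)),   # W1: input -> hidden
            np.zeros((1, n_hidden)),                  # b1
            rng.normal(0.0, 0.5, (n_hidden, 1)),      # W2: hidden -> output
            np.zeros((1, 1))]                         # b2

def forward(X, W1, b1, W2, b2):
    H = np.tanh(X @ W1 + b1)                          # hidden activations
    out = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))        # sigmoid output
    return H, out

def sse(out, y):
    return 0.5 * np.sum((out - y) ** 2)

def train(X, y, params, lr=0.5, epochs=500):
    """Plain batch gradient descent on squared error (stand-in trainer)."""
    W1, b1, W2, b2 = params
    for _ in range(epochs):
        H, out = forward(X, W1, b1, W2, b2)
        d2 = (out - y) * out * (1.0 - out)            # output-layer delta
        d1 = (d2 @ W2.T) * (1.0 - H ** 2)             # hidden-layer delta
        W2 -= lr * (H.T @ d2); b2 -= lr * d2.sum(0, keepdims=True)
        W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(0, keepdims=True)
    return [W1, b1, W2, b2]

def scale_weights(params, factor=0.7):
    """Placeholder for the paper's weight-scaling escape from local minima."""
    return [factor * p for p in params]

def add_neuron(params, n_in):
    """Grow the hidden layer by one unit (random init stands in for QP/LP)."""
    W1, b1, W2, b2 = params
    W1 = np.hstack([W1, rng.normal(0.0, 0.5, (n_in, 1))])
    b1 = np.hstack([b1, np.zeros((1, 1))])
    W2 = np.vstack([W2, rng.normal(0.0, 0.5, (1, 1))])
    return [W1, b1, W2, b2]

def constructive_train(n_bits=3, tol=0.05, max_escapes=3, max_hidden=16):
    X, y = parity_data(n_bits)
    params = init_params(n_bits)
    escapes = 0
    while True:
        params = train(X, y, params)
        err = sse(forward(X, *params)[1], y)
        if err < tol or params[0].shape[1] >= max_hidden:
            return params, err
        if escapes < max_escapes:                     # first try to escape
            params = scale_weights(params)
            escapes += 1
        else:                                         # grow only after repeated failures
            params = add_neuron(params, n_bits)
            escapes = 0

params, err = constructive_train()
print(f"hidden neurons: {params[0].shape[1]}, final SSE: {err:.4f}")

Under these assumptions the loop stops growing once the error tolerance is met, which mirrors the paper's goal of ending up near a minimal hidden-layer size; the actual brief reports that its QP/LP weight selection typically makes the grown network meet the tolerance with little or no further training.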
Pages: 1876 - 1879
Page count: 4
Related Papers
50 records in total
  • [31] Training feedforward neural networks: An algorithm giving improved generalization
    Lee, CW
    NEURAL NETWORKS, 1997, 10 (01) : 61 - 68
  • [32] AN IMPROVED GENETIC ALGORITHM FOR TRAINING LAYERED FEEDFORWARD NEURAL NETWORKS
    刘平
    程翼宇
    Journal of Zhejiang University Science, 2000, (03) : 85 - 89
  • [33] A hybrid linear/nonlinear training algorithm for feedforward neural networks
    McLoone, S
    Brown, MD
    Irwin, G
    Lightbody, G
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1998, 9 (04): : 669 - 684
  • [34] A Second Order Training Algorithm for Multilayer Feedforward Neural Networks
    谭营
    何振亚
    邓超
Journal of Southeast University (English Edition), 1997, (01) : 32 - 36
  • [35] ON TRAINING FEEDFORWARD NEURAL NETWORKS
    KAK, S
    PRAMANA-JOURNAL OF PHYSICS, 1993, 40 (01): : 35 - 42
  • [36] A new constructive algorithm for designing and training artificial neural networks
    Sattar, Md. Abdus
    Islam, Md. Monirul
    Murase, Kazuyuki
    NEURAL INFORMATION PROCESSING, PART I, 2008, 4984 : 317 - +
  • [37] An Efficient Hybrid Incremental Algorithm for Complex-Valued Feedforward Neural Networks
    Zhang, Shufang
    Huang, He
    Han, Ziyang
    2019 9TH INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE AND TECHNOLOGY (ICIST2019), 2019, : 327 - 332
  • [38] An Interpretable Constructive Algorithm for Incremental Random Weight Neural Networks and Its Application
    Nan, Jing
    Dai, Wei
    Yuan, Guan
    Zhou, Ping
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (12) : 13622 - 13632
  • [39] A constructive approach of modified standard backpropagation algorithm with optimum initialization for feedforward neural networks
    Gunaseeli, N.
    Karthikeyan, N.
    ICCIMA 2007: INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND MULTIMEDIA APPLICATIONS, VOL I, PROCEEDINGS, 2007, : 325 - 331
  • [40] Grafting constructive algorithm in feedforward neural network learning
    Zhang, Siyuan
    Xie, Linbo
    APPLIED INTELLIGENCE, 2023, 53 (10) : 11553 - 11570