Heterogeneous double populations based hybrid genetic algorithm design for training feedforward neural networks

Cited by: 0
Authors
Zhang, Li Feng [1 ]
He, Rong [1 ]
Yan, Meng Ling [1 ]
Affiliations
[1] Renmin Univ China, Sch Informat, Beijing 100872, Peoples R China
Keywords
feedforward neural network; training; permutation problem; genetic algorithm; least squares; evolution
DOI
None
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Code
0808; 0809
Abstract
Genetic algorithms (GAs) have been extensively applied to address the shortcomings of gradient-based learning methods in training feedforward neural networks (NNs). However, complicating properties of NN training, such as the context dependence between neurons and the permutation problem of the genetic representation, make conventional GAs difficult to implement efficiently. In the present study, a novel hybrid GA design is proposed to overcome these problems. First, to eliminate the context dependence, the new method uses a GA and a least squares estimator to separately optimize the neurons in the hidden and output layers. Second, to avoid the permutation problem entirely, the proposed GA employs two heterogeneous populations that evolve in tandem but separately learn the optimal combinations and the parameters of the hidden neurons. Finally, experimental studies show that, in comparison with five well-known conventional approaches, the new training method achieves much better approximation and generalization capabilities in nonlinear static and dynamic modeling, especially when the observed signals are corrupted by large measurement noise.
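The core hybrid idea in the abstract — evolve only the hidden-layer parameters with a GA while solving the output-layer weights in closed form by least squares — can be illustrated with a minimal single-population sketch. This is not the paper's dual-population design (the heterogeneous populations and the permutation-problem machinery are omitted); all hyperparameters, the tanh network, and the toy regression task below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (assumption: any smooth 1-D target would do).
X = np.linspace(-1, 1, 100).reshape(-1, 1)
y = np.sin(3 * X).ravel()

N_HIDDEN = 8        # hidden tanh neurons (assumed size)
POP_SIZE = 30       # GA population size
GENERATIONS = 40
MUT_SCALE = 0.1     # mutation noise scale

def hidden_output(w, X):
    """Hidden-layer activations; w packs input weights and biases."""
    W = w[:N_HIDDEN].reshape(1, N_HIDDEN)   # input -> hidden weights
    b = w[N_HIDDEN:]                        # hidden biases
    return np.tanh(X @ W + b)

def fitness(w):
    """MSE after solving the output layer by least squares.
    This is the hybrid step: the GA never evolves output weights."""
    H = hidden_output(w, X)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return np.mean((H @ beta - y) ** 2)

# Population of hidden-layer parameter vectors (weights + biases).
pop = rng.normal(size=(POP_SIZE, 2 * N_HIDDEN))

for gen in range(GENERATIONS):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[: POP_SIZE // 2]]   # truncation selection
    # Offspring: uniform crossover between random elite parents, then mutation.
    n_children = POP_SIZE - len(elite)
    parents = elite[rng.integers(len(elite), size=(n_children, 2))]
    mask = rng.random((n_children, 2 * N_HIDDEN)) < 0.5
    children = np.where(mask, parents[:, 0], parents[:, 1])
    children += rng.normal(scale=MUT_SCALE, size=children.shape)
    pop = np.vstack([elite, children])

best = pop[np.argmin([fitness(w) for w in pop])]
print(f"best training MSE: {fitness(best):.4f}")
```

Because the output layer is linear in its weights, solving it exactly inside the fitness evaluation removes that layer's parameters from the search space, which is the context-dependence decoupling the abstract describes.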
Pages: 8