Guided Convergence for Training Feed-forward Neural Network using Novel Gravitational Search Optimization

Cited: 0
Authors
Saha, Sankhadip [1 ]
Chakraborty, Dwaipayan [2 ]
Dutta, Oindrilla [1 ]
Affiliations
[1] Netaji Subhash Engn Coll, Dept Elect Engn, Kolkata, India
[2] Netaji Subhash Engn Coll, Dept Elect & Instru Engn, Kolkata, India
Keywords
Meta-heuristic; optimization; GSA; feed-forward neural network; local minima; ALGORITHM; BACKPROPAGATION;
DOI
Not available
CLC Number (Chinese Library Classification)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Code
0808; 0809
Abstract
Training of feed-forward neural networks using stochastic optimization techniques has recently gained importance in various pattern recognition and data mining applications because of its ability to escape local minima traps. However, such techniques may suffer from slow and poor convergence. This fact motivates us to apply a meta-heuristic optimization technique to train the neural network. In this respect, we focus on implementing the gravitational search algorithm (GSA), which is based on Newton's laws of motion and the interaction of masses, to train the network. GSA has a good ability to search for the global optimum, but it may suffer from slow searching speed in the final iterations. Our work is directed towards smart convergence by modifying the original GSA and also guiding the algorithm to make it immune to the local minima trap. Results on various benchmark datasets demonstrate the robustness of the modified algorithm.
Pages: 6
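
For reference, a minimal sketch of the standard GSA weight-update loop applied to training a small feed-forward network is given below. It follows the original GSA formulation (fitness-derived masses, an exponentially decaying gravitational constant, and a shrinking Kbest set of attracting agents); it does not reproduce the paper's guided/modified convergence scheme, and the network size, toy XOR data, and hyper-parameter values are illustrative assumptions.

# Minimal sketch: standard GSA training a 2-4-1 feed-forward network on XOR.
# The guided/modified GSA described in the paper is not reproduced here;
# all hyper-parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy data so the sketch is self-contained.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

N_HIDDEN = 4
DIM = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN * 1 + 1   # weights + biases of a 2-4-1 net

def unpack(w):
    """Split a flat weight vector into the network's weight matrices and biases."""
    i = 0
    W1 = w[i:i + 2 * N_HIDDEN].reshape(2, N_HIDDEN); i += 2 * N_HIDDEN
    b1 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    W2 = w[i:i + N_HIDDEN].reshape(N_HIDDEN, 1); i += N_HIDDEN
    b2 = w[i:i + 1]
    return W1, b1, W2, b2

def fitness(w):
    """Mean squared error of the feed-forward network encoded by w."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return np.mean((out - y) ** 2)

# GSA hyper-parameters (illustrative assumptions).
N_AGENTS, MAX_ITER, G0, ALPHA, EPS = 30, 300, 100.0, 20.0, 1e-12

pos = rng.uniform(-1.0, 1.0, size=(N_AGENTS, DIM))   # each agent = one weight vector
vel = np.zeros_like(pos)

for t in range(MAX_ITER):
    fit = np.array([fitness(p) for p in pos])
    best, worst = fit.min(), fit.max()

    # Gravitational masses derived from fitness (minimisation form).
    m = (fit - worst) / (best - worst + EPS)
    M = m / (m.sum() + EPS)

    # Gravitational constant decays exponentially over iterations.
    G = G0 * np.exp(-ALPHA * t / MAX_ITER)

    # Kbest: only the best agents attract, shrinking linearly from N to 1.
    kbest = int(N_AGENTS - (N_AGENTS - 1) * t / MAX_ITER)
    elite = np.argsort(fit)[:kbest]

    acc = np.zeros_like(pos)
    for i in range(N_AGENTS):
        for j in elite:
            if j == i:
                continue
            diff = pos[j] - pos[i]
            dist = np.linalg.norm(diff)
            # acceleration contribution: rand * G * M_j * (x_j - x_i) / (R_ij + eps)
            acc[i] += rng.random() * G * M[j] * diff / (dist + EPS)

    vel = rng.random(size=pos.shape) * vel + acc
    pos = pos + vel

best_w = pos[np.argmin([fitness(p) for p in pos])]
print("final MSE:", fitness(best_w))

Each agent encodes the full set of network weights and biases as a single flat vector, so the same loop can be reused for other layer sizes by changing only the unpack and fitness functions.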