Vortex search optimization algorithm for training of feed-forward neural network

Cited by: 0
Authors
Tahir Sağ
Zainab Abdullah Jalil Jalil
Affiliations
[1] Selcuk University, Department of Computer Engineering
Keywords
FNN; Classification; Optimization; Training neural-networks; Vortex search
DOI
Not available
Abstract
Training feed-forward neural networks (FNNs) is a challenging nonlinear task in supervised learning systems. Derivative-based learning methods are often inadequate for this training phase and incur high computational complexity because of the large number of weight values that must be tuned. In this study, neural-network training is treated as an optimization process, and the best values of the weights and biases in the FNN structure are determined by the Vortex Search (VS) algorithm. VS is a recently developed metaheuristic optimization method inspired by the vortex pattern of stirred fluids. It carries out the training task by finding the optimal weights and biases, which are collected in a single matrix. In this context, the proposed VS-based learning method for FNNs (VS-FNN) is used to analyze the effectiveness of the VS algorithm in FNN training for the first time in the literature. The proposed method is applied to six datasets: 3-bit XOR, Iris, Wine Recognition, Wisconsin Breast Cancer, Pima Indians Diabetes, and Thyroid Disease. Its performance is analyzed by comparison with training methods based on Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO), Simulated Annealing (SA), Genetic Algorithm (GA), and Stochastic Gradient Descent (SGD). The experimental results show that VS-FNN is generally competitive and frequently superior, indicating that it can serve as a capable tool for training neural networks.
Pages: 1517-1544 (27 pages)
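The abstract treats FNN training as a search over a single parameter vector that collects all weights and biases. As a rough illustration of that idea (not the authors' code), the Python sketch below flattens a small sigmoid network's parameters into one vector and tunes it with a heavily simplified Vortex-Search-style loop, sampling a Gaussian neighbourhood around a centre whose radius shrinks over the iterations, on the 3-bit XOR benchmark named in the abstract. The architecture, search bounds, candidate count, and linear radius schedule are illustrative assumptions; the original VS algorithm shrinks the radius with an inverse incomplete gamma function rather than the linear rule used here.

```python
import numpy as np

rng = np.random.default_rng(0)

# 3-bit XOR (parity) dataset, one of the benchmarks named in the abstract.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], dtype=float)
y = (X.sum(axis=1) % 2).reshape(-1, 1)

n_in, n_hid, n_out = 3, 6, 1                          # assumed architecture, not from the paper
dim = n_in * n_hid + n_hid + n_hid * n_out + n_out    # total number of weights and biases

def unpack(v):
    """Split a flat parameter vector into the FNN weight matrices and bias vectors."""
    i = 0
    W1 = v[i:i + n_in * n_hid].reshape(n_in, n_hid);   i += n_in * n_hid
    b1 = v[i:i + n_hid];                               i += n_hid
    W2 = v[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = v[i:i + n_out]
    return W1, b1, W2, b2

def mse(v):
    """Fitness: mean squared error of the FNN defined by parameter vector v."""
    W1, b1, W2, b2 = unpack(v)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))      # sigmoid hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output layer
    return float(np.mean((out - y) ** 2))

# Simplified Vortex-Search-style loop: sample candidates from a Gaussian around the
# current centre and shrink the radius each iteration (a rough stand-in for the
# inverse-incomplete-gamma schedule of the original VS algorithm).
lo, hi = -5.0, 5.0                   # assumed search bounds for every weight/bias
mu = np.full(dim, (lo + hi) / 2.0)   # initial centre of the vortex
best, best_fit = mu.copy(), mse(mu)

n_iter, n_cand = 300, 50
for t in range(n_iter):
    radius = (hi - lo) / 2.0 * (1.0 - t / n_iter)   # linearly shrinking radius (illustrative)
    cand = np.clip(rng.normal(mu, radius, size=(n_cand, dim)), lo, hi)
    fits = np.array([mse(c) for c in cand])
    k = fits.argmin()
    if fits[k] < best_fit:                          # keep the best solution found so far
        best, best_fit = cand[k].copy(), fits[k]
    mu = best                                       # centre the next vortex on it

print(f"final MSE on 3-bit XOR: {best_fit:.4f}")
```

In the paper's setting the same fitness idea (network error as the objective) would apply to the larger benchmark datasets, with VS's actual radius schedule and parameter settings in place of the simplifications above.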
Related Papers
50 records in total
  • [1] Vortex search optimization algorithm for training of feed-forward neural network
    Sag, Tahir
    Jalil, Zainab Abdullah Jalil
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2021, 12 (05) : 1517 - 1544
  • [2] A modified weighted chimp optimization algorithm for training feed-forward neural network
    Atta, Eman A.
    Ali, Ahmed F.
    Elshamy, Ahmed A.
    PLOS ONE, 2023, 18 (03)
  • [3] An ant colony optimization algorithm for continuous optimization: application to feed-forward neural network training
    Socha, Krzysztof
    Blum, Christian
    NEURAL COMPUTING & APPLICATIONS, 2007, 16 (03): 235 - 247
  • [4] Guided Convergence for Training Feed-forward Neural Network using Novel Gravitational Search Optimization
    Saha, Sankhadip
    Chakraborty, Dwaipayan
    Dutta, Oindrilla
    2014 INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING AND APPLICATIONS (ICHPCA), 2014,
  • [5] A Hybrid Particle Swarm Optimization for Feed-Forward Neural Network Training
    Niu, Ben
    Li, Li
    ADVANCED INTELLIGENT COMPUTING THEORIES AND APPLICATIONS, PROCEEDINGS: WITH ASPECTS OF ARTIFICIAL INTELLIGENCE, 2008, 5227 : 494 - 501
  • [6] Feed-Forward Neural Networks Training with Hybrid Taguchi Vortex Search Algorithm for Transmission Line Fault Classification
    Coban, Melih
    Tezcan, Suleyman Sungur
    MATHEMATICS, 2022, 10 (18)
  • [7] An improved butterfly optimization algorithm for training the feed-forward artificial neural networks
    Irmak, Busra
    Karakoyun, Murat
    Gulcu, Saban
    SOFT COMPUTING, 2023, 27 (07) : 3887 - 3905
  • [8] A Modified Invasive Weed Optimization Algorithm for Training of Feed-Forward Neural Networks
    Giri, Ritwik
    Chowdhury, Aritra
    Ghosh, Arnob
    Das, Swagatam
    Abraham, Ajith
    Snasel, Vaclav
    IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC 2010), 2010: 3166 - 3173