Vortex search optimization algorithm for training of feed-forward neural network

Cited by: 0
Authors
Tahir Sağ
Zainab Abdullah Jalil Jalil
Institutions
[1] Selcuk University, Department of Computer Engineering
Keywords
FNN; Classification; Optimization; Training neural-networks; Vortex search;
DOI: not available
Abstract
Training feed-forward neural networks (FNNs) is a challenging nonlinear task in supervised learning. Moreover, derivative-based learning methods are frequently inadequate for the training phase and incur high computational complexity owing to the large number of weight values that must be tuned. In this study, the training of neural networks is treated as an optimization process, and the best values of the weights and biases in the FNN structure are determined by the Vortex Search (VS) algorithm. VS is a recently developed metaheuristic optimization method inspired by the vortex pattern of stirred liquids. It performs the training task by finding the optimal weights and biases, which are encoded in a matrix. In this context, the proposed VS-based learning method for FNNs (VS-FNN) is used to analyze the effectiveness of the VS algorithm in FNN training for the first time in the literature. The proposed method is applied to six datasets: 3-bit XOR, Iris, Wine Recognition, Wisconsin Breast Cancer, Pima Indians Diabetes, and Thyroid Disease. Its performance is analyzed by comparison with training methods based on Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO), Simulated Annealing (SA), Genetic Algorithm (GA), and Stochastic Gradient Descent (SGD). The experimental results show that VS-FNN is generally superior or competitive, indicating that it can serve as a capable tool for training neural networks.
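The training scheme described in the abstract — flattening all FNN weights and biases into one vector and letting a vortex-style search minimize the network error — can be sketched as follows. This is a minimal illustration on the 3-bit XOR (parity) task, not the authors' implementation: the network sizes, bounds, candidate count, and the geometric radius decay are all assumptions (the original VS paper derives its radius schedule from the inverse incomplete gamma function).

```python
import numpy as np

rng = np.random.default_rng(0)

# 3-bit XOR (parity) task: all 8 binary inputs, target = parity of the bits.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], float)
y = (X.sum(axis=1) % 2).reshape(-1, 1)

# Hypothetical architecture: 3 inputs, 4 hidden units, 1 output.
n_in, n_hid, n_out = 3, 4, 1
dim = n_in * n_hid + n_hid + n_hid * n_out + n_out  # all weights + biases

def forward(theta):
    """Unpack the flat parameter vector and run the FNN on X."""
    i = 0
    W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid]; i += n_hid
    W2 = theta[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = theta[i:]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def mse(theta):
    return float(np.mean((forward(theta) - y) ** 2))

# Vortex-Search-style loop: Gaussian candidates around a moving centre,
# with a shrinking radius (simplified geometric decay).
lo, hi = -5.0, 5.0
mu = np.full(dim, (lo + hi) / 2.0)   # initial vortex centre
r = (hi - lo) / 2.0                  # initial radius
best, best_f = mu.copy(), mse(mu)
for t in range(300):
    cand = rng.normal(mu, r, size=(50, dim)).clip(lo, hi)
    f = np.array([mse(c) for c in cand])
    k = int(f.argmin())
    if f[k] < best_f:
        best, best_f = cand[k].copy(), f[k]
    mu = best    # centre moves to the best solution found so far
    r *= 0.97    # shrink the search radius each iteration

print(round(best_f, 4))
```

The same loop trains any architecture: only `dim`, `forward`, and the fitness function change, which is what makes metaheuristics such as VS, ABC, or PSO easy to swap in as FNN trainers.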
Pages: 1517 - 1544 (27 pages)
Related Papers
(50 items)
  • [21] Optimal Output Gain Algorithm for Feed-Forward Network Training
    Aswathappa, Babu Hemanth Kumar
    Manry, M. T.
    Rawat, Rohit
    2011 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2011, : 2609 - 2616
  • [22] AutoClustering: A Feed-Forward Neural Network Based Clustering Algorithm
    Kimura, Masaomi
    2018 18TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW), 2018, : 659 - 666
  • [23] Filtering Training Data When Training Feed-Forward Artificial Neural Network
    Moniz, Krishna
    Yuan, Yuyu
    TRUSTWORTHY COMPUTING AND SERVICES, 2014, 426 : 212 - 218
  • [24] On Performance of Marine Predators Algorithm in Training of Feed-Forward Neural Network for Identification of Nonlinear Systems
    Kaya, Ceren Bastemur
    SYMMETRY-BASEL, 2023, 15 (08):
  • [25] Salp Swarm Algorithm (SSA) for Training Feed-Forward Neural Networks
    Bairathi, Divya
    Gopalani, Dinesh
    SOFT COMPUTING FOR PROBLEM SOLVING, SOCPROS 2017, VOL 1, 2019, 816 : 521 - 534
  • [26] Dynamic group optimisation algorithm for training feed-forward neural networks
    Tang, Rui
    Fong, Simon
    Deb, Suash
    Vasilakos, Athanasios V.
    Millham, Richard C.
    NEUROCOMPUTING, 2018, 314 : 1 - 19
  • [27] Hybrid training of feed-forward neural networks with particle swarm optimization
    Carvalho, M.
    Ludermir, T. B.
    NEURAL INFORMATION PROCESSING, PT 2, PROCEEDINGS, 2006, 4233 : 1061 - 1070
  • [28] A hybrid blind signal separation algorithm: Particle swarm optimization on feed-forward neural network
    Liu, Chan-Cheng
    Sun, Tsung-Ying
    Hsieh, Sheng-Ta
    Lin, Chun-Ling
    Lee, Kan-Yuan
    NEURAL INFORMATION PROCESSING, PT 1, PROCEEDINGS, 2006, 4232 : 1078 - 1087
  • [29] Combining a gravitational search algorithm, particle swarm optimization, and fuzzy rules to improve the classification performance of a feed-forward neural network
    Huang, Mei-Ling
    Chou, Yueh-Ching
    COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE, 2019, 180
  • [30] A modified hidden weight optimization algorithm for feed-forward neural networks
    Yu, CH
    Manry, MT
    THIRTY-SIXTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS - CONFERENCE RECORD, VOLS 1 AND 2, CONFERENCE RECORD, 2002, : 1034 - 1038