An improved butterfly optimization algorithm for training the feed-forward artificial neural networks

Cited: 11
Authors
Irmak, Busra [1 ]
Karakoyun, Murat [1 ]
Gulcu, Saban [1 ]
Affiliation
[1] Necmettin Erbakan Univ, Dept Comp Engn, Konya, Turkey
Keywords
Artificial neural networks; Butterfly optimization algorithm; Chaos; Multilayer perceptron; Training artificial neural networks;
DOI
10.1007/s00500-022-07592-w
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification
081104; 0812; 0835; 1405;
Abstract
Artificial neural network (ANN), an information-processing technique developed by modeling the nervous system of the human brain, is one of the most powerful learning methods today. One of the factors that makes an ANN successful is its training algorithm. In this paper, an improved butterfly optimization algorithm (IBOA), based on the butterfly optimization algorithm, is proposed for training feed-forward artificial neural networks. The IBOA algorithm has a chaotic property that helps optimization algorithms explore the search space more dynamically and globally. In the experiments, ten chaotic maps were used. The success of the IBOA algorithm was tested on 13 benchmark functions that are well known to researchers in global optimization and are frequently used for testing and analyzing optimization algorithms. The tent-mapped IBOA outperformed the other algorithms on most of the benchmark functions. Moreover, the success of the IBOA-MLP algorithm was also tested on five classification datasets (XOR, balloon, iris, breast cancer, and heart), and IBOA-MLP was compared with four algorithms from the literature. According to the statistical performance metrics (sensitivity, specificity, precision, F1-score, and the Friedman test), IBOA-MLP outperformed the other algorithms and proved successful in training feed-forward artificial neural networks.
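The chaotic maps mentioned in the abstract are typically used to replace uniform random number draws in an optimizer's update equations. As a minimal sketch (not the authors' exact implementation), the tent map, which the abstract reports as the best-performing of the ten maps, generates a deterministic but chaotic sequence in [0, 1]:

```python
def tent_map(x):
    """One iteration of the tent map: x_{n+1} = 2x if x < 0.5, else 2(1 - x)."""
    return 2 * x if x < 0.5 else 2 * (1 - x)

def chaotic_sequence(x0, n):
    """Generate n chaotic values in [0, 1] from a seed x0 in (0, 1)."""
    seq = []
    x = x0
    for _ in range(n):
        x = tent_map(x)
        seq.append(x)
    return seq

# In a chaos-enhanced optimizer such as IBOA, values like these would
# stand in for the uniform random numbers drawn in the position-update
# step; the seed 0.37 here is an arbitrary illustrative choice.
values = chaotic_sequence(0.37, 10)
```

Because consecutive values are fully determined by the seed yet non-repeating over short horizons, such sequences can spread search agents over the space more evenly than independent uniform draws, which is the intuition behind chaos-enhanced metaheuristics.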
Pages: 3887-3905 (19 pages)
Related Papers (50 total)
  • [21] Dynamic group optimisation algorithm for training feed-forward neural networks
    Tang, Rui
    Fong, Simon
    Deb, Suash
    Vasilakos, Athanasios V.
    Millham, Richard C.
    NEUROCOMPUTING, 2018, 314 : 1 - 19
  • [22] Hybrid training of feed-forward neural networks with particle swarm optimization
    Carvalho, M.
    Ludermir, T. B.
    NEURAL INFORMATION PROCESSING, PT 2, PROCEEDINGS, 2006, 4233 : 1061 - 1070
  • [23] A modified hidden weight optimization algorithm for feed-forward neural networks
    Yu, CH
    Manry, MT
    THIRTY-SIXTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS - CONFERENCE RECORD, VOLS 1 AND 2, CONFERENCE RECORD, 2002, : 1034 - 1038
  • [24] Ear recognition with feed-forward artificial neural networks
    Sibai, Fadi N.
    Nuaimi, Amna
    Maamari, Amna
    Kuwair, Rasha
    NEURAL COMPUTING & APPLICATIONS, 2013, 23 (05): : 1265 - 1273
  • [26] Feed-forward artificial neural networks: Applications to spectroscopy
    Cirovic, DA
    TRAC-TRENDS IN ANALYTICAL CHEMISTRY, 1997, 16 (03) : 148 - 155
  • [27] A modified weighted chimp optimization algorithm for training feed-forward neural network
    Atta, Eman A.
    Ali, Ahmed F.
    Elshamy, Ahmed A.
    PLOS ONE, 2023, 18 (03):
  • [28] A new scheme for training feed-forward neural networks
    AbdelWahhab, O
    SidAhmed, MA
    PATTERN RECOGNITION, 1997, 30 (03) : 519 - 524
  • [29] Training of the feed forward artificial neural networks using dragonfly algorithm
    Gulcu, Saban
    APPLIED SOFT COMPUTING, 2022, 124
  • [30] Training Feed-Forward Multi-Layer Perceptron Artificial Neural Networks with a Tree-Seed Algorithm
    Cinar, Ahmet Cevahir
    ARABIAN JOURNAL FOR SCIENCE AND ENGINEERING, 2020, 45 (12) : 10915 - 10938