An improved butterfly optimization algorithm for training the feed-forward artificial neural networks

Cited by: 11
Authors
Irmak, Busra [1 ]
Karakoyun, Murat [1 ]
Gulcu, Saban [1 ]
Affiliations
[1] Necmettin Erbakan Univ, Dept Comp Engn, Konya, Turkey
Keywords
Artificial neural networks; Butterfly optimization algorithm; Chaos; Multilayer perceptron; Training artificial neural networks
DOI
10.1007/s00500-022-07592-w
Chinese Library Classification
TP18 [Theory of artificial intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Artificial neural network (ANN), an information processing technique developed by modeling the nervous system of the human brain, is one of the most powerful learning methods available today. One of the factors that make an ANN successful is its training algorithm. In this paper, an improved butterfly optimization algorithm (IBOA) based on the butterfly optimization algorithm is proposed for training feed-forward artificial neural networks. The IBOA incorporates chaotic maps, which help optimization algorithms explore the search space more dynamically and globally; ten chaotic maps were used in the experiments. The success of the IBOA was tested on 13 benchmark functions that are well known to researchers in global optimization and are frequently used for the testing and analysis of optimization algorithms. The Tent-mapped IBOA outperformed the other algorithms on most of the benchmark functions. Moreover, the success of the IBOA-MLP algorithm was tested on five classification datasets (XOR, balloon, iris, breast cancer, and heart), and the IBOA-MLP was compared with four algorithms from the literature. According to the statistical performance metrics (sensitivity, specificity, precision, F1-score, and the Friedman test), the IBOA-MLP outperformed the other algorithms and proved successful in training feed-forward artificial neural networks.
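As a rough illustration of the method the abstract describes, the sketch below substitutes a Tent-map chaotic sequence for the uniform random numbers in the standard butterfly optimization algorithm (BOA) of Arora and Singh, and uses the result to train a small one-hidden-layer MLP. This is a minimal reconstruction under stated assumptions, not the paper's implementation: the function names (tent_map, mlp_mse, iboa_sketch), the Tent-map form (peak at 0.7), the BOA constants (c = 0.01, a = 0.1, switch probability p = 0.8), the greedy replacement step, and the 2-2-1 network for XOR are all illustrative choices.

```python
import numpy as np

def tent_map(x):
    # One Tent-map form common in the chaos-optimization literature
    # (peak at 0.7). The paper evaluates ten chaotic maps; their exact
    # definitions may differ from this assumed one.
    return x / 0.7 if x < 0.7 else (10.0 / 3.0) * (1.0 - x)

def mlp_mse(w, X, y, n_hidden):
    # Decode a flat weight vector into a one-hidden-layer sigmoid MLP
    # and return its mean squared error on (X, y).
    n_in = X.shape[1]
    i = n_in * n_hidden
    W1, b1 = w[:i].reshape(n_in, n_hidden), w[i:i + n_hidden]
    W2, b2 = w[i + n_hidden:i + 2 * n_hidden].reshape(n_hidden, 1), w[-1]
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return float(np.mean((out.ravel() - y) ** 2))

def iboa_sketch(fitness, dim, n_pop=30, iters=1000, p=0.8, c=0.01, a=0.1, seed=1):
    # BOA global/local search in which a chaotic sequence stands in for
    # the uniform random draws -- the core idea stated in the abstract.
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-1.0, 1.0, (n_pop, dim))
    fit = np.array([fitness(x) for x in pop])
    best, best_fit = pop[fit.argmin()].copy(), fit.min()
    chaos = 0.6  # seed of the chaotic sequence (0.7 is degenerate for this map)
    for _ in range(iters):
        for i in range(n_pop):
            chaos = tent_map(chaos)
            if not 0.0 < chaos < 1.0:  # guard against fixed points at 0 and 1
                chaos = 0.6
            r = chaos                  # chaotic number replacing rand()
            frag = c * max(fit[i], 1e-12) ** a  # fragrance f = c * I^a
            if r < p:   # global phase: move toward the best butterfly
                cand = pop[i] + (r * r * best - pop[i]) * frag
            else:       # local phase: move relative to two random butterflies
                j, k = rng.integers(0, n_pop, 2)
                cand = pop[i] + (r * r * pop[j] - pop[k]) * frag
            f_new = fitness(cand)
            if f_new < fit[i]:         # greedy replacement (a common variant)
                pop[i], fit[i] = cand, f_new
                if f_new < best_fit:
                    best, best_fit = cand.copy(), f_new
    return best, best_fit

# Hypothetical usage on XOR, one of the five datasets named in the abstract.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
n_hidden = 2
dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1  # W1, b1, W2, b2
w, mse = iboa_sketch(lambda v: mlp_mse(v, X, y, n_hidden), dim)
print(f"best XOR MSE: {mse:.4f}")
```

Two simplifications worth flagging: the same chaotic draw drives both the global/local switch and the step size here, whereas BOA variants often use separate draws, and a single chaotic sequence is shared across the whole swarm; with the small fragrance values typical of BOA (c = 0.01), convergence on XOR is slow without parameter tuning.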
Pages: 3887-3905
Page count: 19
Related Papers
50 records in total
  • [31] Optimizing and Learning Algorithm for Feed-forward Neural Networks
    Bachiller, Pilar
    González, Julia
    Journal of Advanced Computational Intelligence and Intelligent Informatics, 2001, 5(01): 51-57
  • [32] Training Feed-Forward Multi-Layer Perceptron Artificial Neural Networks with a Tree-Seed Algorithm
    Cinar, Ahmet Cevahir
    Arabian Journal for Science and Engineering, 2020, 45: 10915-10938
  • [33] Feed-forward neural networks
    Bebis, George
    Georgiopoulos, Michael
    IEEE Potentials, 1994, 13(04): 27-31
  • [34] An ant colony optimization algorithm for continuous optimization: application to feed-forward neural network training
    Socha, Krzysztof
    Blum, Christian
    Neural Computing and Applications, 2007, 16(03): 235-247
  • [36] A new training algorithm for feed-forward neural networks with application to the XOR classification problem
    Yu, J.
    Xing, J.
    Xiao, D.
    Dynamics of Continuous, Discrete and Impulsive Systems, Series B: Applications & Algorithms, 2006, 13E: 1997-2000
  • [37] A Systematic Algorithm to Escape from Local Minima in Training Feed-Forward Neural Networks
    Cheung, Chi-Chung
    Xu, Sean Shensheng
    Ng, Sin-Chun
    2016 International Joint Conference on Neural Networks (IJCNN), 2016: 396-402
  • [38] Fractional activation functions in feed-forward artificial neural networks
    Ivanov, Alexander
    2018 20th International Symposium on Electrical Apparatus and Technologies (SIELA), 2018
  • [39] Multilayer feed-forward artificial neural networks for class modeling
    Marini, Federico
    Magrì, Antonio L.
    Bucci, Remo
    Chemometrics and Intelligent Laboratory Systems, 2007, 88(01): 118-124