Training Feed-Forward Multi-Layer Perceptron Artificial Neural Networks with a Tree-Seed Algorithm

Cited by: 0
|
Authors
Ahmet Cevahir Cinar
Affiliation
[1] Selcuk University, Department of Computer Engineering, Faculty of Technology
Keywords
Tree-seed algorithm; Multi-layer perceptron; Training neural network; Artificial neural network; Neural networks; Nature inspired algorithms;
DOI
Not available
Abstract
The artificial neural network (ANN) is among the most popular research areas in neural computing. A multi-layer perceptron (MLP) is an ANN with one or more hidden layers. Feed-forward (FF) ANNs are commonly used for classification and regression, and FF MLP ANNs are generally trained with the backpropagation (BP) algorithm. The main disadvantage of BP is its tendency to become trapped in local minima, whereas nature-inspired optimizers provide mechanisms for escaping them. The tree-seed algorithm (TSA) is an effective population-based swarm intelligence algorithm that mimics the relationship between trees and their seeds. Exploration and exploitation are balanced by the search tendency, a parameter peculiar to TSA. In this work, we train FF MLP ANNs with TSA for the first time. TSA is compared with particle swarm optimization, grey wolf optimizer, genetic algorithm, ant colony optimization, evolution strategy, population-based incremental learning, artificial bee colony, and biogeography-based optimization. The experimental results show that TSA achieves the best mean classification rates, outperforming its opponents on 18 problems.
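The record does not include the paper's implementation, but the seed-production rule that the abstract alludes to (the search tendency choosing between moving toward the best tree and perturbing with a random tree) can be sketched as follows. This is a minimal, illustrative reading of the standard TSA update on a stand-in sphere objective; all function and parameter names (`tsa_minimize`, `st`, `n_trees`, etc.) are assumptions, not the author's code.

```python
import random

def sphere(x):
    """Stand-in objective: sum of squares, minimized (value 0) at the origin."""
    return sum(v * v for v in x)

def tsa_minimize(f, dim=5, n_trees=10, st=0.1, low=-5.0, high=5.0,
                 iters=200, seed=42):
    """Minimal tree-seed algorithm sketch.

    Each tree produces a few seeds; every seed dimension is built either
    from the best tree so far (exploitation) or from a random other tree
    (exploration), chosen with probability `st` (the search tendency).
    """
    rng = random.Random(seed)
    trees = [[rng.uniform(low, high) for _ in range(dim)] for _ in range(n_trees)]
    fits = [f(t) for t in trees]
    best_i = min(range(n_trees), key=lambda i: fits[i])
    best_tree, best_fit = trees[best_i][:], fits[best_i]

    for _ in range(iters):
        for i in range(n_trees):
            # Number of seeds per tree: roughly 10%-25% of the population size.
            n_seeds = rng.randint(max(1, n_trees // 10), max(2, n_trees // 4))
            for _ in range(n_seeds):
                r = rng.randrange(n_trees)
                while r == i:                      # pick a *different* random tree
                    r = rng.randrange(n_trees)
                seed_vec = []
                for j in range(dim):
                    alpha = rng.uniform(-1.0, 1.0)  # random scaling factor
                    if rng.random() < st:
                        # Exploitation: step relative to the best tree.
                        v = trees[i][j] + alpha * (best_tree[j] - trees[r][j])
                    else:
                        # Exploration: step relative to a random tree.
                        v = trees[i][j] + alpha * (trees[i][j] - trees[r][j])
                    seed_vec.append(min(high, max(low, v)))  # clamp to bounds
                sf = f(seed_vec)
                if sf < fits[i]:                   # greedy replacement of the tree
                    trees[i], fits[i] = seed_vec, sf
                    if sf < best_fit:
                        best_tree, best_fit = seed_vec[:], sf
    return best_tree, best_fit

best_x, best_val = tsa_minimize(sphere)
```

For training an FF MLP as in the paper, the decision vector would hold the network's weights and biases, and `f` would be the training error rather than `sphere`.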
Pages: 10915 / 10938
Page count: 23
Related Papers
50 records
  • [21] Weighting adaptive control of Wiener model based on multi-layer feed-forward neural networks
    Wang, XJ
    Wu, LJ
    Li, XH
    Chen, XB
    PROCEEDINGS OF THE 4TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-4, 2002, : 2987 - 2990
  • [22] Real power transfer capability calculations using multi-layer feed-forward neural networks
    Luo, X
    Patton, AD
    Singh, C
    IEEE TRANSACTIONS ON POWER SYSTEMS, 2000, 15 (02) : 903 - 908
  • [23] Efficient Mixed-Signal Synapse Multipliers for Multi-Layer Feed-Forward Neural Networks
    Youssefi, Bahar
    Mirhassani, Mitra
    Wu, Jonathan
    2016 IEEE 59TH INTERNATIONAL MIDWEST SYMPOSIUM ON CIRCUITS AND SYSTEMS (MWSCAS), 2016, : 814 - 817
  • [24] A New Adaptive Learning algorithm to train Feed-Forward Multi-layer Neural Networks, Applied on Function Approximation Problem
    Ghorrati, Zahra
    2020 FOURTH IEEE INTERNATIONAL CONFERENCE ON ROBOTIC COMPUTING (IRC 2020), 2020, : 501 - 505
  • [25] Categorization and effective perceptron learning in feed-forward neural networks
    Waelbroeck, H
    Zertuche, F
    JOURNAL OF PHYSICS A-MATHEMATICAL AND GENERAL, 2000, 33 (33): : 5809 - 5818
  • [26] Comparison of Multi-Layer Perceptron and Cascade Feed-Forward Neural Network for Head-Related Transfer Function Interpolation
    Tamulionis, Mantas
    Serackis, Arturas
    2019 OPEN CONFERENCE OF ELECTRICAL, ELECTRONIC AND INFORMATION SCIENCES (ESTREAM), 2019,
  • [27] Salp Swarm Algorithm (SSA) for Training Feed-Forward Neural Networks
    Bairathi, Divya
    Gopalani, Dinesh
    SOFT COMPUTING FOR PROBLEM SOLVING, SOCPROS 2017, VOL 1, 2019, 816 : 521 - 534
  • [28] Dynamic group optimisation algorithm for training feed-forward neural networks
    Tang, Rui
    Fong, Simon
    Deb, Suash
    Vasilakos, Athanasios V.
    Millham, Richard C.
    NEUROCOMPUTING, 2018, 314 : 1 - 19
  • [29] Statistical modelling of artificial neural networks using the multi-layer perceptron
    Aitkin, M
    Foxall, R
    STATISTICS AND COMPUTING, 2003, 13 (03) : 227 - 239