A Novel Kernel-based Extreme Learning Machine with Incremental Hidden Layer Nodes

Cited by: 3
Authors
Min, Mengcan [1 ]
Chen, Xiaofang [1 ]
Lei, Yongxiang [1 ]
Chen, Zhiwen [1 ]
Xie, Yongfang [1 ]
Affiliations
[1] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
Source
IFAC PAPERSONLINE | 2020, Vol. 53, Issue 02
Funding
National Natural Science Foundation of China;
Keywords
ELM; I-ELM; Kernel function; SD classification;
DOI
10.1016/j.ifacol.2020.12.695
CLC Classification
TP [Automation technology, computer technology];
Subject Classification Code
0812;
Abstract
Extreme learning machine (ELM) is widely used in many fields because of advantages such as short training time and good generalization performance. In a traditional ELM, the input weights and hidden-layer biases are generated randomly, and the number of hidden-layer nodes is set by human experience, so an appropriate network structure can only be found by manual parameter tuning. This training procedure is complex and time-consuming, and it increases the practitioner's workload. To address this problem, this paper uses the incremental extreme learning machine (I-ELM) to determine a suitable number of hidden-layer nodes and construct a compact network structure. In addition, a new hidden-layer activation function, STR, is proposed, which avoids the incomplete hidden-layer output information caused by an uneven distribution of the sample data. The proposed algorithm is evaluated on public data sets and applied to the classification of superheat degree (SD) in the aluminum electrolysis industry. Experimental results show that the STR activation function learns quickly, and that the proposed algorithm outperforms existing SD identification algorithms in accuracy and robustness. Copyright (C) 2020 The Authors.
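The incremental construction the abstract refers to can be sketched as follows. This is not the authors' code, and the proposed STR activation is not reproduced here (`np.tanh` stands in as a placeholder); the sketch follows the standard I-ELM scheme, in which hidden nodes with random parameters are added one at a time and each new node's output weight has the closed form beta = <e, h> / <h, h>, where e is the current residual error and h is the node's output vector.

```python
import numpy as np

def i_elm(X, y, max_nodes=50, tol=1e-3, activation=np.tanh, seed=0):
    """Incremental ELM sketch: grow hidden nodes one at a time until the
    residual error is small enough or max_nodes is reached."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    e = y.astype(float).copy()            # residual error, initialized to the target
    W, b, beta = [], [], []
    for _ in range(max_nodes):
        w_k = rng.uniform(-1.0, 1.0, d)   # random input weights of the new node
        b_k = rng.uniform(-1.0, 1.0)      # random bias of the new node
        h = activation(X @ w_k + b_k)     # new node's output on all samples
        beta_k = (e @ h) / (h @ h)        # closed-form output weight for this node
        e = e - beta_k * h                # update the residual error
        W.append(w_k); b.append(b_k); beta.append(beta_k)
        if np.linalg.norm(e) < tol:       # stop once the fit is good enough
            break
    return np.array(W), np.array(b), np.array(beta)

def i_elm_predict(X, W, b, beta, activation=np.tanh):
    """Sum of all hidden-node outputs weighted by their output weights."""
    return activation(X @ W.T + b) @ beta
```

Because each beta_k is the least-squares projection coefficient of the residual onto the new node's output, the training error is non-increasing as nodes are added, which is what lets I-ELM grow a compact network without manual tuning of the hidden-layer size.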
Pages: 11836-11841
Page count: 6
Related Papers
50 records in total
  • [41] Application of Singular Spectrum Analysis and Kernel-based Extreme Learning Machine for Stock Price Prediction
    Suksiri, Preuk
    Chiewchanwattana, Sirapat
    Sunat, Khamron
    2016 13TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER SCIENCE AND SOFTWARE ENGINEERING (JCSSE), 2016, : 206 - 211
  • [42] Kernel-Based Machine Learning with Multiple Sources of Information
    Kloft, Marius
    IT-INFORMATION TECHNOLOGY, 2013, 55 (02): : 76 - 80
  • [43] Regularization incremental extreme learning machine with random reduced kernel for regression
    Zhou, Zhiyu
    Chen, Ji
    Zhu, Zefei
    NEUROCOMPUTING, 2018, 321 : 72 - 81
  • [44] Integrated Optimization Method of Hidden Parameters in Incremental Extreme Learning Machine
    Zhang, Siyuan
    Xie, Linbo
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [45] Method of pruning the hidden layer of the extreme learning machine based on correlation coefficient
    de Campos Souza, Paulo Vitor
    Araujo, Vanessa Souza
    Guimaraes, Augusto Junio
    Silva Araujo, Vincius Jonathan
    Rezende, Thiago Silva
    2018 IEEE LATIN AMERICAN CONFERENCE ON COMPUTATIONAL INTELLIGENCE (LA-CCI), 2018,
  • [46] An improved incremental constructive single-hidden-layer feedforward networks for extreme learning machine based on particle swarm optimization
    Han, Fei
    Zhao, Min-Ru
    Zhang, Jian-Ming
    Ling, Qing-Hua
    NEUROCOMPUTING, 2017, 228 : 133 - 142
  • [47] Universal Approximation of Extreme Learning Machine with Adaptive Growth of Hidden Nodes
    Zhang, Rui
    Lan, Yuan
    Huang, Guang-Bin
    Xu, Zong-Ben
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2012, 23 (02) : 365 - 371
  • [48] Deformed Kernel Based Extreme Learning Machine
    Chen, Zhang
    Xiong, Xia Shi
    Bing, Liu
    JOURNAL OF COMPUTERS, 2013, 8 (06) : 1602 - 1609
  • [49] Exploring mutual information-based sentimental analysis with kernel-based extreme learning machine for stock prediction
    Wang, Feng
    Zhang, Yongquan
    Rao, Qi
    Li, Kangshun
    Zhang, Hao
    SOFT COMPUTING, 2017, 21 (12) : 3193 - 3205