Progressive Ensemble Kernel-Based Broad Learning System for Noisy Data Classification

Cited: 24
Authors
Yu, Zhiwen [1 ]
Lan, Kankan [1 ]
Liu, Zhulin [1 ]
Han, Guoqiang [1 ]
Institution
[1] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510640, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China
Keywords
Kernel; Learning systems; Noise measurement; Feature extraction; Training; Biological neural networks; Uncertainty; Broad learning system (BLS); ensemble learning; kernel learning; noisy data; RIDGE-REGRESSION; NEURAL-NETWORK; MACHINE; MODEL; REPRESENTATIONS; APPROXIMATION; RECOGNITION; CLASSIFIERS; SELECTION;
DOI
10.1109/TCYB.2021.3064821
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
The broad learning system (BLS) is an algorithm that facilitates feature representation learning and data classification. Although the weights of BLS are obtained by analytical computation, which brings better generalization and higher efficiency, BLS suffers from two drawbacks: 1) its performance depends on the number of hidden nodes, which requires manual tuning, and 2) the double random mappings introduce uncertainty, which leads to poor resistance to noisy data as well as unpredictable effects on performance. To address these issues, a kernel-based BLS (KBLS) method is proposed that projects the feature nodes obtained from the first random mapping into a kernel space. This reduces the uncertainty, which contributes to performance improvements with a fixed number of hidden nodes and means manual tuning is no longer needed. Moreover, to further improve the stability and noise resistance of KBLS, a progressive ensemble framework is proposed in which the residual of the previous base classifiers is used to train the following base classifier. We conduct comparative experiments against existing state-of-the-art hierarchical learning methods on multiple noisy real-world datasets. The experimental results indicate that our approaches achieve the best or at least comparable performance in terms of accuracy.
Pages: 9656-9669
Number of pages: 14
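To make the two ideas in the abstract concrete, below is a minimal Python sketch, not the authors' implementation: randomly mapped feature nodes are projected into a kernel space, the output weights are solved analytically in ridge-regression style, and base classifiers are added progressively, each one fitted to the residual left by the ensemble built so far. All names and parameters here (random_feature_nodes, KernelBLS, ProgressiveEnsemble, gamma, reg, n_nodes) are illustrative assumptions.

```python
# Minimal sketch of a kernel-based BLS-style classifier with a progressive
# (residual-fitting) ensemble. Hypothetical names and hyperparameters.
import numpy as np

def random_feature_nodes(X, n_nodes=100, seed=0):
    """First (random) mapping: a random linear map followed by tanh."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_nodes))
    b = rng.standard_normal(n_nodes)
    return np.tanh(X @ W + b)

def rbf_kernel(A, B, gamma=0.1):
    """RBF kernel between the rows of A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

class KernelBLS:
    """Base classifier: feature nodes -> kernel space -> analytic ridge solution."""
    def __init__(self, n_nodes=100, gamma=0.1, reg=1e-2, seed=0):
        self.n_nodes, self.gamma, self.reg, self.seed = n_nodes, gamma, reg, seed

    def fit(self, X, Y):
        self.X_nodes_ = random_feature_nodes(X, self.n_nodes, self.seed)
        K = rbf_kernel(self.X_nodes_, self.X_nodes_, self.gamma)
        # Ridge-regression-style analytical solution in the kernel space.
        self.alpha_ = np.linalg.solve(K + self.reg * np.eye(len(K)), Y)
        return self

    def predict(self, X):
        # Same seed regenerates the same random mapping used at training time.
        nodes = random_feature_nodes(X, self.n_nodes, self.seed)
        return rbf_kernel(nodes, self.X_nodes_, self.gamma) @ self.alpha_

class ProgressiveEnsemble:
    """Each new base classifier is trained on the residual of the ensemble so far."""
    def __init__(self, n_learners=5, **kw):
        self.n_learners, self.kw = n_learners, kw

    def fit(self, X, Y):
        residual, self.learners_ = Y.astype(float), []
        for i in range(self.n_learners):
            learner = KernelBLS(seed=i, **self.kw).fit(X, residual)
            residual = residual - learner.predict(X)
            self.learners_.append(learner)
        return self

    def predict(self, X):
        return sum(l.predict(X) for l in self.learners_)

# Usage: one-hot labels; predicted class = argmax of the summed outputs.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    y = (X[:, 0] + 0.3 * rng.standard_normal(200) > 0).astype(int)  # noisy labels
    Y = np.eye(2)[y]                                                # one-hot encoding
    model = ProgressiveEnsemble(n_learners=3, n_nodes=50).fit(X, Y)
    print("training accuracy:", (model.predict(X).argmax(1) == y).mean())
```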
Related Papers
50 records in total
  • [1] Distributed Text Classification With an Ensemble Kernel-Based Learning Approach
    Silva, Catarina
    Lotric, Uros
    Ribeiro, Bernardete
    Dobnikar, Andrej
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART C-APPLICATIONS AND REVIEWS, 2010, 40 (3): 287-297
  • [2] Ensemble Kernel-Based Broad Learning System For Fast Gas Recognition in Electronic Nose Systems
    Li, Wang
    Zhao, LinJu
    Wang, Shun
    THIRD INTERNATIONAL CONFERENCE ON SENSORS AND INFORMATION TECHNOLOGY, ICSI 2023, 2023, 12699
  • [3] GENERALIZED OPTIMAL KERNEL-BASED ENSEMBLE LEARNING FOR HYPERSPECTRAL CLASSIFICATION PROBLEMS
    Gurram, Prudhvi
    Kwon, Heesung
    2011 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), 2011: 4431-4434
  • [4] Kernel-based distance metric learning for microarray data classification
    Xiong, Huilin
    Chen, Xue-wen
    BMC BIOINFORMATICS, 2006, 7 (1)
  • [5] Sparse Kernel-Based Ensemble Learning With Fully Optimized Kernel Parameters for Hyperspectral Classification Problems
    Gurram, Prudhvi
    Kwon, Heesung
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2013, 51 (2): 787-802
  • [6] Kernel-Based Ensemble Learning in Python
    Guedj, Benjamin
    Desikan, Bhargav Srinivasa
    INFORMATION, 2020, 11 (02)
  • [7] Learning Rates of Kernel-Based Robust Classification
    Wang, Shuhua
    Sheng, Baohuai
    ACTA MATHEMATICA SCIENTIA, 2022, 42 (3): 1173-1190
  • [8] Kernel-based linear classification on categorical data
    Chen, Lifei
    Ye, Yanfang
    Guo, Gongde
    Zhu, Jianping
    SOFT COMPUTING, 2016, 20 (8): 2981-2993