A Construction of Robust Representations for Small Data Sets Using Broad Learning System

Cited by: 17
Authors
Tang, Huimin [1 ,2 ]
Dong, Peiwu [1 ]
Shi, Yong [2 ,3 ,4 ,5 ]
Affiliations
[1] Beijing Inst Technol, Sch Management & Econ, Beijing 100081, Peoples R China
[2] Univ Nebraska, Coll Informat Sci & Technol, Omaha, NE 68182 USA
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
[5] Southwest Minzu Univ, Coll Elect & Informat Engn, Chengdu 610041, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Machine learning; Zinc; Neural networks; Learning systems; Task analysis; Data models; Broad learning system (BLS); feature extraction; label-based autoencoder (LA); random LA (RLA); robust representation; FEATURE-SELECTION; MACHINE; IDENTIFICATION; APPROXIMATION; AUTOENCODER; PCA;
DOI
10.1109/TSMC.2019.2957818
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Feature processing is an important step in modeling and can improve the accuracy of machine learning models. Feature extraction methods can effectively extract features from high-dimensional data sets and enhance the accuracy of downstream tasks. However, their performance is not stable on low-dimensional data sets. This article extends the broad learning system (BLS) into a framework for constructing robust representations on low-dimensional and small data sets. First, the BLS is changed from a supervised prediction method into an ensemble feature extraction method. Second, feature extraction methods, instead of random mappings, are used to generate the mapped features. Third, deep representations, called enhancement features, are learned from the ensemble of mapped features. Fourth, the data used to generate the mapped features and enhancement features can be randomly selected. The ensemble of mapped features and enhancement features provides robust representations that enhance the performance of downstream tasks. A label-based autoencoder (LA) is embedded in the BLS framework as an example to show the effectiveness of the framework, and a random LA (RLA) is presented to generate more diverse features. The experimental results show that the BLS framework can construct robust representations and significantly improve the performance of machine learning models.
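To make the pipeline described in the abstract concrete, below is a minimal sketch of a BLS-style robust representation construction: groups of mapped features produced by a feature extraction method fitted on random subsamples, an enhancement layer learned from the ensemble of mapped features, and the concatenation of both as the final representation. This is not the authors' implementation; PCA stands in for the label-based autoencoder (LA/RLA), the enhancement layer uses a simple random nonlinear projection, and all parameter values are illustrative assumptions.

```python
# Minimal sketch (assumptions: PCA replaces the LA/RLA, enhancement layer
# is a random tanh projection, subsampling ratio and sizes are arbitrary).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

def mapped_features(X, n_groups=3, n_components=2):
    """Generate groups of mapped features with a feature extraction
    method (here PCA) fitted on random subsamples of the data."""
    groups = []
    for _ in range(n_groups):
        idx = rng.choice(len(X), size=int(0.8 * len(X)), replace=False)
        pca = PCA(n_components=n_components).fit(X[idx])
        groups.append(pca.transform(X))
    return np.hstack(groups)

def enhancement_features(Z, n_enhance=10):
    """Learn 'enhancement' features from the ensemble of mapped features
    via a random nonlinear transformation, as in a standard BLS
    enhancement layer."""
    W = rng.standard_normal((Z.shape[1], n_enhance))
    b = rng.standard_normal(n_enhance)
    return np.tanh(Z @ W + b)

def robust_representation(X):
    """Concatenate mapped and enhancement features to form the
    representation fed to a downstream model."""
    Z = mapped_features(X)
    H = enhancement_features(Z)
    return np.hstack([Z, H])

# Usage: X is a small, low-dimensional data set (n_samples x n_features).
X = rng.standard_normal((50, 5))
R = robust_representation(X)   # shape: (50, 3 * 2 + 10)
```

In the paper's framework the PCA step would be replaced by the label-based autoencoder, and randomness enters through both the data subsampling and the RLA, which is what yields the more diverse mapped features mentioned in the abstract.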
Pages: 6074-6084
Number of pages: 11
Related Papers
50 records total
  • [1] Robust Broad Learning System for Uncertain Data Modeling
    Jin, Junwei
    Chen, C. L. Philip
    Li, Yanting
    2018 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2018, : 3524 - 3529
  • [2] Tree Broad Learning System for Small Data Modeling
    Xia, Heng
    Tang, Jian
    Yu, Wen
    Qiao, Junfei
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (07) : 8909 - 8923
  • [3] Regularized robust Broad Learning System for uncertain data modeling
    Jin, Jun-Wei
    Chen, C. L. Philip
    NEUROCOMPUTING, 2018, 322 : 58 - 69
  • [4] A Novel Robust Stacked Broad Learning System for Noisy Data Regression
    Zheng, Kai
    Liu, Jie
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (02) : 492 - 498
  • [5] Robust Incremental Broad Learning System for Data Streams of Uncertain Scale
    Zhong, Linjun
    Chen, C. L. Philip
    Guo, Jifeng
    Zhang, Tong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, : 1 - 14
  • [6] Frankenstein: Learning Deep Face Representations Using Small Data
    Hu, Guosheng
    Peng, Xiaojiang
    Yang, Yongxin
    Hospedales, Timothy M.
    Verbeek, Jakob
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (01) : 293 - 303
  • [7] Towards robust and generalizable representations of extracellular data using contrastive learning
    Vishnubhotla, Ankit
    Loh, Charlotte
    Paninski, Liam
    Srivastava, Akash
    Hurwitz, Cole
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [8] Intention Understanding in Small Training Data Sets by Using Transfer Learning
    Joko, Hideaki
    Ucihde, Hayato
    Koji, Yusuke
    Otsuka, Takahiro
    2018 ELEVENTH INTERNATIONAL CONFERENCE ON MOBILE COMPUTING AND UBIQUITOUS NETWORK (ICMU 2018), 2018,
  • [9] Extracting and Composing Robust Features With Broad Learning System
    Yang, Kaixiang
    Liu, Yuchen
    Yu, Zhiwen
    Chen, C. L. Philip
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (04) : 3885 - 3896
  • [10] Online robust echo state broad learning system
    Guo, Yu
    Yang, Xiaoxiao
    Wang, Yinuo
    Wang, Fei
    Chen, Badong
    NEUROCOMPUTING, 2021, 464 : 438 - 449