Unsupervised Pre-training Classifier Based on Restricted Boltzmann Machine with Imbalanced Data

Cited: 0
Authors
Fu, Xiaoyang [1 ]
Affiliation
[1] Jilin Univ, Zhuhai Coll, Minist Educ, Dept Comp Sci & Technol,Zhuhai Key Lab Symbol Com, Zhuhai 519041, Peoples R China
Source
SMART COMPUTING AND COMMUNICATION, SMARTCOM 2016 | 2017, Vol. 10135
Keywords
Semi-supervised learning; Classification; Deep learning; Restricted Boltzmann machine; Deep neural network
DOI
10.1007/978-3-319-52015-5_11
CLC number
TP [automation technology, computer technology]
Discipline code
0812
Abstract
Many learning algorithms suffer a performance bias when classifying imbalanced data. This paper proposes pre-training a deep neural network with the restricted Boltzmann machine (RBM) learning algorithm on data pre-sampled with the standard SMOTE method for imbalanced-data classification. First, a new training set is generated from the original examples by a pre-sampling method; second, the deep neural network is trained on the sampled data and all unlabelled data by the greedy layer-wise RBM algorithm, a step called "coarse tuning". The network is then fine-tuned with the back-propagation (BP) algorithm. The effectiveness of the RBM pre-training neural network (RBMPT) classifier is demonstrated on a number of benchmark data sets. Comparisons among a plain BP classifier, a pre-sampling BP classifier, and the RBMPT classifier show that the pre-training procedure learns better representations from unlabelled data and yields better classification performance on imbalanced data sets.
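The pipeline the abstract describes can be sketched in a few lines: oversample the minority class, run greedy RBM pre-training ("coarse tuning") on all the data, then hand the learned features to a BP-trained classifier. The following is a minimal, self-contained sketch, not the paper's implementation; the toy data, layer sizes, and learning rate are illustrative assumptions, and only a simplified SMOTE and single-layer CD-1 RBM are shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def smote(X_min, n_new, k=3):
    """Simplified SMOTE: synthesize n_new minority samples by interpolating
    between a random minority sample and one of its k nearest minority neighbours."""
    n = len(X_min)
    synth = []
    for _ in range(n_new):
        i = rng.integers(n)
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]          # skip the point itself
        j = rng.choice(nbrs)
        synth.append(X_min[i] + rng.random() * (X_min[j] - X_min[i]))
    return np.array(synth)

class RBM:
    """Bernoulli RBM trained with one-step contrastive divergence (CD-1)."""
    def __init__(self, n_vis, n_hid, lr=0.1):
        self.W = rng.normal(0, 0.01, (n_vis, n_hid))
        self.b = np.zeros(n_vis)   # visible bias
        self.c = np.zeros(n_hid)   # hidden bias
        self.lr = lr

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def hidden_probs(self, v):
        return self._sigmoid(v @ self.W + self.c)

    def cd1_step(self, v0):
        h0 = self.hidden_probs(v0)
        h0_s = (rng.random(h0.shape) < h0).astype(float)   # sample hidden units
        v1 = self._sigmoid(h0_s @ self.W.T + self.b)       # reconstruction
        h1 = self.hidden_probs(v1)
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (h0 - h1).mean(axis=0)

# Hypothetical imbalanced toy data: 40 majority vs 8 minority samples in [0, 1]^6.
X_maj = rng.random((40, 6)) * 0.5
X_min = rng.random((8, 6)) * 0.5 + 0.5
X_min_bal = np.vstack([X_min, smote(X_min, 32)])   # pre-sampling balances classes

# "Coarse tuning": greedy RBM pre-training on all (unlabelled) data.
X_all = np.vstack([X_maj, X_min_bal])
rbm = RBM(n_vis=6, n_hid=4)
for _ in range(50):
    rbm.cd1_step(X_all)

# The learned hidden layer would then initialise a deep network
# that is fine-tuned with back-propagation on the labelled data.
features = rbm.hidden_probs(X_all)
print(features.shape)   # (80, 4)
```

A deep network would stack several such RBMs, each trained on the hidden activations of the one below, before the supervised BP fine-tuning pass.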
Pages: 102 - 110
Page count: 9
Related papers
50 in total
  • [1] Unsupervised Pre-Training of Imbalanced Data for Identification of Wafer Map Defect Patterns
    Shon, Ho Sun
    Batbaatar, Erdenebileg
    Cho, Wan-Sup
    Choi, Seong Gon
    IEEE ACCESS, 2021, 9 : 52352 - 52363
  • [2] Finding a good initial configuration of parameters for restricted Boltzmann machine pre-training
    Xie, Chunzhi
    Lv, Jiancheng
    Li, Xiaojie
    SOFT COMPUTING, 2017, 21 (21) : 6471 - 6479
  • [3] Restricted Boltzmann Machines for Pre-training Deep Gaussian Networks
    Eastwood, Mark
    Jayne, Chrisina
    2013 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2013
  • [4] Training a Quantum Annealing Based Restricted Boltzmann Machine on Cybersecurity Data
    Dixit, Vivek
    Selvarajan, Raja
    Aldwairi, Tamer
    Koshka, Yaroslav
    Novotny, Mark A.
    Humble, Travis S.
    Alam, Muhammad A.
    Kais, Sabre
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2022, 6 (03) : 417 - 428
  • [5] Explicit Cross-lingual Pre-training for Unsupervised Machine Translation
    Ren, Shuo
    Wu, Yu
    Liu, Shujie
    Zhou, Ming
    Ma, Shuai
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019 : 770 - 779
  • [6] Unsupervised Pre-Training for Detection Transformers
    Dai, Zhigang
    Cai, Bolun
    Lin, Yugeng
    Chen, Junying
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (11) : 12772 - 12782
  • [7] Unsupervised Pre-Training for Voice Activation
    Kolesau, Aliaksei
    Sesok, Dmitrij
    APPLIED SCIENCES-BASEL, 2020, 10 (23) : 1 - 13
  • [8] A Study of Speech Recognition for Kazakh Based on Unsupervised Pre-Training
    Meng, Weijing
    Yolwas, Nurmemet
    SENSORS, 2023, 23 (02)
  • [9] TRANSFORMER BASED UNSUPERVISED PRE-TRAINING FOR ACOUSTIC REPRESENTATION LEARNING
    Zhang, Ruixiong
    Wu, Haiwei
    Li, Wubo
    Jiang, Dongwei
    Zou, Wei
    Li, Xiangang
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021 : 6933 - 6937