Unsupervised Pre-training Classifier Based on Restricted Boltzmann Machine with Imbalanced Data

Cited: 0
|
Authors
Fu, Xiaoyang [1 ]
Affiliations
[1] Jilin Univ, Zhuhai Coll, Minist Educ, Dept Comp Sci & Technol, Zhuhai Key Lab Symbol Com, Zhuhai 519041, Peoples R China
Source
SMART COMPUTING AND COMMUNICATION, SMARTCOM 2016 | 2017 / Vol. 10135
Keywords
Semi-supervised learning; Classification; Deep learning; Restricted Boltzmann machine; Deep neural network
DOI
10.1007/978-3-319-52015-5_11
Chinese Library Classification
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Many learning algorithms suffer from a performance bias when classifying imbalanced data. This paper proposes pre-training a deep neural network with the restricted Boltzmann machine (RBM) learning algorithm on data pre-sampled by the standard SMOTE method for imbalanced-data classification. First, a new training set is generated from the original examples by pre-sampling; second, the deep network is trained on the sampled data and all unlabelled data by the greedy layer-wise RBM algorithm, a stage referred to as "coarse tuning"; finally, the network is fine-tuned by the back-propagation (BP) algorithm. The effectiveness of the resulting RBM pre-trained neural network (RBMPT) classifier is demonstrated on a number of benchmark data sets. Comparing a plain BP classifier, a pre-sampling BP classifier, and the RBMPT classifier shows that the pre-training procedure learns better representations from unlabelled data and achieves better classification performance on imbalanced data sets.
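To make the three-stage workflow concrete, here is a minimal sketch assuming scikit-learn's BernoulliRBM and MLPClassifier together with imbalanced-learn's SMOTE as stand-ins for the components named above. The data set, layer widths, and hyperparameters are illustrative assumptions; the sketch pre-trains only on the resampled labelled data rather than also on unlabelled examples, and the paper's RBMPT classifier additionally carries the RBM weights into the fine-tuned network, which the stacked pipeline below only approximates.

```python
# A minimal sketch of the SMOTE pre-sampling + RBM pre-training + BP
# fine-tuning workflow, using scikit-learn and imbalanced-learn as
# assumed stand-ins (not the paper's exact implementation).
from sklearn.datasets import make_classification
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM, MLPClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from imblearn.over_sampling import SMOTE

# Imbalanced toy data; the paper's benchmark sets are not listed here.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0)

# Step 1: pre-sampling with standard SMOTE to balance the training set.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_train, y_train)

# Step 2: greedy layer-wise RBM pre-training ("coarse tuning").
# BernoulliRBM expects inputs in [0, 1], hence the MinMaxScaler; each
# RBM's hidden-unit probabilities feed the next layer.
# Step 3: supervised fine-tuning by back-propagation (an MLP head here).
rbmpt = Pipeline([
    ("scale", MinMaxScaler()),
    ("rbm1", BernoulliRBM(n_components=64, learning_rate=0.05,
                          n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=32, learning_rate=0.05,
                          n_iter=20, random_state=0)),
    ("bp", MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                         random_state=0)),
])
rbmpt.fit(X_bal, y_bal)

# Balanced accuracy is a fairer summary than plain accuracy on skewed data.
print("balanced accuracy:",
      balanced_accuracy_score(y_test, rbmpt.predict(X_test)))
```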
Pages: 102-110
Number of pages: 9
Related papers
50 records in total
  • [41] Pre-Training on Mixed Data for Low-Resource Neural Machine Translation
    Zhang, Wenbo
    Li, Xiao
    Yang, Yating
    Dong, Rui
    INFORMATION, 2021, 12 (03)
  • [42] Quality assessment of view synthesis based on unsupervised quality-aware pre-training
    Shi, Haozhi
    Huang, Yipo
    Wang, Lizhe
    Wang, Lanmei
    APPLIED SOFT COMPUTING, 2024, 154
  • [43] Paired Restricted Boltzmann Machine for Linked Data
    Wang, Suhang
    Tang, Jiliang
    Morstatter, Fred
    Liu, Huan
    CIKM'16: PROCEEDINGS OF THE 2016 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2016, : 1753 - 1762
  • [44] Training Restricted Boltzmann Machine with Dynamic Learning Rate
    Luo, Linkai
    Wang, Yudan
    Peng, Hong
    Tang, Zhimin
    You, Shiyang
    Huang, Xiaoqin
    2016 11TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE & EDUCATION (ICCSE), 2016, : 103 - 107
  • [45] Continuous restricted Boltzmann machine with an implementable training algorithm
    Chen, H
    Murray, AF
    IEE PROCEEDINGS-VISION IMAGE AND SIGNAL PROCESSING, 2003, 150 (03): : 153 - 158
  • [46] Generative and discriminative infinite restricted Boltzmann machine training
    Wang, Qianglong
    Gao, Xiaoguang
    Wan, Kaifang
    Hu, Zijian
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37 (10) : 7857 - 7887
  • [47] Unsupervised Audio Segmentation based on Restricted Boltzmann Machines
    Pikrakis, Aggelos
    5TH INTERNATIONAL CONFERENCE ON INFORMATION, INTELLIGENCE, SYSTEMS AND APPLICATIONS, IISA 2014, 2014, : 311 - 314
  • [48] Online RBM: Growing Restricted Boltzmann Machine on the fly for unsupervised representation
    Savitha, Ramasamy
    Ambikapathi, ArulMurugan
    Rajaraman, Kanagasabai
    APPLIED SOFT COMPUTING, 2020, 92
  • [49] Multilingual Denoising Pre-training for Neural Machine Translation
    Liu, Yinhan
    Gu, Jiatao
    Goyal, Naman
    Li, Xian
    Edunov, Sergey
    Ghazvininejad, Marjan
    Lewis, Mike
    Zettlemoyer, Luke
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2020, 8 : 726 - 742
  • [50] Curriculum pre-training for stylized neural machine translation
    Zou, Aixiao
    Wu, Xuanxuan
    Li, Xinjie
    Zhang, Ting
    Cui, Fuwei
    Xu, Jinan
    APPLIED INTELLIGENCE, 2024, 54 (17-18) : 7958 - 7968