Feature Learning With a Divergence-Encouraging Autoencoder for Imbalanced Data Classification

Cited by: 5
Authors
Luo, Ruisen [1 ]
Feng, Qian [1 ]
Wang, Chen [1 ,2 ]
Yang, Xiaomei [1 ]
Tu, Haiyan [1 ]
Yu, Qin [1 ]
Fei, Shaomin [3 ,4 ]
Gong, Xiaofeng [1 ]
Affiliations
[1] Sichuan Univ, Coll Elect Engn & Informat Technol, Chengdu 610064, Sichuan, Peoples R China
[2] UCL, Dept Comp Sci, London WC1E 6BT, England
[3] Chengdu Univ Informat Technol, Expt Ctr Elect, Chengdu 610059, Sichuan, Peoples R China
[4] DaGongBoChuang Corp, Chengdu 610005, Sichuan, Peoples R China
Source
IEEE ACCESS | 2018, Vol. 6
Keywords
Imbalanced data classification; autoencoder; divergence loss; convergence analysis; alternating training paradigm;
DOI
10.1109/ACCESS.2018.2879221
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Imbalanced data is common in machine-learning classification applications. Popular classification algorithms assume that the classes are roughly equally represented, yet extremely skewed data, in which instances of one class make up most of the dataset, is far from exceptional in practice, and the performance of these algorithms often degrades significantly on such data. Mitigating the problems caused by imbalanced data has been an open challenge for years, and previous work has mostly proposed solutions from the perspectives of data re-sampling and algorithm improvement. In this paper, focusing on two-class imbalanced data, we propose a novel divergence-encouraging autoencoder (DEA) that explicitly learns features from both classes, and we design an imbalanced-data classification algorithm based on the proposed autoencoder. By encouraging maximization of a divergence loss between the two classes in the bottleneck layer, the proposed DEA learns features for the majority and minority classes simultaneously. The autoencoder is trained by alternately optimizing the reconstruction and divergence losses. After the features are obtained, we compute the cosine distances between the training and testing features and compare the per-class medians of these distances to perform classification. Experimental results show that our algorithm outperforms ordinary and loss-sensitive CNN models in terms of both performance evaluation metrics and convergence properties. To the best of our knowledge, this is the first work that addresses imbalanced-data classification by explicitly learning representations of the different classes simultaneously. In addition, the design of the proposed DEA is itself an innovative contribution: it improves imbalanced-data classification performance without data re-sampling and can benefit future research in the field.
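The abstract outlines two technical steps: alternating optimization of a reconstruction loss and a bottleneck divergence loss, and classification by comparing per-class medians of cosine distances in the learned feature space. The following is a minimal PyTorch sketch of that flow, assuming a fully connected architecture, a cosine-similarity-between-class-centroids divergence measure, and an epoch-level alternation schedule; these choices and all hyper-parameters are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: the architecture, divergence measure, and the
# alternation schedule are assumptions, not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DEA(nn.Module):
    """Autoencoder whose bottleneck features are pushed apart between classes."""

    def __init__(self, in_dim, bottleneck_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, bottleneck_dim)
        )
        self.decoder = nn.Sequential(
            nn.Linear(bottleneck_dim, 128), nn.ReLU(), nn.Linear(128, in_dim)
        )

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)


def divergence_loss(z, y):
    """Cosine similarity between the two class centroids in the bottleneck space;
    minimizing it pushes the classes apart, i.e. encourages divergence."""
    z0, z1 = z[y == 0], z[y == 1]
    if len(z0) == 0 or len(z1) == 0:   # mini-batch missing one class
        return z.sum() * 0.0           # zero loss that still carries gradients
    return F.cosine_similarity(z0.mean(0, keepdim=True),
                               z1.mean(0, keepdim=True)).mean()


def train_alternating(model, loader, epochs=20, lr=1e-3):
    """Alternate between the reconstruction and divergence objectives."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for epoch in range(epochs):
        for x, y in loader:
            z, x_hat = model(x)
            loss = (F.mse_loss(x_hat, x) if epoch % 2 == 0
                    else divergence_loss(z, y))
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model


@torch.no_grad()
def classify(model, x_test, x_train, y_train):
    """Assign each test point to the class whose training features lie closer,
    measured by the median cosine distance in the bottleneck space."""
    z_test, z_train = model.encoder(x_test), model.encoder(x_train)
    preds = []
    for z in z_test:
        dist = 1.0 - F.cosine_similarity(z.unsqueeze(0).expand_as(z_train), z_train)
        preds.append(0 if dist[y_train == 0].median() < dist[y_train == 1].median() else 1)
    return torch.tensor(preds)
```

The decision rule mirrors the abstract's description: a test sample receives the label of the class whose training features have the smaller median cosine distance to it.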
Pages: 70197-70211
Number of pages: 15
Related Papers
50 records in total
  • [41] Feature Analysis for Imbalanced Learning
    Dao Nam Anh
    Bui Duong Hung
    Pham Quang Huy
    Dang Xuan Tho
    JOURNAL OF ADVANCED COMPUTATIONAL INTELLIGENCE AND INTELLIGENT INFORMATICS, 2020, 24 (05) : 648 - 655
  • [42] Feature Selection in Imbalanced Data
    Kamalov, F.
    Thabtah, F.
    Leung, H. H.
    Annals of Data Science, 2023, 10 (06) : 1527 - 1541
  • [43] Learning misclassification costs for imbalanced classification on gene expression data
    Lu, Huijuan
    Xu, Yige
    Ye, Minchao
    Yan, Ke
    Gao, Zhigang
    Jin, Qun
    BMC BIOINFORMATICS, 2019, 20 (01)
  • [44] Imbalanced data classification: Using transfer learning and active sampling
    Liu, Yang
    Yang, Guoping
    Qiao, Shaojie
    Liu, Meiqi
    Qu, Lulu
    Han, Nan
    Wu, Tao
    Yuan, Guan
    Peng, Yuzhong
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 117
  • [45] Learning misclassification costs for imbalanced classification on gene expression data
    Lu, Huijuan
    Xu, Yige
    Ye, Minchao
    Yan, Ke
    Gao, Zhigang
    Jin, Qun
    BMC BIOINFORMATICS, 2019, 20 (01)
  • [46] Robust multiclass classification for learning from imbalanced biomedical data
    Phoungphol, Piyaphol
    Zhang, Yanqing
    Zhao, Yichuan
    Tsinghua Science and Technology, 2012, 17 (06) : 619 - 628
  • [47] Sampling Approaches for Imbalanced Data Classification Problem in Machine Learning
    Tyagi, Shivani
    Mittal, Sangeeta
    PROCEEDINGS OF RECENT INNOVATIONS IN COMPUTING, ICRIC 2019, 2020, 597 : 209 - 221
  • [48] Robust Multiclass Classification for Learning from Imbalanced Biomedical Data
    Phoungphol, Piyaphol
    Tsinghua Science and Technology, 2012, 17 (06) : 619 - 628
  • [49] Clustering-based incremental learning for imbalanced data classification
    Liu, Yuxin
    Du, Guangyu
    Yin, Chenke
    Zhang, Haichao
    Wang, Jia
    KNOWLEDGE-BASED SYSTEMS, 2024, 292
  • [50] An improved weighted extreme learning machine for imbalanced data classification
    Lu, Chengbo
    Ke, Haifeng
    Zhang, Gaoyan
    Mei, Ying
    Xu, Huihui
    MEMETIC COMPUTING, 2019, 11 (01) : 27 - 34