Discriminative and Robust Autoencoders for Unsupervised Feature Selection

Cited by: 3
Authors
Ling, Yunzhi [1 ,2 ]
Nie, Feiping [1 ,2 ,3 ]
Yu, Weizhong [2 ]
Li, Xuelong [2 ,3 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Shaanxi, Peoples R China
[2] Northwestern Polytech Univ, Sch Artificial Intelligence Opt & Elect iOPEN, Xian 710072, Shaanxi, Peoples R China
[3] Northwestern Polytech Univ, Key Lab Intelligent Interact & Applicat, Minist Ind & Informat Technol, Xian 710072, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Data models; Robustness; Representation learning; Optics; Clustering algorithms; Training data; Autoencoders (AEs); clustering; feature selection; neural networks; robustness; unsupervised learning; CLASSIFICATION;
DOI
10.1109/TNNLS.2023.3333737
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Much recent research on unsupervised feature selection (UFS) has focused on exploiting autoencoders (AEs) to seek informative features. However, existing methods typically employ the squared error to estimate the data reconstruction, which amplifies the negative effect of outliers and can lead to performance degradation. Moreover, traditional AEs aim to extract latent features that capture the intrinsic information of the data for accurate data recovery. Without explicit cluster-structure-detecting objectives in the training criterion, AEs fail to capture the latent cluster structure of the data, which is essential for identifying discriminative features; thus, the selected features lack strong discriminative power. To address these issues, we propose to jointly perform robust feature selection and k-means clustering in a unified framework. Concretely, we exploit an AE with an l2,1-norm as a basic model to seek informative features. To improve robustness against outliers, we introduce an adaptive weight vector for the data reconstruction terms of the AE, which assigns smaller weights to data with larger errors to automatically reduce the influence of outliers, and larger weights to data with smaller errors to strengthen the influence of clean data. To enhance the discriminative power of the selected features, we incorporate k-means clustering into the representation learning of the AE. This allows the AE to continually explore cluster structure information, which can be used to discover more discriminative features. We also present an efficient approach to solve the resulting optimization problem. Extensive experiments on various benchmark datasets clearly demonstrate that the proposed method outperforms state-of-the-art methods.
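To make the three ingredients of the abstract concrete, the sketch below assembles a toy version of the objective: a weighted per-sample reconstruction loss, an l2,1-norm penalty on the encoder rows (whose row norms score features), and a k-means term on the latent codes. This is a minimal illustration, not the paper's algorithm: the linear encoder/decoder, the inverse-error weighting rule, the function names, and the trade-off parameters `alpha`/`beta` are all assumptions for exposition; the paper's actual network, weight update, and optimization scheme may differ.

```python
import numpy as np

def l21_norm(W):
    # l2,1-norm: sum of the Euclidean norms of the rows of W.
    # Penalizing it drives whole rows to zero, i.e. discards features.
    return np.sum(np.linalg.norm(W, axis=1))

def adaptive_weights(errors, eps=1e-12):
    # Hypothetical robust reweighting: samples with larger reconstruction
    # error receive smaller weights (an inverse-square-root scheme is one
    # common choice; the paper's exact update rule may differ).
    w = 1.0 / (2.0 * np.sqrt(errors + eps))
    return w / w.sum()

def objective(X, W_enc, W_dec, centroids, labels, alpha=0.1, beta=0.1):
    # X: (n, d) data; W_enc: (d, k) encoder; W_dec: (k, d) decoder.
    H = X @ W_enc                             # latent codes (linear AE sketch)
    R = H @ W_dec                             # reconstruction
    errors = np.sum((X - R) ** 2, axis=1)     # per-sample squared error
    s = adaptive_weights(errors)              # robust sample weights
    recon = np.sum(s * errors)                # weighted reconstruction loss
    sparsity = alpha * l21_norm(W_enc)        # row sparsity -> feature selection
    km = beta * np.sum((H - centroids[labels]) ** 2)  # k-means term on codes
    return recon + sparsity + km

def select_features(W_enc, m):
    # Rank features by the l2 norm of the corresponding encoder row
    # and keep the top m.
    scores = np.linalg.norm(W_enc, axis=1)
    return np.argsort(-scores)[:m]
```

In a full method, `objective` would be minimized over the AE weights while the centroids, cluster labels, and sample weights are updated alternately; here it only shows how the three terms combine and how row norms of `W_enc` yield a feature ranking.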
Pages: 1622 - 1636
Page count: 15
Related papers
50 records in total
  • [1] Exploring Autoencoders for Unsupervised Feature Selection
    Chandra, B.
    Sharma, Rajesh K.
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,
  • [2] Discriminative embedded unsupervised feature selection
    Zhu, Qi-Hai
    Yang, Yu-Bin
    PATTERN RECOGNITION LETTERS, 2018, 112 : 219 - 225
  • [3] Unsupervised Discriminative Projection for Feature Selection
    Wang, Rong
    Bian, Jintang
    Nie, Feiping
    Li, Xuelong
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2022, 34 (02) : 942 - 953
  • [4] Differentiable gated autoencoders for unsupervised feature selection
    Chen, Zebin
    Bian, Jintang
    Qiao, Bo
    Xie, Xiaohua
    NEUROCOMPUTING, 2024, 601
  • [5] UNSUPERVISED FEATURE RANKING AND SELECTION BASED ON AUTOENCODERS
    Sharifipour, Sasan
    Fayyazi, Hossein
    Sabokrou, Mohammad
    Adeli, Ehsan
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 3172 - 3176
  • [6] Local and Global Discriminative Learning for Unsupervised Feature Selection
    Du, Liang
    Shen, Zhiyong
    Li, Xuan
    Zhou, Peng
    Shen, Yi-Dong
    2013 IEEE 13TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2013, : 131 - 140
  • [7] Unsupervised Robust Bayesian Feature Selection
    Sun, Jianyong
    Zhou, Aimin
    PROCEEDINGS OF THE 2014 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2014, : 558 - 564
  • [8] Robust autoencoder feature selector for unsupervised feature selection
    Ling, Yunzhi
    Nie, Feiping
    Yu, Weizhong
    Ling, Yunhao
    Li, Xuelong
    INFORMATION SCIENCES, 2024, 660
  • [9] Unsupervised Discriminative Feature Selection via Contrastive Graph Learning
    Zhou, Qian
    Wang, Qianqian
    Gao, Quanxue
    Yang, Ming
    Gao, Xinbo
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, 33 : 972 - 986
  • [10] Unsupervised feature selection for tumor profiles using autoencoders and kernel methods
    Palazzo, Martin
    Beauseroy, Pierre
    Yankilevich, Patricio
    2020 IEEE CONFERENCE ON COMPUTATIONAL INTELLIGENCE IN BIOINFORMATICS AND COMPUTATIONAL BIOLOGY (CIBCB), 2020, : 187 - 194