Joint sparse latent representation learning and dual manifold regularization for unsupervised feature selection

Cited by: 2
Authors
Huang, Mengshi [1 ,2 ,3 ,4 ]
Chen, Hongmei [1 ,2 ,3 ,4 ]
Mi, Yong [1 ,2 ,3 ,4 ]
Luo, Chuan [5 ]
Horng, Shi-Jinn [6 ,7 ]
Li, Tianrui [1 ,2 ,3 ,4 ]
Institutions
[1] Southwest Jiaotong Univ, Sch Comp & Artificial Intelligence, Chengdu 611756, Peoples R China
[2] Southwest Jiaotong Univ, Natl Engn Lab Integrated Transportat Big Data Appl, Chengdu 611756, Peoples R China
[3] Minist Educ, Engn Res Ctr Sustainable Urban Intelligent Transpo, Chengdu 611756, Peoples R China
[4] Southwest Jiaotong Univ, Mfg Ind Chains Collaborat & Informat Support Tech, Chengdu 611756, Peoples R China
[5] Sichuan Univ, Coll Comp Sci, Chengdu 610065, Peoples R China
[6] Asia Univ, Dept Comp Sci & Informat Engn, Taichung 41354, Taiwan
[7] China Med Univ, China Med Univ Hosp, Dept Med Res, Taichung, Taiwan
Keywords
Unsupervised feature selection; Sparse regression; Latent representation learning; Manifold regularization; GRAPH; ENTROPY; IMAGE;
DOI
10.1016/j.knosys.2023.111105
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
As an effective dimensionality reduction method, unsupervised feature selection (UFS) focuses on the mutual correlations between high-dimensional data features but often overlooks the intrinsic relationships between instances. Pseudo-labels learned from the data are also widely used to guide feature selection in UFS. However, the raw data space may contain noise and outliers, which lowers the accuracy of the learned pseudo-label matrix. We propose a minimum-redundancy UFS approach, SLRDR, that tackles these problems by jointly combining sparse latent representation learning with dual manifold regularization. Firstly, SLRDR learns a latent representation subspace by exploring the interconnections within the original data; to enhance subspace sparsity, the l2,1-norm is applied to the residual matrix of latent representation learning. Pseudo-label matrix learning is then carried out in this high-quality latent space, yielding effective pseudo-label information that provides more useful guidance for sparse regression. Secondly, based on the manifold learning hypothesis, SLRDR exploits the local structural properties of features in feature space and explores the association between data and labels, allowing the model to learn richer and more accurate structural information. In addition, the l2,1/2-norm is imposed on the weight matrix to obtain a minimum-redundancy solution and select more discriminative features. Finally, an alternating iterative method is used to solve the optimization problem of the SLRDR objective function, and the convergence of the model is analyzed theoretically. A series of comparative experiments against ten existing algorithms on nine benchmark datasets verifies the model's effectiveness.
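For a concrete picture of how this kind of pipeline fits together, the following Python sketch illustrates the general workflow only; it is not the authors' SLRDR algorithm. It replaces the latent representation step with a plain spectral embedding used as pseudo-labels, approximates the row-sparse penalty (the l2,1-norm of a matrix sums the Euclidean norms of its rows; the l2,1/2 penalty drives even more rows to zero) with an iteratively reweighted l2,1 scheme, and keeps a single graph-Laplacian manifold term. All function and parameter names (ufs_sketch, alpha, beta, n_neighbors, and so on) are illustrative assumptions.

# A minimal, simplified sketch of the sparse-regression-plus-manifold idea behind
# SLRDR-style unsupervised feature selection. NOT the authors' algorithm: the
# latent representation step is replaced by a spectral embedding used as
# pseudo-labels, and the l2,1/2 penalty is approximated by iteratively
# reweighted l2,1 regularization. All names and values are illustrative.
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def ufs_sketch(X, n_clusters=5, n_neighbors=5, alpha=1.0, beta=0.1, n_iter=30):
    """Return one score per feature (higher = more discriminative)."""
    n, d = X.shape

    # 1) Symmetric kNN affinity graph and unnormalized Laplacian on the instances.
    A = kneighbors_graph(X, n_neighbors, mode="connectivity", include_self=False)
    A = 0.5 * (A + A.T).toarray()
    L = np.diag(A.sum(axis=1)) - A

    # 2) Pseudo-label matrix F: bottom non-trivial eigenvectors of L
    #    (a plain spectral embedding standing in for latent representation learning).
    _, vecs = eigh(L)
    F = vecs[:, 1:n_clusters + 1]

    # 3) Sparse regression X W ~= F with a row-sparse penalty on W and a
    #    manifold term Tr(W^T X^T L X W), solved by iteratively reweighted
    #    least squares (IRLS) for the l2,1-style penalty.
    W = np.linalg.lstsq(X, F, rcond=None)[0]
    XtX, XtF = X.T @ X, X.T @ F
    M = X.T @ L @ X
    for _ in range(n_iter):
        row_norms = np.maximum(np.linalg.norm(W, axis=1), 1e-8)
        Dw = np.diag(1.0 / (2.0 * row_norms))   # reweighting matrix for the l2,1 term
        W = np.linalg.solve(XtX + alpha * Dw + beta * M, XtF)

    # 4) Rank features by the l2 norm of the corresponding rows of W.
    return np.linalg.norm(W, axis=1)

# Example usage: keep the 50 top-scoring features of a random data matrix.
if __name__ == "__main__":
    X = np.random.rand(200, 300)
    scores = ufs_sketch(X)
    selected = np.argsort(scores)[::-1][:50]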
Pages: 18