Subspace learning for feature selection via rank revealing QR factorization: Fast feature selection

Cited: 3
Authors
Moslemi, Amir [1 ]
Ahmadian, Arash [2 ]
Affiliations
[1] Seneca Polytech, Sch Software Design & Data Sci, Toronto, ON, Canada
[2] Univ Toronto, Edward S Rogers Sr Dept Elect & Comp Engn, Toronto, ON M5S 1A1, Canada
Keywords
Feature selection; Rank revealing QR factorization; Non-negative matrix factorization; Genetic algorithm and hybrid feature selection; UNSUPERVISED FEATURE-SELECTION; SUPERVISED FEATURE-SELECTION; MATRIX FACTORIZATION; MUTUAL INFORMATION; CLASSIFICATION; OPTIMIZATION; ALGORITHMS; APPROXIMATION; REDUCTION; PATTERNS;
DOI
10.1016/j.eswa.2024.124919
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The identification of informative and distinguishing features in high-dimensional data has gained significant attention in machine learning. Recently, there has been growing interest in matrix factorization-based techniques, such as non-negative matrix factorization (NMF), for feature selection. The primary objective of feature selection via matrix factorization is to extract a lower-dimensional subspace that captures the essence of the original space. This study introduces a novel unsupervised feature selection technique that leverages rank revealing QR (RRQR) factorization. Compared to singular value decomposition (SVD) and NMF, RRQR is more computationally efficient. The uniqueness of this technique lies in using the permutation matrix of the pivoted QR factorization for feature selection. Additionally, we integrate QR factorization into the objective function of NMF to create a new unsupervised feature selection method. Furthermore, we propose a hybrid feature selection algorithm that combines RRQR with a genetic algorithm: RRQR factorization first eliminates redundant features, and the genetic algorithm then selects the most discriminative subset. Experimental comparisons with state-of-the-art feature selection algorithms in supervised, unsupervised, and semi-supervised settings demonstrate the reliability and robustness of the proposed algorithm. The evaluation is conducted on eight microarray datasets using KNN, SVM, and C4.5 classifiers. The results indicate that the proposed method achieves performance comparable to state-of-the-art feature selection methods at a significantly lower computational cost.
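The core idea described above, using the permutation from a rank-revealing QR factorization to pick columns (features), can be sketched with SciPy's pivoted QR, which is a basic rank-revealing QR. This is a minimal illustration, not the paper's exact algorithm: the function name and the toy dataset are ours, and the paper's NMF-integrated objective and genetic-algorithm stage are not shown.

```python
import numpy as np
from scipy.linalg import qr


def rrqr_feature_selection(X, k):
    """Select k columns (features) of X via QR with column pivoting.

    Column pivoting greedily orders columns so that leading pivots
    correspond to the most linearly independent columns; the first k
    entries of the permutation are the selected feature indices.
    """
    # pivoting=True performs QR with column pivoting (a basic RRQR)
    _, _, piv = qr(X, mode="economic", pivoting=True)
    return piv[:k]


# Toy data: feature 2 duplicates feature 0, feature 3 is near-constant,
# so a good 2-feature subset avoids both kinds of redundancy.
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 2))
X = np.column_stack([base[:, 0], base[:, 1],
                     base[:, 0],                      # exact duplicate of feature 0
                     1e-6 * rng.normal(size=100)])    # near-zero feature
selected = rrqr_feature_selection(X, k=2)
print(sorted(selected))
```

Note that this greedy column selection only removes linear redundancy; in the paper's hybrid scheme such an RRQR pass is followed by a genetic-algorithm search over the surviving features.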
Pages: 18