Utility metric for unsupervised feature selection

Cited by: 3
Authors
Villa, Amalia [1 ,2 ]
Narayanan, Abhijith Mundanad [1 ,2 ]
Van Huffel, Sabine [1 ,2 ]
Bertrand, Alexander [1 ,2 ]
Varon, Carolina [1 ,3 ,4 ]
Affiliations
[1] Katholieke Univ Leuven, Dept Elect Engn ESAT, STADIUS Ctr Dynam Syst Signal Proc & Data Analyt, Leuven, Belgium
[2] KU Leuven Inst AI, Leuven AI, Leuven, Belgium
[3] Delft Univ Technol, Circuits & Syst CAS Grp, Delft, Netherlands
[4] Katholieke Univ Leuven, E Media Res Lab, Campus GroepT, Leuven, Belgium
Funding
EU Horizon 2020; European Research Council;
Keywords
Unsupervised feature selection; Dimensionality reduction; Manifold learning; Kernel methods; SUBSET-SELECTION; ALGORITHM;
DOI
10.7717/peerj-cs.477
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Feature selection techniques are useful approaches for dimensionality reduction in data analysis. They provide interpretable results by reducing the data to a subset of the original set of features. When the data lack annotations, unsupervised feature selectors are required for their analysis. Several algorithms exist for this purpose, but despite their wide applicability, they can be inaccessible or cumbersome to use, mainly due to the need to tune non-intuitive parameters and their high computational demands. In this work, a publicly available, ready-to-use unsupervised feature selector is proposed, with results comparable to the state-of-the-art at a much lower computational cost. The suggested approach belongs to the methods known as spectral feature selectors, which generally consist of two stages: manifold learning and subset selection. In the first stage, the underlying structures in the high-dimensional data are extracted, while in the second stage a subset of the features is selected to replicate these structures. This paper contributes to each of these stages. In the manifold learning stage, the effect of non-linearities in the data is explored using a radial basis function (RBF) kernel, and an alternative solution for estimating the kernel parameter is presented for cases with high-dimensional data. Additionally, a backwards greedy approach based on the least-squares utility metric is proposed for the subset selection stage. The combination of these new ingredients results in the Utility metric for Unsupervised Feature Selection (U2FS) algorithm. The proposed U2FS algorithm succeeds in selecting the correct features in a simulation environment. In addition, its performance on benchmark datasets is comparable to the state-of-the-art while requiring less computational time. Moreover, unlike the state-of-the-art, U2FS does not require any tuning of parameters.
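The two-stage recipe described in the abstract (manifold learning on an RBF-kernel graph, then backwards greedy subset selection with a least-squares utility) can be sketched roughly as follows. This is an illustrative re-implementation, not the authors' published code: the function name `u2fs_sketch` is made up here, and a simple median-distance bandwidth heuristic stands in for the paper's kernel-parameter estimator.

```python
import numpy as np

def u2fs_sketch(X, n_select, n_eigvecs=2):
    """Hypothetical sketch of two-stage spectral feature selection
    in the spirit of U2FS (not the authors' implementation).

    Stage 1 (manifold learning): eigenvectors of the symmetric
    normalized Laplacian of an RBF-kernel similarity graph.
    Stage 2 (subset selection): backwards greedy elimination, dropping
    the feature whose removal hurts the least-squares reconstruction
    of the eigenvectors the least.
    """
    Xc = X - X.mean(axis=0)          # center features for the regression
    n, d = X.shape

    # --- Stage 1: RBF affinity; bandwidth from a median-distance
    # heuristic (the paper proposes its own estimator instead).
    sq = np.sum(Xc ** 2, axis=1)
    D2 = np.maximum(sq[:, None] + sq[None, :] - 2 * Xc @ Xc.T, 0.0)
    W = np.exp(-D2 / np.median(D2[D2 > 0]))
    dinv = 1.0 / np.sqrt(W.sum(axis=1))
    L = np.eye(n) - dinv[:, None] * W * dinv[None, :]  # normalized Laplacian
    _, vecs = np.linalg.eigh(L)                         # ascending eigenvalues
    E = vecs[:, 1:n_eigvecs + 1]     # skip the trivial first eigenvector

    # --- Stage 2: backwards greedy with a least-squares utility metric.
    def ls_error(cols):
        coef, *_ = np.linalg.lstsq(Xc[:, cols], E, rcond=None)
        return np.sum((Xc[:, cols] @ coef - E) ** 2)

    selected = list(range(d))
    while len(selected) > n_select:
        errs = [ls_error([c for c in selected if c != f]) for f in selected]
        selected.pop(int(np.argmin(errs)))  # cheapest feature to discard
    return sorted(selected)

# Toy check: three clusters whose structure lives in features 0 and 1 only;
# the remaining four features are pure noise.
rng = np.random.default_rng(0)
labels = np.repeat([0, 1, 2], 20)
X = rng.normal(scale=0.3, size=(60, 6))
X[:, 0] += 5.0 * (labels >= 1)   # feature 0 splits cluster 0 from {1, 2}
X[:, 1] += 5.0 * (labels == 2)   # feature 1 splits cluster 2 from {0, 1}
print(u2fs_sketch(X, n_select=2))
```

On this toy data the sketch keeps the two structure-carrying features. Centering the features before the regression matters: the Laplacian eigenvectors are orthogonal to the trivial (near-constant) eigenvector, so uncentered features would need an intercept to fit them.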
Pages: 1-26 (26 pages)
Related papers
50 records
  • [1] Beyond Redundancies: A Metric-Invariant Method for Unsupervised Feature Selection
    Hou, Yuexian
    Zhang, Peng
    Yan, Tingxu
    Li, Wenjie
    Song, Dawei
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2010, 22 (03) : 348 - 364
  • [2] Unsupervised Feature Selection with Feature Clustering
    Cheung, Yiu-ming
    Jia, Hong
    2012 IEEE/WIC/ACM INTERNATIONAL CONFERENCE ON WEB INTELLIGENCE AND INTELLIGENT AGENT TECHNOLOGY (WI-IAT 2012), VOL 1, 2012, : 9 - 15
  • [3] Unsupervised Feature Selection via Metric Fusion and Novel Low-Rank Approximation
    Long, Yin
    Chen, Liang
    Li, Linfeng
    Shi, Rong
    IEEE ACCESS, 2022, 10 : 101474 - 101482
  • [4] Embedded Unsupervised Feature Selection
    Wang, Suhang
    Tang, Jiliang
    Liu, Huan
    PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 470 - 476
  • [5] Feature selection for unsupervised learning
    Dy, JG
    Brodley, CE
    JOURNAL OF MACHINE LEARNING RESEARCH, 2004, 5 : 845 - 889
  • [6] Unsupervised Personalized Feature Selection
    Li, Jundong
    Wu, Liang
    Dani, Harsh
    Liu, Huan
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 3514 - 3521
  • [7] Feature Selection for Unsupervised Learning
    Adhikary, Jyoti Ranjan
    Murty, M. Narasimha
    NEURAL INFORMATION PROCESSING, ICONIP 2012, PT III, 2012, 7665 : 382 - 389
  • [8] Feature weighting as a tool for unsupervised feature selection
    Panday, Deepak
    de Amorim, Renato Cordeiro
    Lane, Peter
    INFORMATION PROCESSING LETTERS, 2018, 129 : 44 - 52
  • [9] Unsupervised Feature Selection Based on Feature Relevance
    Zhang, Feng
    Zhao, Ya-Jun
    Chen, Jun-Fen
    PROCEEDINGS OF 2009 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-6, 2009, : 487 - +
  • [10] Unsupervised feature selection using feature similarity
    Mitra, P
    Murthy, CA
    Pal, SK
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2002, 24 (03) : 301 - 312