Filter-based unsupervised feature selection using Hilbert–Schmidt independence criterion

Cited by: 0
Authors
Samaneh Liaghat
Eghbal G. Mansoori
Affiliation
[1] Shiraz University, School of Electrical and Computer Engineering
Source
International Journal of Machine Learning and Cybernetics | 2019, Vol. 10
Keywords
Feature selection; Kernel methods; Independence measure; Hilbert–Schmidt independence criterion
DOI
Not available
Abstract
Feature selection is a fundamental preprocessing step before the actual learning, especially in the unsupervised setting where the data are unlabeled. When a problem has too many features, dimensionality reduction by discarding weak features is highly desirable. In this paper, we present a framework for unsupervised feature selection based on maximizing the dependency between the sample similarity matrices computed before and after deleting a feature. To this end, a novel estimator of the Hilbert–Schmidt independence criterion (HSIC), better suited to high-dimensional data with a small sample size, is introduced. Its key idea is that eliminating redundant features, or features with high inter-relevancy, should not seriously affect the pairwise similarity between samples. In addition, to handle diagonally dominant matrices, a heuristic trick is used to reduce the dynamic range of the matrix values. To speed up the proposed scheme, the gap statistic and k-means clustering methods are also employed. To assess the performance of our method, experiments on benchmark datasets are conducted; the obtained results confirm the efficiency of our unsupervised feature selection scheme.
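As an illustration of the general idea described in the abstract (not the authors' exact method), the sketch below scores each feature by the standard biased empirical HSIC estimator of Gretton et al., applied to RBF similarity matrices computed with and without that feature. The function names (rbf_kernel, hsic, redundancy_scores) and the median-heuristic kernel bandwidth are assumptions made for this example; the paper's modified HSIC estimate, the dynamic-range trick for diagonally dominant matrices, and the gap-statistic/k-means speedup are not reproduced here.

    import numpy as np

    def rbf_kernel(X, sigma=None):
        """Pairwise Gaussian (RBF) kernel matrix over the rows of X."""
        sq = np.sum(X ** 2, axis=1)
        d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
        if sigma is None:
            # Median heuristic for the bandwidth (a common default, assumed here).
            sigma = np.sqrt(0.5 * np.median(d2[d2 > 0]))
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def hsic(K, L):
        """Standard biased empirical HSIC: tr(K H L H) / (n - 1)^2."""
        n = K.shape[0]
        H = np.eye(n) - np.ones((n, n)) / n
        return np.trace(K @ H @ L @ H) / (n - 1) ** 2

    def redundancy_scores(X):
        """For each feature j, measure the dependency between the similarity
        matrix of the full data and the one computed after deleting feature j.
        A high score means removing the feature barely changes the pairwise
        similarity structure, marking it as a candidate for elimination."""
        K_full = rbf_kernel(X)
        scores = np.empty(X.shape[1])
        for j in range(X.shape[1]):
            K_wo_j = rbf_kernel(np.delete(X, j, axis=1))
            scores[j] = hsic(K_full, K_wo_j)
        return scores

Given a data matrix X of shape (n samples, d features), redundancy_scores(X) returns one score per feature; features with the highest scores would be the first candidates for removal under this criterion, and could then be pruned or grouped, for example by the k-means step mentioned in the abstract.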
Pages: 2313–2328 (15 pages)