Unsupervised feature selection via multi-step Markov probability relationship

Cited by: 1
Authors
Min, Yan [1 ]
Ye, Mao [1 ]
Tian, Liang [1 ]
Jian, Yulin [1 ]
Zhu, Ce [2 ]
Yang, Shangming [3 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu 611731, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Informat & Commun Engn, Chengdu 611731, Peoples R China
[3] Univ Elect Sci & Technol China, Sch Informat & Software Engn, Chengdu 611731, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Unsupervised feature selection; Data structure preserving; Multi-step Markov transition probability; Machine learning; DIMENSIONALITY REDUCTION; RECOGNITION;
DOI
10.1016/j.neucom.2021.04.073
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Feature selection is a widely used dimension reduction technique that selects feature subsets and retains interpretability. Many methods have been proposed and have achieved good results, but they mainly model the relationships between adjacent data points, while possible associations between non-adjacent data pairs are usually neglected. Different from previous methods, we propose a novel and very simple approach for unsupervised feature selection, named MMFS (Multi-step Markov Probability Relationship for Feature Selection). The idea is to use multi-step Markov transition probabilities to describe the relation between any pair of data points. Two complementary views, positive and negative, are employed to preserve the data structure after feature selection. From the positive viewpoint, the maximum transition probability that can be reached within a certain number of steps describes the relation between two points, and the features that best preserve this compact data structure are selected. From the negative viewpoint, the minimum transition probability that can be reached within a certain number of steps describes the relation between two points, and the features that least preserve this loose data structure are selected. The two views can also be combined, so three algorithms are proposed in total. Our main contributions are a novel feature selection approach that uses multi-step transition probabilities to characterize the data structure, and three algorithms, derived from the positive and negative views, that select the features preserving this structure. The performance of our approach is compared with state-of-the-art methods on eight real-world data sets, and the experimental results show that the proposed MMFS is effective for unsupervised feature selection. (c) 2021 Elsevier B.V. All rights reserved.
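As a rough illustration (not the authors' reference implementation), the Python sketch below shows how multi-step Markov transition probabilities can be built from pairwise Gaussian similarities and aggregated from the positive viewpoint (element-wise maximum over step counts) and the negative viewpoint (element-wise minimum), as described in the abstract. The Gaussian kernel, the number of steps, and all names are assumptions; the subsequent feature-scoring stage of MMFS is omitted.

    # Hedged sketch of the multi-step Markov transition probabilities
    # described in the abstract; kernel, step count and names are assumptions.
    import numpy as np

    def multi_step_markov(X, n_steps=3, sigma=1.0):
        """Element-wise max/min over the 1..n_steps-step transition matrices."""
        # Pairwise squared Euclidean distances between samples.
        sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        # Gaussian similarity, row-normalised into a one-step Markov
        # transition matrix P (each row sums to 1).
        S = np.exp(-sq_dists / (2.0 * sigma ** 2))
        P = S / S.sum(axis=1, keepdims=True)
        P_k = P.copy()
        P_max, P_min = P.copy(), P.copy()   # positive / negative viewpoints
        for _ in range(1, n_steps):
            P_k = P_k @ P                   # k-step transition probabilities
            P_max = np.maximum(P_max, P_k)  # strongest relation within n_steps
            P_min = np.minimum(P_min, P_k)  # weakest relation within n_steps
        return P_max, P_min

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 10))       # 50 samples, 10 features
        P_max, P_min = multi_step_markov(X)
        print(P_max.shape, P_min.shape)     # (50, 50) (50, 50)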
Pages: 241-253
Number of pages: 13
Related Papers
50 items in total
  • [41] Unsupervised feature selection via multiple graph fusion and feature weight learning
    Tang, Chang
    Zheng, Xiao
    Zhang, Wei
    Liu, Xinwang
    Zhu, Xinzhong
    Zhu, En
    SCIENCE CHINA-INFORMATION SCIENCES, 2023, 66 (05)
  • [42] Unsupervised feature selection via multiple graph fusion and feature weight learning
    Chang TANG
    Xiao ZHENG
    Wei ZHANG
    Xinwang LIU
    Xinzhong ZHU
    En ZHU
    SCIENCE CHINA-INFORMATION SCIENCES, 2023, 66 (05): 56 - 72
  • [43] Two-Dimensional Unsupervised Feature Selection via Sparse Feature Filter
    Li, Junyu
    Chen, Jiazhou
    Qi, Fei
    Dan, Tingting
    Weng, Wanlin
    Zhang, Bin
    Yuan, Haoliang
    Cai, Hongmin
    Zhong, Cheng
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (09) : 5605 - 5617
  • [44] Unsupervised Discriminative Feature Selection via Contrastive Graph Learning
    Zhou, Qian
    Wang, Qianqian
    Gao, Quanxue
    Yang, Ming
    Gao, Xinbo
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, 33 : 972 - 986
  • [45] Unsupervised Feature Selection via Adaptive Graph Learning and Constraint
    Zhang, Rui
    Zhang, Yunxing
    Li, Xuelong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (03) : 1355 - 1362
  • [46] Unsupervised feature selection via maximum projection and minimum redundancy
    Wang, Shiping
    Pedrycz, Witold
    Zhu, Qingxin
    Zhu, William
    KNOWLEDGE-BASED SYSTEMS, 2015, 75 : 19 - 29
  • [47] Unsupervised feature selection with multi-subspace randomization and collaboration
    Huang, Dong
    Cai, Xiaosha
    Wang, Chang-Dong
    KNOWLEDGE-BASED SYSTEMS, 2019, 182
  • [48] Class Discovery via Bimodal Feature Selection in Unsupervised Settings
    Curtis, Jessica
    Kon, Mark
    2015 IEEE 14TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA), 2015, : 699 - 702
  • [49] Unsupervised nonlinear feature selection algorithm via kernel function
    Jiaye Li
    Shichao Zhang
    Leyuan Zhang
    Cong Lei
    Jilian Zhang
    Neural Computing and Applications, 2020, 32 : 6443 - 6454
  • [50] Multi-step incremental forming using local feature based toolpaths
    Carette, Yannick
    Vanhove, Hans
    Duflou, Joost
    18TH INTERNATIONAL CONFERENCE ON SHEET METAL, SHEMET 2019 - NEW TRENDS AND DEVELOPMENTS IN SHEET METAL PROCESSING, 2019, 29 : 28 - 35