A Two-Phase Approach for Semi-Supervised Feature Selection

Cited by: 0
Authors
Saxena, Amit [1 ]
Pare, Shreya [2 ]
Meena, Mahendra Singh [2 ]
Gupta, Deepak [3 ]
Gupta, Akshansh [4 ]
Razzak, Imran [5 ]
Lin, Chin-Teng [2 ]
Prasad, Mukesh [2 ]
Affiliations
[1] Guru Ghasidas Univ, Dept Comp Sci & Informat Technol, Bilaspur 495009, Chhattisgarh, India
[2] Univ Technol Sydney, Sch Comp Sci, FEIT, Sydney, NSW 2007, Australia
[3] Natl Inst Technol Arunachal Pradesh, Dept Comp Sci & Engn, Yupia 791112, India
[4] Cent Elect Engn Res Inst, Delhi 110028, India
[5] Deakin Univ, Sch Informat Technol, Geelong, Vic 3217, Australia
Funding
Australian Research Council;
Keywords
feature selection; semi-supervised datasets; classification; clustering; correlation; RECOGNITION;
DOI
10.3390/a13090215
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
This paper proposes a novel approach for selecting a subset of features in semi-supervised datasets, where only some of the patterns are labeled. The process is completed in two phases. In the first phase, Phase-I, the dataset is divided into two parts: the first part contains the labeled patterns and the second part contains the unlabeled patterns. From the first part, a small number of features is identified using the well-known maximum-relevance (computed on the first part) and minimum-redundancy (computed on the whole dataset) criteria, both based on the correlation coefficient. From this identified set, the subset of features that yields the highest classification accuracy with a supervised classifier trained on the labeled patterns is retained for further processing. In the second phase, Phase-II, the patterns of the first and second parts are clustered separately into the number of classes available in the dataset. Each cluster of the first part is assigned the majority class of its patterns, which is already known. Pairs of cluster centroids are then formed across the two parts: each centroid of the second part is paired with the nearest centroid of the first part. Since the class of the first-part centroid is known, the same class is assigned to the paired second-part cluster, whose class is unknown. If the actual classes of the patterns in the second part are available, they can be used to test the classification accuracy on that part. The proposed two-phase approach performs well in terms of classification accuracy and the number of selected features on the benchmark datasets considered.
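A minimal sketch of the two phases described in the abstract, assuming scikit-learn's KMeans as the clustering algorithm and a k-nearest-neighbour classifier as the supervised learner (neither is specified here); the function names phase_one_select and phase_two_label, the max_features cap, and all parameter values are illustrative assumptions, not the authors' implementation.

# Illustrative sketch of the two-phase idea, not the paper's reference code.
# Assumes numeric features and integer class labels starting at 0.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score


def phase_one_select(X_lab, y_lab, X_all, max_features=20):
    """Rank features by correlation-based relevance and redundancy,
    then keep the top-k subset with the best supervised accuracy."""
    # Relevance: |correlation| of each feature (labeled part) with the label.
    relevance = np.abs([np.corrcoef(X_lab[:, j], y_lab)[0, 1]
                        for j in range(X_lab.shape[1])])
    # Redundancy: mean |correlation| of each feature with the others (whole dataset).
    corr_all = np.abs(np.corrcoef(X_all, rowvar=False))
    redundancy = (corr_all.sum(axis=0) - 1.0) / (X_all.shape[1] - 1)
    # Sort so high-relevance, low-redundancy features come first.
    ranking = np.argsort(redundancy - relevance)

    best_subset, best_acc = ranking[:1], -np.inf
    for k in range(1, min(max_features, len(ranking)) + 1):
        subset = ranking[:k]
        acc = cross_val_score(KNeighborsClassifier(n_neighbors=3),
                              X_lab[:, subset], y_lab, cv=3).mean()
        if acc > best_acc:
            best_subset, best_acc = subset, acc
    return best_subset


def phase_two_label(X_lab, y_lab, X_unlab, n_classes, random_state=0):
    """Cluster both parts, label first-part clusters by majority vote, and pass
    each label to the nearest second-part centroid."""
    km_lab = KMeans(n_clusters=n_classes, random_state=random_state).fit(X_lab)
    km_unlab = KMeans(n_clusters=n_classes, random_state=random_state).fit(X_unlab)

    # Majority class inside each labeled cluster.
    cluster_class = np.array([
        np.bincount(y_lab[km_lab.labels_ == c]).argmax()
        for c in range(n_classes)
    ])
    # Pair every unlabeled-part centroid with its nearest labeled-part centroid.
    dists = np.linalg.norm(km_unlab.cluster_centers_[:, None, :]
                           - km_lab.cluster_centers_[None, :, :], axis=2)
    nearest = dists.argmin(axis=1)
    # Propagate the paired cluster's class to the unlabeled patterns.
    return cluster_class[nearest][km_unlab.labels_]

In use, phase_one_select would be run on the selected feature columns first, and the labels returned by phase_two_label for the second part could then be compared against the true labels, when available, to measure accuracy as the abstract describes.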
Pages: 23