Constrained class-wise feature selection (CCFS)

Cited by: 0
Authors
Syed Fawad Hussain
Fatima Shahzadi
Badre Munir
Affiliations
[1] G.I.K. Institute of Engineering Sciences and Technology, Machine Learning and Data Science Lab (MDS)
[2] G.I.K. Institute
Keywords
Feature selection; Information theory; Classification; Class-wise feature selection
DOI
Not available
Abstract
Feature selection plays a vital role as a preprocessing step for high-dimensional data in machine learning. Its basic purpose is to avoid the "curse of dimensionality" and to reduce the time and space complexity of training. Several techniques, including those based on information theory, have been proposed in the literature to measure the information content of a feature. Most of them incrementally select features with maximum dependency on the class label but minimum redundancy with the already selected features. A key idea missing in these techniques is a fair representation of all classes among the selected features: selection can be skewed toward features that have high mutual information (MI) with one particular class. This can bias classification in favor of that class while the other classes receive low matching scores. We propose a novel information-theoretic approach that selects features in a class-wise fashion rather than by their global maximum dependency. In addition, a constrained search is used instead of a global sequential forward search. We prove that our proposed approach enhances maximum relevance while preserving minimum redundancy under a constrained search. Results on multiple benchmark datasets show that the proposed method improves accuracy compared to other state-of-the-art feature selection algorithms while having a lower time complexity.
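The core idea of class-wise selection, as opposed to global max-dependency selection, can be illustrated with a minimal sketch. This is not the authors' CCFS algorithm (the constrained search and the redundancy term are omitted), and the names `mutual_info` and `classwise_select` are hypothetical; it only shows the per-class scoring step, where each class is given its own top-k features by one-vs-rest mutual information and the union is returned:

```python
# Minimal sketch of class-wise MI feature selection (illustration only,
# not the CCFS algorithm from the paper: no constrained search, no
# redundancy penalty). Assumes discrete-valued features.
import numpy as np

def mutual_info(x, y):
    """MI (in nats) between two discrete vectors, from joint frequencies."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                px = np.mean(x == xv)
                py = np.mean(y == yv)
                mi += pxy * np.log(pxy / (px * py))
    return mi

def classwise_select(X, y, k_per_class):
    """Top-k features per class by one-vs-rest MI; return the union.

    Unlike a global ranking, every class contributes its own best
    features, so no class is under-represented in the selected set.
    """
    selected = set()
    for c in np.unique(y):
        indicator = (y == c).astype(int)  # one-vs-rest class label
        scores = [mutual_info(X[:, j], indicator) for j in range(X.shape[1])]
        top = np.argsort(scores)[::-1][:k_per_class]
        selected.update(top.tolist())
    return sorted(selected)
```

A global ranking would score each feature once against the full label vector; here a feature strongly tied to even a single minority class still makes that class's own top-k list.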
Pages: 3211–3224 (13 pages)
Related papers
50 records
  • [1] Constrained class-wise feature selection (CCFS)
    Hussain, Syed Fawad
    Shahzadi, Fatima
    Munir, Badre
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2022, 13 (10) : 3211 - 3224
  • [2] Class-wise feature extraction technique for multimodal data
    Silva, Elias R., Jr.
    Cavalcanti, George D. C.
    Ren, Tsang Ing
    NEUROCOMPUTING, 2016, 214 : 1001 - 1010
  • [3] Class-wise Information Gain
    Zhang, Pengtao
    Tan, Ying
    2013 INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE AND TECHNOLOGY (ICIST), 2013, : 972 - 978
  • [4] Deep Class-Wise Hashing: Semantics-Preserving Hashing via Class-Wise Loss
    Zhe, Xuefei
    Chen, Shifeng
    Yan, Hong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (05) : 1681 - 1695
  • [5] Imposing Class-Wise Feature Similarity in Stacked Autoencoders by Nuclear Norm Regularization
    Gupta, Kavya
    Majumdar, Angshul
    NEURAL PROCESSING LETTERS, 2018, 48 (01) : 615 - 629
  • [7] Class-wise and reduced calibration methods
    Panchenko, Michael
    Benmerzoug, Anes
    Delgado, Miguel de Benito
    2022 21ST IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS, ICMLA, 2022, : 1093 - 1100
  • [8] Class-wise Deep Dictionary Learning
    Singhal, Vanika
    Khurana, Prerna
    Majumdar, Angshul
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 1125 - 1132
  • [9] Improved Open World Object Detection Using Class-Wise Feature Space Learning
    Iqbal, Muhammad Ali
    Yoon, Yeo Chan
    Khan, Muhammad U. S.
    Kim, Soo Kyun
    IEEE ACCESS, 2023, 11 : 131221 - 131236
  • [10] Calculating Class-wise Weighted Feature Norm for Detecting Out-of-distribution Samples
    Yu, Yeonguk
    Shin, Sungho
    Lee, Kyoobin
    2023 20TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS, UR, 2023, : 974 - 979