Ultra-high-dimensional feature screening of binary categorical response data based on Jensen-Shannon divergence

Cited by: 0
Authors
Jiang, Qingqing [1]
Deng, Guangming [1,2]
Affiliations
[1] Guilin Univ Technol, Sch Math & Stat, Guilin 541000, Guangxi, Peoples R China
[2] Guilin Univ Technol, Appl Stat Inst, Guilin 541000, Guangxi, Peoples R China
Source
AIMS MATHEMATICS, 2024, Vol. 9, No. 2
Funding
National Natural Science Foundation of China;
Keywords
ultra-high-dimensional; binary categorical; Jensen-Shannon divergence; model-free; feature screening; SELECTION; MODELS;
DOI
10.3934/math.2024142
CLC Number
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
Currently, most ultra-high-dimensional feature screening methods for categorical data are based on the correlation between covariates and the response variable, using some statistic as the screening index to select important covariates. However, as data types become more diverse and the applicability of specific models more limited, such methods face a potential problem: a class of unimportant covariates may also appear highly correlated with the response variable simply because they are highly correlated with other covariates. To address this issue, in this paper we establish a model-free feature screening procedure for binary categorical response variables from the perspective of a feature's contribution to classification. The idea is to introduce the Jensen-Shannon divergence to measure the difference between the conditional probability distributions of a covariate when the response variable takes different values: the larger the Jensen-Shannon divergence, the stronger the covariate's contribution to classifying the response variable, and the more important the covariate is. We propose two model-free ultra-high-dimensional feature screening methods for binary response data; both are suitable for continuous or categorical covariates. When the covariates have the same number of categories, screening is based on the traditional Jensen-Shannon divergence; when the numbers of categories differ, the Jensen-Shannon divergence is adjusted by a logarithmic factor of the number of categories. We prove theoretically that the proposed methods possess the sure screening and ranking consistency properties, and we demonstrate through simulations and real data analysis that, for feature screening, the proposed approaches are more effective, more stable, and require less computing time than an existing method.
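The screening idea described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation: the function names (js_divergence, js_screening), the plug-in estimation of the conditional frequencies, and the division by the logarithm of the number of categories as a stand-in for the paper's adjustment are all assumptions made for illustration.

```python
# Minimal sketch (not the authors' code) of Jensen-Shannon-divergence-based
# screening for a binary response and categorical covariates.
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a * np.log(a / b)))  # Kullback-Leibler divergence
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def js_screening(X, y, top_d=10, adjust=True):
    """Rank covariates by the JS divergence between their empirical conditional
    distributions given y = 0 and y = 1; keep the top_d highest-ranked."""
    scores = []
    for j in range(X.shape[1]):
        levels = np.unique(X[:, j])
        p0 = np.array([(X[y == 0, j] == lv).mean() for lv in levels])
        p1 = np.array([(X[y == 1, j] == lv).mean() for lv in levels])
        score = js_divergence(p0, p1)
        if adjust and len(levels) > 1:
            # Heuristic stand-in for the paper's adjustment by the logarithmic
            # factor of the number of categories (exact form is in the paper).
            score /= np.log(len(levels))
        scores.append(score)
    order = np.argsort(scores)[::-1]
    return order[:top_d], np.asarray(scores)

# Toy usage: 200 observations, 50 three-level categorical covariates,
# with covariate 0 made informative about the binary response y.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
X = rng.integers(0, 3, size=(200, 50))
X[:, 0] = (X[:, 0] + y) % 3
selected, scores = js_screening(X, y, top_d=5)
print(selected)
```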
Pages: 2874-2907
Number of pages: 34