Variables selection using L0 penalty

Cited by: 0
Authors
Zhang, Tonglin [1 ]
Affiliations
[1] Purdue Univ, Dept Stat, 150 North Univ St, W Lafayette, IN 47907 USA
Keywords
Consistency; Generalized information criterion; Generalized linear models; High-dimensional data; Model size; Penalized maximum likelihood; CENTRAL LIMIT THEOREMS; TUNING PARAMETER SELECTION; REGRESSION; REGULARIZATION; SUBSET; MODELS; LASSO
DOI
10.1016/j.csda.2023.107860
Chinese Library Classification (CLC)
TP39 [Applications of Computers]
Subject Classification Codes
081203; 0835
Abstract
The determination of a tuning parameter by the generalized information criterion (GIC) is an important issue in variable selection. This article shows that the GIC and the L0 penalized objective functions are equivalent, leading to a new L0 penalized maximum likelihood method for high-dimensional generalized linear models. Based on techniques for a well-known discrete optimization problem in theoretical computer science, a two-step algorithm for local solutions is proposed. The first step optimizes the L0 penalized objective function under a given model size, where only a maximum likelihood algorithm is needed. The second step optimizes the L0 penalized objective function over a candidate set of model sizes, where only the GIC is needed. Because the tuning parameter can be fixed, its selection can be ignored in the proposed method. The theoretical study shows that the algorithm runs in polynomial time and that any resulting local solution is consistent, so the global solution is not needed in practice. Numerical studies show that the proposed method generally outperforms its competitors.
Pages: 18
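The abstract only outlines the two-step procedure, so the following is a minimal illustrative sketch of that idea, not the author's exact algorithm. It assumes a Gaussian linear model (so the maximum likelihood fit is ordinary least squares), uses a greedy forward search as a stand-in for the paper's local-solution step under a fixed model size, and adopts a BIC-like GIC penalty a_n = log(n); the function names `best_subset_of_size` and `two_step_l0` are illustrative, not from the paper.

```python
import numpy as np


def neg_loglik(X, y, support):
    """Gaussian negative log-likelihood (up to constants) of the OLS fit on `support`."""
    n = len(y)
    if len(support) == 0:
        rss = np.sum((y - y.mean()) ** 2)
    else:
        Xs = X[:, support]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = np.sum((y - Xs @ beta) ** 2)
    return 0.5 * n * np.log(rss / n)


def best_subset_of_size(X, y, k):
    """Step 1 (greedy stand-in): grow a support of size k by forward selection,
    refitting the maximum likelihood estimate at each step."""
    p = X.shape[1]
    support = []
    for _ in range(k):
        remaining = [j for j in range(p) if j not in support]
        j_best = min(remaining, key=lambda j: neg_loglik(X, y, support + [j]))
        support.append(j_best)
    return support


def two_step_l0(X, y, k_max, a_n=None):
    """Step 2: over candidate model sizes k = 0..k_max, keep the size that
    minimizes a GIC-type criterion 2 * negloglik + a_n * k."""
    n = len(y)
    a_n = np.log(n) if a_n is None else a_n  # assumed BIC-like penalty factor
    best_gic, best_support = np.inf, []
    for k in range(k_max + 1):
        support = best_subset_of_size(X, y, k)
        gic = 2.0 * neg_loglik(X, y, support) + a_n * k
        if gic < best_gic:
            best_gic, best_support = gic, support
    return best_support


if __name__ == "__main__":
    # Toy usage: sparse linear model with 3 active predictors out of 50.
    rng = np.random.default_rng(0)
    n, p = 200, 50
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[[3, 7, 11]] = [2.0, -1.5, 1.0]
    y = X @ beta + rng.standard_normal(n)
    print(sorted(two_step_l0(X, y, k_max=10)))  # expected to recover [3, 7, 11]
```

Note the division of labor matches the abstract's description: the inner routine needs only a maximum likelihood fit for a fixed model size, and the outer loop needs only the GIC comparison across sizes, so no tuning parameter has to be selected by cross-validation.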