Efficient nonconvex sparse group feature selection via continuous and discrete optimization

Cited by: 18
Authors
Xiang, Shuo [1 ,2 ]
Shen, Xiaotong [3 ]
Ye, Jieping [1 ,2 ]
Affiliations
[1] Arizona State Univ, Ctr Evolutionary Med & Informat, Tempe, AZ 85287 USA
[2] Arizona State Univ, Comp Sci & Engn, Tempe, AZ 85287 USA
[3] Univ Minnesota, Sch Stat, Minneapolis, MN 55347 USA
Funding
US National Science Foundation
Keywords
Nonconvex optimization; Error bound; Discrete optimization; Application; EEG data analysis; MULTISTAGE CONVEX RELAXATION; MODEL SELECTION; LIKELIHOOD; REGRESSION; SHRINKAGE; ALGORITHM; LASSO;
DOI
10.1016/j.artint.2015.02.008
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Sparse feature selection has proven to be effective in analyzing high-dimensional data. While promising, most existing works apply convex methods, which may be suboptimal in terms of the accuracy of feature selection and parameter estimation. In this paper, we consider both continuous and discrete nonconvex paradigms for sparse group feature selection, motivated by applications that require identifying the underlying group structure and performing feature selection simultaneously. The main contribution of this article is twofold: (1) computationally, we develop efficient optimization algorithms for both the continuous and discrete formulations, in which the key step is a projection with two coupled constraints; (2) statistically, we show that the proposed continuous model reconstructs the oracle estimator, so that consistent feature selection and parameter estimation are achieved simultaneously. Numerical results on synthetic and real-world data suggest that the proposed nonconvex methods compare favorably against their competitors, thus achieving the desired goal of delivering high performance. (C) 2015 Elsevier B.V. All rights reserved.
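The "projection with two coupled constraints" named in the abstract projects a vector onto the set of points with at most a given number of nonzero groups and a given number of nonzero entries overall. The sketch below is a simple greedy illustration of that idea, not the paper's exact algorithm; the function name, the group encoding, and the group-scoring heuristic are all assumptions made here for illustration.

```python
import numpy as np

def project_sparse_group(v, groups, max_features, max_groups):
    """Greedy sketch of a Euclidean projection onto vectors with at most
    `max_groups` nonzero groups and `max_features` nonzero entries overall.
    `groups` is a list of index arrays partitioning the coordinates of `v`.
    (Illustrative heuristic only; the paper derives an exact procedure.)"""
    # Score each group by the energy of its largest entries, since at most
    # `max_features` entries can survive in total.
    scores = []
    for g in groups:
        mags = np.sort(np.abs(v[g]))[::-1]          # group magnitudes, descending
        k = min(len(g), max_features)
        scores.append(np.sum(mags[:k] ** 2))
    keep_groups = np.argsort(scores)[::-1][:max_groups]
    # Among the coordinates of the kept groups, retain the largest entries.
    idx = np.concatenate([groups[i] for i in keep_groups])
    keep = idx[np.argsort(np.abs(v[idx]))[::-1][:max_features]]
    out = np.zeros_like(v)
    out[keep] = v[keep]
    return out

# Example: two groups of three coordinates; keep one group, two features.
v = np.array([3.0, 1.0, 0.5, 2.0, 0.1, 4.0])
groups = [np.arange(0, 3), np.arange(3, 6)]
p = project_sparse_group(v, groups, max_features=2, max_groups=1)
```

Because the group-count and feature-count constraints interact (dropping a group changes which features are worth keeping), this greedy pass is only an approximation; the coupling is exactly what makes the projection the technically delicate step.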
Pages: 28-50
Page count: 23
Related papers
50 records total
  • [31] Tagging Chinese Microblogger via Sparse Feature Selection
    Shang, Di
    Dai, Xin-Yu
    Huang, Shujian
    Li, Yi
    Chen, Jiajun
    2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 2460 - 2467
  • [32] Robust tracking via discriminative sparse feature selection
    Jin Zhan
    Zhuo Su
    Hefeng Wu
    Xiaonan Luo
    The Visual Computer, 2015, 31 : 575 - 588
  • [33] Robust feature selection via nonconvex sparsity-based methods
    Nguyen Thai An
    Pham Dinh Dong
    Qin, Xiaolong
    Journal of Nonlinear and Variational Analysis, 2021, 5 (01): : 59 - 77
  • [35] Efficient Feature Selection via l2,0-norm Constrained Sparse Regression
    Pang, Tianji
    Nie, Feiping
    Han, Junwei
    Li, Xuelong
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2019, 31 (05) : 880 - 893
  • [36] Group variable selection via group sparse neural network
    Zhang, Xin
    Zhao, Junlong
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2024, 192
  • [37] Learning Collaborative Sparsity Structure via Nonconvex Optimization for Feature Recognition
    Du, Zhaohui
    Chen, Xuefeng
    Zhang, Han
    Yan, Ruqiang
    Yin, Wotao
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2018, 14 (10) : 4417 - 4430
  • [38] Feature cluster on adaptation of discrete metaheuristics to continuous optimization
    Michalewicz, Zbigniew
    Siarry, Patrick
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2008, 185 (03) : 1060 - 1061
  • [39] Multi-target Tracking with Sparse Group Features and Position Using Discrete-Continuous Optimization
    Peralta, Billy
    Soto, Alvaro
    COMPUTER VISION - ACCV 2014 WORKSHOPS, PT III, 2015, 9010 : 680 - 694
  • [40] Structured Sparsity of Convolutional Neural Networks via Nonconvex Sparse Group Regularization
    Bui, Kevin
    Park, Fredrick
    Zhang, Shuai
    Qi, Yingyong
    Xin, Jack
    FRONTIERS IN APPLIED MATHEMATICS AND STATISTICS, 2021, 6