Efficient nonconvex sparse group feature selection via continuous and discrete optimization

Cited by: 18
Authors
Xiang, Shuo [1 ,2 ]
Shen, Xiaotong [3 ]
Ye, Jieping [1 ,2 ]
Affiliations
[1] Arizona State Univ, Ctr Evolutionary Med & Informat, Tempe, AZ 85287 USA
[2] Arizona State Univ, Comp Sci & Engn, Tempe, AZ 85287 USA
[3] Univ Minnesota, Sch Stat, Minneapolis, MN 55347 USA
Funding
U.S. National Science Foundation;
Keywords
Nonconvex optimization; Error bound; Discrete optimization; Application; EEG data analysis; MULTISTAGE CONVEX RELAXATION; MODEL SELECTION; LIKELIHOOD; REGRESSION; SHRINKAGE; ALGORITHM; LASSO;
DOI
10.1016/j.artint.2015.02.008
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Sparse feature selection has proven to be effective in analyzing high-dimensional data. While promising, most existing works apply convex methods, which may be suboptimal in terms of the accuracy of feature selection and parameter estimation. In this paper, we consider both continuous and discrete nonconvex paradigms for sparse group feature selection, motivated by applications that require identifying the underlying group structure and performing feature selection simultaneously. The main contribution of this article is twofold: (1) computationally, we develop efficient optimization algorithms for both continuous and discrete formulations, of which the key step is a projection with two coupled constraints; (2) statistically, we show that the proposed continuous model reconstructs the oracle estimator. Therefore, consistent feature selection and parameter estimation are achieved simultaneously. Numerical results on synthetic and real-world data suggest that the proposed nonconvex methods compare favorably against their competitors, thus achieving the desired goal of delivering high performance. (C) 2015 Elsevier B.V. All rights reserved.
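The key computational step mentioned in the abstract is a projection under two coupled sparsity constraints: a budget on the number of nonzero features and a budget on the number of nonzero groups. The paper's exact projection is not reproduced here; the following is an illustrative greedy sketch of the constraint set only (select groups by energy, then entries by magnitude), with the function name and two-stage heuristic being this sketch's own assumptions, not the authors' algorithm.

```python
import numpy as np

def project_sparse_group(x, groups, s_feat, s_group):
    """Heuristic projection of x onto {v : at most s_group nonzero groups
    and at most s_feat nonzero entries}. `groups` assigns a group id to
    each coordinate of x. This greedy two-stage sketch is illustrative;
    it is not the exact coupled projection developed in the paper."""
    x = np.asarray(x, dtype=float)
    groups = np.asarray(groups)
    # Stage 1: keep the s_group groups with the largest squared norm.
    group_ids = np.unique(groups)
    energy = {g: np.sum(x[groups == g] ** 2) for g in group_ids}
    kept = sorted(group_ids, key=lambda g: -energy[g])[:s_group]
    v = np.where(np.isin(groups, kept), x, 0.0)
    # Stage 2: within the kept groups, retain the s_feat largest entries.
    if np.count_nonzero(v) > s_feat:
        order = np.argsort(-np.abs(v))
        v[order[s_feat:]] = 0.0
    return v
```

For example, with six features in three groups, a group budget of 2 and a feature budget of 3, the sketch first drops the lowest-energy group and then zeroes the smallest surviving entry.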
Pages: 28-50
Number of pages: 23
Related papers
50 records in total
  • [21] Flexible and Comprehensive Framework of Element Selection Based on Nonconvex Sparse Optimization
    Kawamura, Taiga
    Ueno, Natsuki
    Ono, Nobutaka
    IEEE ACCESS, 2024, 12 : 21337 - 21346
  • [23] Feature Selection With Group-Sparse Stochastic Gates
    Park, Hyeryn
    Lee, Changhee
    IEEE ACCESS, 2024, 12 : 102299 - 102312
  • [24] Sparse group LASSO based uncertain feature selection
    Xie, Zongxia
    Xu, Yong
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2014, 5 (02) : 201 - 210
  • [26] Nonconvex and Nonsmooth Sparse Optimization via Adaptively Iterative Reweighted Methods
    Wang, Hao
    Zhang, Fan
    Shi, Yuanming
    Hu, Yaohua
    JOURNAL OF GLOBAL OPTIMIZATION, 2021, 81 (03) : 717 - 748
  • [27] Feature selection with interactions for continuous attributes and discrete class
    Mejia-Lavalle, Manuel
    Rodriguez, Guillermo
    CERMA 2007: ELECTRONICS, ROBOTICS AND AUTOMOTIVE MECHANICS CONFERENCE, PROCEEDINGS, 2007, : 318 - +
  • [28] Sparse Neural Additive Model: Interpretable Deep Learning with Feature Selection via Group Sparsity
    Xu, Shiyun
    Bu, Zhiqi
    Chaudhari, Pratik
    Barnett, Ian J.
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT III, 2023, 14171 : 343 - 359
  • [29] Feature selection via genetic optimization
    Salcedo-Sanz, S
    Prado-Cumplido, M
    Pérez-Cruz, F
    Bousoño-Calzón, C
    ARTIFICIAL NEURAL NETWORKS - ICANN 2002, 2002, 2415 : 547 - 552
  • [30] Robust tracking via discriminative sparse feature selection
    Zhan, Jin
    Su, Zhuo
    Wu, Hefeng
    Luo, Xiaonan
    VISUAL COMPUTER, 2015, 31 (05): : 575 - 588