Efficient nonconvex sparse group feature selection via continuous and discrete optimization

Cited by: 18
Authors
Xiang, Shuo [1 ,2 ]
Shen, Xiaotong [3 ]
Ye, Jieping [1 ,2 ]
Affiliations
[1] Arizona State Univ, Ctr Evolutionary Med & Informat, Tempe, AZ 85287 USA
[2] Arizona State Univ, Comp Sci & Engn, Tempe, AZ 85287 USA
[3] Univ Minnesota, Sch Stat, Minneapolis, MN 55347 USA
Funding
U.S. National Science Foundation;
Keywords
Nonconvex optimization; Error bound; Discrete optimization; Application; EEG data analysis; MULTISTAGE CONVEX RELAXATION; MODEL SELECTION; LIKELIHOOD; REGRESSION; SHRINKAGE; ALGORITHM; LASSO;
DOI
10.1016/j.artint.2015.02.008
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Sparse feature selection has proven effective in analyzing high-dimensional data. While promising, most existing works apply convex methods, which may be suboptimal in terms of the accuracy of feature selection and parameter estimation. In this paper, we consider both continuous and discrete nonconvex paradigms for sparse group feature selection, motivated by applications that require identifying the underlying group structure and performing feature selection simultaneously. The main contribution of this article is twofold: (1) computationally, we develop efficient optimization algorithms for both the continuous and discrete formulations, in which the key step is a projection with two coupled constraints; (2) statistically, we show that the proposed continuous model reconstructs the oracle estimator, so that consistent feature selection and parameter estimation are achieved simultaneously. Numerical results on synthetic and real-world data suggest that the proposed nonconvex methods compare favorably against their competitors, achieving the desired goal of delivering high performance. (C) 2015 Elsevier B.V. All rights reserved.
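The key computational step named in the abstract is a projection with two coupled sparsity constraints: a feature-level budget (at most s1 nonzero entries) and a group-level budget (at most s2 nonzero groups). The sketch below is a greedy illustration of such a coupled projection, not the paper's exact algorithm; the function name, greedy group-scoring rule, and inputs are assumptions for illustration only.

```python
import numpy as np

def coupled_projection(v, groups, s1, s2):
    """Greedy sketch of projecting v onto the set
    {x : ||x||_0 <= s1 and at most s2 groups of x are nonzero}.

    v      : 1-D array to project.
    groups : list of index arrays partitioning range(len(v)).
    """
    v = np.asarray(v, dtype=float)
    # Score each group by the energy of its largest entries
    # (at most s1 entries per group can survive the projection).
    scores = []
    for g in groups:
        mags = np.sort(np.abs(v[g]))[::-1][:s1]
        scores.append(np.sum(mags ** 2))
    # Keep the s2 highest-scoring groups.
    keep_groups = np.argsort(scores)[::-1][:s2]
    allowed = np.concatenate([np.asarray(groups[i]) for i in keep_groups])
    # Within the kept groups, retain the s1 largest-magnitude entries.
    order = allowed[np.argsort(np.abs(v[allowed]))[::-1][:s1]]
    x = np.zeros_like(v)
    x[order] = v[order]
    return x
```

This greedy two-stage selection (groups first, then entries) illustrates why the two constraints are coupled: which entries can be kept depends on which groups are kept, so the two budgets cannot be enforced independently.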
Pages: 28-50
Page count: 23
Related papers
50 records in total
  • [1] Nonconvex Regularizations for Feature Selection in Ranking With Sparse SVM
    Laporte, Lea
    Flamary, Remi
    Canu, Stephane
    Dejean, Sebastien
    Mothe, Josiane
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2014, 25 (06) : 1118 - 1130
  • [2] Sparse optimization for nonconvex group penalized estimation
    Lee, Sangin
    Oh, Miae
    Kim, Yongdai
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2016, 86 (03) : 597 - 610
  • [3] Stable sparse approximations via nonconvex optimization
    Saab, Rayan
    Chartrand, Rick
    Yilmaz, Özgür
    2008 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, VOLS 1-12, 2008, : 3885 - +
  • [4] Group Feature Selection Via Structural Sparse Logistic Regression for IDS
    Shah, Reehan Ali
    Qian, Yuntao
    Mahdi, Ghulam
    PROCEEDINGS OF 2016 IEEE 18TH INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING AND COMMUNICATIONS; IEEE 14TH INTERNATIONAL CONFERENCE ON SMART CITY; IEEE 2ND INTERNATIONAL CONFERENCE ON DATA SCIENCE AND SYSTEMS (HPCC/SMARTCITY/DSS), 2016, : 594 - 600
  • [5] Learning Markov Blankets for Continuous or Discrete Networks via Feature Selection
    Deng, Houtao
    Davila, Saylisse
    Runger, George
    Tuv, Eugene
    ENSEMBLES IN MACHINE LEARNING APPLICATIONS, 2011, 373 : 117 - +
  • [7] RLT: A unified approach for discrete and continuous nonconvex optimization
    Sherali, Hanif D.
    ANNALS OF OPERATIONS RESEARCH, 2007, 149 (01) : 185 - 193
  • [8] Collinear groupwise feature selection via discrete fusion group regression
    Kim, Younghoon
    Kim, Seoung Bum
    PATTERN RECOGNITION, 2018, 83 : 1 - 13
  • [9] Group Sparse Representation Based on Feature Selection and Dictionary Optimization for Expression Recognition
    Xie, H.
    Li, M.
    Wang, Y.
    Chen, H.
    MOSHI SHIBIE YU RENGONG ZHINENG / PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2021, 34 (05) : 446 - 454
  • [10] Unsupervised feature selection via joint local learning and group sparse regression
    Wu, Yue
    Wang, Can
    Zhang, Yue-qing
    Bu, Jia-jun
    FRONTIERS OF INFORMATION TECHNOLOGY & ELECTRONIC ENGINEERING, 2019, 20 (04) : 538 - 553