Flexible and Comprehensive Framework of Element Selection Based on Nonconvex Sparse Optimization

Cited by: 1
|
Authors
Kawamura, Taiga [1 ]
Ueno, Natsuki [1 ]
Ono, Nobutaka [1 ]
Institutions
[1] Tokyo Metropolitan Univ, Grad Sch Syst Design, Tokyo 1910065, Japan
Funding
Japan Science and Technology Agency (JST);
Keywords
Optimization; Relaxation methods; Minimization; Signal processing; Dimensionality reduction; Sparse matrices; Indexes; element selection; sparse optimization; proximal operator; Douglas-Rachford splitting method; REGULARIZATION; ALGORITHMS;
DOI
10.1109/ACCESS.2024.3361941
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Number
0812;
Abstract
We propose an element selection method for high-dimensional data that is applicable to a wide range of optimization criteria in a unifying manner. Element selection is a fundamental technique for reducing the dimensionality of high-dimensional data by simple operations, without the use of scalar multiplication. Restorability is one of the most commonly used criteria in element selection, and the element selection problem based on restorability is formulated as the minimization of a loss function representing the restoration error between the original data and the restored data. However, conventional methods are applicable only to a limited class of loss functions, such as the ℓ2 norm loss. To enable the use of a wide variety of criteria, we reformulate the element selection problem as a nonconvex sparse optimization problem and derive an optimization algorithm based on the Douglas-Rachford splitting method. The proposed algorithm is applicable to any loss function as long as its proximal operator is available, e.g., the ℓ1 norm loss and the ℓ∞ norm loss as well as the ℓ2 norm loss. We conducted numerical experiments using artificial and real data, and the results indicate that these loss functions are successfully minimized by the proposed algorithm.
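The abstract names the Douglas-Rachford splitting method, which minimizes a sum f(x) + g(x) using only the proximal operators of f and g. The paper's actual algorithm targets a nonconvex element-selection formulation; the sketch below is not that algorithm, only a minimal generic illustration of the splitting iteration on a simple convex problem (quadratic loss plus ℓ1 penalty), where both proximal operators have closed forms. All function and variable names here are illustrative assumptions, not from the paper.

```python
import numpy as np

def douglas_rachford(prox_f, prox_g, z0, gamma=1.0, n_iter=200):
    """Generic Douglas-Rachford splitting for min_x f(x) + g(x).
    Iterates: x = prox_{gamma f}(z); z <- z + prox_{gamma g}(2x - z) - x.
    Only the proximal operators of f and g are required."""
    z = np.asarray(z0, dtype=float).copy()
    for _ in range(n_iter):
        x = prox_f(z, gamma)
        z = z + prox_g(2.0 * x - z, gamma) - x
    return prox_f(z, gamma)

# Toy problem (illustrative): min_x 0.5*||x - b||_2^2 + lam*||x||_1
b = np.array([3.0, -0.5, 1.0])
lam = 1.0
# prox of gamma * 0.5*||x - b||^2 has a closed form
prox_f = lambda v, g: (v + g * b) / (1.0 + g)
# prox of gamma * lam * ||x||_1 is soft-thresholding
prox_g = lambda v, g: np.sign(v) * np.maximum(np.abs(v) - g * lam, 0.0)

x = douglas_rachford(prox_f, prox_g, np.zeros(3))
# the closed-form minimizer is the soft-threshold of b: [2.0, 0.0, 0.0]
```

Swapping in a different loss only requires replacing `prox_f`; this plug-in property is what the abstract means by "applicable to any loss function as long as its proximal operator is available."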
Pages: 21337 - 21346
Page count: 10