Flexible and Comprehensive Framework of Element Selection Based on Nonconvex Sparse Optimization

Cited by: 1
Authors
Kawamura, Taiga [1 ]
Ueno, Natsuki [1 ]
Ono, Nobutaka [1 ]
Affiliations
[1] Tokyo Metropolitan Univ, Grad Sch Syst Design, Tokyo 1910065, Japan
Funding
Japan Science and Technology Agency (JST);
Keywords
Optimization; Relaxation methods; Minimization; Signal processing; Dimensionality reduction; Sparse matrices; Indexes; element selection; sparse optimization; proximal operator; Douglas-Rachford splitting method; REGULARIZATION; ALGORITHMS;
DOI
10.1109/ACCESS.2024.3361941
CLC classification
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
We propose an element selection method for high-dimensional data that is applicable to a wide range of optimization criteria in a unified manner. Element selection is a fundamental technique for reducing the dimensionality of high-dimensional data through simple operations that do not involve scalar multiplication. Restorability is one of the most commonly used criteria in element selection, and the element selection problem based on restorability is formulated as the minimization of a loss function representing the restoration error between the original data and the restored data. However, conventional methods are applicable only to a limited class of loss functions, such as the ℓ2 norm loss. To enable the use of a wide variety of criteria, we reformulate the element selection problem as a nonconvex sparse optimization problem and derive an optimization algorithm based on the Douglas-Rachford splitting method. The proposed algorithm is applicable to any loss function as long as its proximal operator is available, e.g., the ℓ1 norm loss and the ℓ∞ norm loss as well as the ℓ2 norm loss. We conducted numerical experiments using artificial and real data, and the results indicate that the above loss functions are successfully minimized by the proposed algorithm.
Pages: 21337 - 21346
Number of pages: 10
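
The abstract relies on the Douglas-Rachford splitting method, which needs only the proximal operators of the two terms being minimized. The following Python sketch illustrates that generic iteration on a toy convex problem, 0.5*||x - b||_2^2 + lam*||x||_1, whose exact solution is soft thresholding of b. This is an illustrative assumption for exposition only, not the paper's element-selection algorithm; all function names and parameters are hypothetical.

import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1: elementwise soft thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_quad(v, t, b):
    # Proximal operator of t * 0.5 * ||. - b||_2^2 (closed form).
    return (v + t * b) / (1.0 + t)

def douglas_rachford(b, lam, gamma=1.0, n_iter=200):
    # Generic Douglas-Rachford splitting for f(x) + g(x):
    #   x_k     = prox_{gamma f}(z_k)
    #   y_k     = prox_{gamma g}(2 x_k - z_k)
    #   z_{k+1} = z_k + y_k - x_k
    # Here f is the quadratic fidelity term and g = lam * ||.||_1.
    z = np.zeros_like(b)
    for _ in range(n_iter):
        x = prox_quad(z, gamma, b)
        y = prox_l1(2.0 * x - z, gamma * lam)
        z = z + y - x
    return prox_quad(z, gamma, b)

if __name__ == "__main__":
    b = np.array([3.0, -0.2, 0.5, -2.0])
    print(douglas_rachford(b, lam=1.0))  # DR iterate
    print(prox_l1(b, 1.0))               # closed-form reference solution

Swapping prox_l1 or prox_quad for the proximal operator of another loss (e.g., an ℓ∞ norm loss) is what makes the splitting approach flexible, which is the property the abstract highlights: any loss can be handled as long as its proximal operator is available.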