Improved Pathwise Coordinate Descent for Power Penalties

Cited by: 0
Authors
Griffin, Maryclare [1]
Affiliations
[1] Univ Massachusetts Amherst, Dept Math & Stat, Amherst, MA 01003 USA
Keywords
Coordinate descent; LASSO; Nonconvex optimization; Regularization surface; Sparse regression; REGRESSION; SELECTION; BRIDGE;
DOI
10.1080/10618600.2023.2256807
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline codes
020208 ; 070103 ; 0714 ;
Abstract
Pathwise coordinate descent algorithms have been used to compute entire solution paths for lasso and other penalized regression problems quickly with great success. They improve upon cold start algorithms by solving the problems that make up the solution path sequentially for an ordered set of tuning parameter values, instead of solving each problem separately. However, extending pathwise coordinate descent algorithms to the more general bridge or power family of l_q penalties is challenging. Faster algorithms for computing solution paths for these penalties are needed because l_q penalized regression problems can be nonconvex and especially burdensome to solve. In this paper, we show that a reparameterization of l_q penalized regression problems is more amenable to pathwise coordinate descent algorithms. This allows us to improve computation of the mode-thresholding function for l_q penalized regression problems in practice and to introduce two separate pathwise algorithms. We show that either pathwise algorithm is faster than the corresponding cold start alternative, and demonstrate that different pathwise algorithms may be more likely to reach better solutions. Supplemental materials for this article are available online.
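The warm-start idea described in the abstract can be illustrated with a minimal sketch for the lasso (q = 1) case, where the coordinate update is the soft-thresholding operator: each problem along a decreasing grid of tuning parameters is initialized at the previous solution rather than at zero. This is a generic illustration of pathwise coordinate descent, not the paper's algorithm for general l_q penalties; the function names and fixed iteration count below are hypothetical choices for the sketch.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: the coordinate-wise lasso update.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_path_cd(X, y, lambdas, n_iter=200):
    """Warm-started (pathwise) coordinate descent for the lasso.

    Approximately solves min_b (1/(2n)) * ||y - X b||^2 + lam * ||b||_1
    for each lam in `lambdas` (assumed sorted in decreasing order),
    reusing the previous solution as the starting point for the next
    problem instead of restarting from zero (a cold start).
    """
    n, p = X.shape
    col_sq = (X ** 2).sum(axis=0) / n     # per-coordinate curvature
    b = np.zeros(p)                       # warm start carried along the path
    path = []
    for lam in lambdas:
        for _ in range(n_iter):
            for j in range(p):
                # Partial residual excluding coordinate j.
                r_j = y - X @ b + X[:, j] * b[j]
                rho = X[:, j] @ r_j / n
                b[j] = soft_threshold(rho, lam) / col_sq[j]
        path.append(b.copy())
    return np.array(path)
```

For lam at or above lambda_max = max |X^T y| / n, the solution is exactly zero, so starting the grid there and decreasing lam lets each warm start stay close to the next solution, which is the source of the speedup over cold starts.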
Pages: 310-315 (6 pages)