Improved Pathwise Coordinate Descent for Power Penalties

Citations: 0
Authors
Griffin, Maryclare [1]
Affiliation
[1] Univ Massachusetts Amherst, Dept Math & Stat, Amherst, MA 01003 USA
Keywords
Coordinate descent; LASSO; Nonconvex optimization; Regularization surface; Sparse regression; Regression; Selection; Bridge
DOI
10.1080/10618600.2023.2256807
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
Pathwise coordinate descent algorithms have been used to compute entire solution paths for lasso and other penalized regression problems quickly with great success. They improve upon cold start algorithms by solving the problems that make up the solution path sequentially for an ordered set of tuning parameter values, instead of solving each problem separately. However, extending pathwise coordinate descent algorithms to the more general bridge or power family of l(q) penalties is challenging. Faster algorithms for computing solution paths for these penalties are needed because l(q) penalized regression problems can be nonconvex and especially burdensome to solve. In this paper, we show that a reparameterization of l(q) penalized regression problems is more amenable to pathwise coordinate descent algorithms. This allows us to improve computation of the mode-thresholding function for l(q) penalized regression problems in practice and introduce two separate pathwise algorithms. We show that either pathwise algorithm is faster than the corresponding cold start alternative, and demonstrate that different pathwise algorithms may be more likely to reach better solutions. Supplemental materials for this article are available online.
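To illustrate the pathwise (warm start) strategy the abstract contrasts with cold starts, the following is a minimal sketch for the lasso case, where the coordinate-wise minimizer is the closed-form soft-thresholding operator. This is a generic textbook illustration, not the paper's l(q) algorithm: the function names (`soft_threshold`, `lasso_path`) and the simulated data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: coordinate-wise minimizer for the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_path(X, y, lambdas, n_sweeps=200):
    """Pathwise coordinate descent for the lasso.

    Approximately solves min_b 0.5*||y - X b||^2 / n + lam * ||b||_1
    for each lam in `lambdas` (ordered largest to smallest), warm-starting
    each problem at the previous solution instead of at zero (a cold start).
    """
    n, p = X.shape
    col_sq = (X ** 2).sum(axis=0) / n
    beta = np.zeros(p)                 # warm start carried along the path
    path = []
    for lam in lambdas:
        for _ in range(n_sweeps):      # full cyclic sweeps over coordinates
            for j in range(p):
                r_j = y - X @ beta + X[:, j] * beta[j]   # partial residual
                z = X[:, j] @ r_j / n
                beta[j] = soft_threshold(z, lam) / col_sq[j]
        path.append(beta.copy())
    return np.array(path)

# Example: a 20-point solution path on simulated sparse data
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
beta_true = np.array([2.0, -1.0, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
lam_max = np.max(np.abs(X.T @ y)) / 100   # smallest lam with all-zero solution
lams = lam_max * np.geomspace(1.0, 1e-3, 20)
path = lasso_path(X, y, lams)
```

For the nonconvex l(q) penalties studied in the paper, the thresholding step has no such closed form, which is why the abstract emphasizes improved computation of the mode-thresholding function; the warm-start structure above carries over, but the coordinate update is more expensive and the reached solution can depend on the starting point.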
Pages: 310-315
Page count: 6