Coordinate Descent for SLOPE

Times cited: 0
Authors
Larsson, Johan [1 ]
Klopfenstein, Quentin [2 ]
Massias, Mathurin [3 ]
Wallin, Jonas [1 ]
Affiliations
[1] Lund Univ, Dept Stat, Lund, Sweden
[2] Univ Luxembourg, Luxembourg Ctr Syst Biomed, Luxembourg, Luxembourg
[3] UCB Lyon 1, INRIA, CNRS, ENS Lyon,Univ Lyon,LIP UMR 5668, F-69342 Lyon, France
Keywords
VARIABLE SELECTION; REGRESSION; ALGORITHM;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
The lasso is the most famous sparse regression and feature selection method. One reason for its popularity is the speed at which the underlying optimization problem can be solved. Sorted L-One Penalized Estimation (SLOPE) is a generalization of the lasso with appealing statistical properties. Despite this, the method has not yet seen widespread adoption. A major reason for this is that current software packages that fit SLOPE rely on algorithms that perform poorly in high dimensions. To tackle this issue, we propose a new fast algorithm to solve the SLOPE optimization problem, which combines proximal gradient descent and proximal coordinate descent steps. We provide new results on the directional derivative of the SLOPE penalty and its related SLOPE thresholding operator, as well as convergence guarantees for our proposed solver. In extensive benchmarks on simulated and real data, we demonstrate our method's performance against a long list of competing algorithms.
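Both step types mentioned in the abstract build on the proximal operator of the sorted L1 (SLOPE) penalty J(b) = sum_i lambda_i |b|_(i), with lambda_1 >= ... >= lambda_p >= 0. A standard way to evaluate this prox (due to Bogdan et al.'s work on SLOPE, not taken from this paper's implementation) is: sort |y| in decreasing order, subtract the lambda sequence, project onto nonincreasing sequences with pool-adjacent-violators (PAVA), clip at zero, then undo the sort and restore signs. The sketch below, with illustrative function names, assumes only NumPy:

```python
import numpy as np

def prox_slope(y, lambdas):
    """Prox of the sorted L1 norm: argmin_x 0.5*||x - y||^2 + sum_i lambdas[i]*|x|_(i).

    `lambdas` must be nonincreasing and nonnegative, same length as `y`.
    """
    sign = np.sign(y)
    abs_y = np.abs(y)
    order = np.argsort(abs_y)[::-1]   # permutation sorting |y| in decreasing order
    z = abs_y[order] - lambdas        # sequence to project onto nonincreasing cone

    # Stack-based PAVA: maintain blocks (sum, count); merge while a later
    # block's average is not strictly below the previous block's average.
    sums, counts = [], []
    for v in z:
        sums.append(v)
        counts.append(1)
        while len(sums) > 1 and sums[-1] / counts[-1] >= sums[-2] / counts[-2]:
            s, c = sums.pop(), counts.pop()
            sums[-1] += s
            counts[-1] += c

    # Expand block averages, clip at zero (nonnegativity of the prox in sorted space).
    x_sorted = np.concatenate(
        [np.full(c, max(s / c, 0.0)) for s, c in zip(sums, counts)]
    )
    out = np.empty_like(y, dtype=float)
    out[order] = x_sorted             # undo the sort
    return sign * out                 # restore original signs
```

With all lambdas equal, this reduces to ordinary soft-thresholding (the lasso prox); with a decreasing lambda sequence, the block merges are what cluster coefficients to a common magnitude, which is the behavior that distinguishes SLOPE from the lasso.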
Pages: 20