Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization

Cited by: 6
Authors
Hien, Le Thi Khanh [1]
Nguyen, Cuong V. [2]
Xu, Huan [3]
Lu, Canyi [4]
Feng, Jiashi [1]
Affiliations
[1] Natl Univ Singapore, Singapore, Singapore
[2] Univ Cambridge, Cambridge, England
[3] Georgia Inst Technol, Atlanta, GA 30332 USA
[4] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
Keywords
Acceleration techniques; Mirror descent method; Inexact proximal point; Composite optimization
DOI
10.1007/s10957-018-01469-5
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research]
Subject Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
We consider the problem of minimizing the sum of the average of a large number of smooth convex component functions and a general, possibly non-differentiable, convex function. Although many methods have been proposed to solve this problem under the assumption that the sum is strongly convex, few support the non-strongly convex case. Adding a small quadratic regularization is a common device for tackling non-strongly convex problems; however, it may destroy the sparsity of solutions or degrade the performance of the algorithms. Avoiding this device, we propose an accelerated randomized mirror descent method that solves the problem without the strong convexity assumption. Our method extends the deterministic accelerated proximal gradient methods of Paul Tseng and can be applied even when proximal points are computed inexactly. We also propose a scheme for solving the problem when the component functions are non-smooth.
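To make the setting concrete: the paper targets problems of the form min_x F(x) = (1/n) sum_{i=1..n} f_i(x) + g(x), where each f_i is smooth and convex and g is convex but possibly non-differentiable (an l1 penalty, say, whose sparsity the quadratic-regularization device mentioned above would disturb). The sketch below is a minimal Euclidean illustration of a Tseng-style accelerated proximal gradient loop driven by minibatch stochastic gradients; the least-squares data, minibatch size, and fixed step rule are illustrative assumptions, not the paper's actual algorithm, which works with general Bregman distances, supports inexact proximal points, and comes with its own step-size analysis.

    import numpy as np

    # Illustrative only: accelerated stochastic proximal gradient for
    #   min_x (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2 + lam * ||x||_1.
    # With n < d the smooth part is NOT strongly convex, which is the
    # regime the paper targets without adding quadratic regularization.
    rng = np.random.default_rng(0)
    n, d = 200, 500
    x_true = rng.standard_normal(d) * (rng.random(d) < 0.05)  # sparse ground truth
    A = rng.standard_normal((n, d))
    b = A @ x_true + 0.01 * rng.standard_normal(n)
    lam = 0.1

    def prox_l1(v, t):
        # Soft-thresholding: exact prox of t * lam * ||.||_1; the paper also
        # covers the case where this proximal point is computed inexactly.
        return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

    L = np.linalg.norm(A, 2) ** 2 / n    # Lipschitz constant of the averaged gradient
    x = np.zeros(d)
    z = np.zeros(d)
    for k in range(1, 2001):
        theta = 2.0 / (k + 1)            # Tseng-style extrapolation weight
        y = (1 - theta) * x + theta * z  # interpolation point
        idx = rng.choice(n, size=20)     # minibatch stochastic gradient of the average
        g_hat = A[idx].T @ (A[idx] @ y - b[idx]) / len(idx)
        # Naive 1/(theta*L) step; the stochastic analysis uses more careful choices.
        z = prox_l1(z - g_hat / (theta * L), 1.0 / (theta * L))
        x = (1 - theta) * x + theta * z

    obj = 0.5 * np.mean((A @ x - b) ** 2) + lam * np.abs(x).sum()
    print(f"objective = {obj:.4f}, nonzeros = {(x != 0).sum()}")

Under the Euclidean distance-generating function, the mirror descent proximal step reduces to the ordinary soft-thresholding prox used above; with a different Bregman distance the same template yields a genuine mirror step.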
Pages: 541-566
Page count: 26
Related Papers
50 records in total
  • [1] Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization
    Hien, Le Thi Khanh
    Nguyen, Cuong V.
    Xu, Huan
    Lu, Canyi
    Feng, Jiashi
    Journal of Optimization Theory and Applications, 2019, 181: 541-566
  • [2] Adaptive Mirror Descent Algorithms for Convex and Strongly Convex Optimization Problems with Functional Constraints
    Stonyakin, F. S.
    Alkousa, M.
    Stepanov, A. N.
    Titov, A. A.
    Journal of Applied and Industrial Mathematics, 2019, 13(3): 557-574
  • [3] Fast Distributed Coordinate Descent for Non-strongly Convex Losses
    Fercoq, Olivier
    Qu, Zheng
    Richtarik, Peter
    Takac, Martin
    2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP), 2014
  • [4] Accelerated proximal incremental algorithm schemes for non-strongly convex functions
    Panahi, Ashkan
    Chehreghani, Morteza Haghir
    Dubhashi, Devdatt
    Theoretical Computer Science, 2020, 812: 203-213
  • [5] Distributed Stochastic Optimization with Compression for Non-Strongly Convex Objectives
    Li, Xuanjie
    Xu, Yuedong
    CMES-Computer Modeling in Engineering & Sciences, 2024, 139(1): 459-481
  • [6] An aggressive reduction on the complexity of optimization for non-strongly convex objectives
    Luo, Zhijian
    Chen, Siyu
    Hou, Yueen
    Gao, Yanzeng
    Qian, Yuntao
    International Journal of Wavelets Multiresolution and Information Processing, 2023, 21(5)
  • [7] Optimal distributed stochastic mirror descent for strongly convex optimization
    Yuan, Deming
    Hong, Yiguang
    Ho, Daniel W. C.
    Jiang, Guoping
    Automatica, 2018, 90: 196-203
  • [8] Methodology and first-order algorithms for solving nonsmooth and non-strongly convex bilevel optimization problems
    Doron, Lior
    Shtern, Shimrit
    Mathematical Programming, 2023, 201(1-2): 521-558
  • [9] Linear convergence of first order methods for non-strongly convex optimization
    Necoara, I.
    Nesterov, Yu.
    Glineur, F.
    Mathematical Programming, 2019, 175: 69-107