Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization

Cited by: 6
|
Authors
Hien, Le Thi Khanh [1 ]
Nguyen, Cuong V. [2]
Xu, Huan [3 ]
Lu, Canyi [4 ]
Feng, Jiashi [1 ]
Affiliations
[1] Natl Univ Singapore, Singapore, Singapore
[2] Univ Cambridge, Cambridge, England
[3] Georgia Inst Technol, Atlanta, GA 30332 USA
[4] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
Keywords
Acceleration techniques; Mirror descent method; Inexact proximal point; Composite optimization;
DOI
10.1007/s10957-018-01469-5
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research];
Discipline Codes
070105; 12; 1201; 1202; 120202;
Abstract
We consider the problem of minimizing the sum of an average of a large number of smooth convex component functions and a general, possibly non-differentiable, convex function. Although many methods have been proposed to solve this problem under the assumption that the sum is strongly convex, few methods support the non-strongly convex case. Adding a small quadratic regularization is a common device used to tackle non-strongly convex problems; however, it may destroy the sparsity of solutions or degrade the performance of the algorithms. Avoiding this device, we propose an accelerated randomized mirror descent method for solving this problem without the strong convexity assumption. Our method extends the deterministic accelerated proximal gradient methods of Paul Tseng and can be applied even when proximal points are computed inexactly. We also propose a scheme for solving the problem when the component functions are non-smooth.
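The abstract references Tseng's deterministic accelerated proximal gradient framework as the starting point. With the Euclidean mirror map and an ℓ1 regularizer, one accelerated step reduces to a gradient step followed by soft-thresholding. Below is a minimal FISTA-style sketch of that deterministic special case, not the paper's randomized or inexact variant; all function and parameter names are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accel_prox_grad(grad_f, x0, L, lam, iters=200):
    """FISTA-style accelerated proximal gradient for
    min_x f(x) + lam * ||x||_1, where grad_f is the gradient of the
    smooth part f and L is its Lipschitz constant."""
    x_prev = x0.copy()
    y = x0.copy()          # extrapolated (momentum) point
    t = 1.0
    for _ in range(iters):
        # gradient step at y, then prox of the nonsmooth term
        x = soft_threshold(y - grad_f(y) / L, lam / L)
        # Nesterov momentum update
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        x_prev, t = x, t_next
    return x_prev
```

For example, with f(x) = 0.5 * ||x - b||^2 (so grad_f(y) = y - b, L = 1), the minimizer is exactly soft_threshold(b, lam), which the iteration recovers.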
Pages: 541-566
Page count: 26
Related Papers
50 records in total
  • [31] Convergence Results of a Nested Decentralized Gradient Method for Non-strongly Convex Problems
    Choi, Woocheol
    Kim, Doheon
    Yun, Seok-Bae
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2022, 195 (01) : 172 - 204
  • [33] Accelerated Distributed Nesterov Gradient Descent for Smooth and Strongly Convex Functions
    Qu, Guannan
    Li, Na
    2016 54TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2016, : 209 - 216
  • [34] An accelerated randomized Bregman-Kaczmarz method for strongly convex linearly constraint optimization
    Tondji, Lionel
    Lorenz, Dirk A.
    Necoara, Ion
    2023 EUROPEAN CONTROL CONFERENCE, ECC, 2023,
  • [35] Distributed Mirror Descent for Online Composite Optimization
    Yuan, Deming
    Hong, Yiguang
    Ho, Daniel W. C.
    Xu, Shengyuan
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2021, 66 (02) : 714 - 729
  • [36] Variance amplification of accelerated first-order algorithms for strongly convex quadratic optimization problems
    Mohammadi, Hesameddin
    Razaviyayn, Meisam
    Jovanovic, Mihailo R.
    2018 IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2018, : 5753 - 5758
  • [37] Deterministic Coordinate Descent Algorithms for Smooth Convex Optimization
    Wu, Xuyang
    Lu, Jie
    2017 IEEE 56TH ANNUAL CONFERENCE ON DECISION AND CONTROL (CDC), 2017,
  • [38] A Weighted Mirror Descent Algorithm for Nonsmooth Convex Optimization Problem
    Luong, Duy V. N.
    Parpas, Panos
    Rueckert, Daniel
    Rustem, Berc
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2016, 170 (03) : 900 - 915
  • [39] Mirror descent and nonlinear projected subgradient methods for convex optimization
    Beck, A
    Teboulle, M
    OPERATIONS RESEARCH LETTERS, 2003, 31 (03) : 167 - 175
  • [40] Online convex optimization using coordinate descent algorithms
    Lin, Yankai
    Shames, Iman
    Nesic, Dragan
    AUTOMATICA, 2024, 165