Fastest rates for stochastic mirror descent methods

Cited by: 10
Authors
Hanzely, Filip [1 ]
Richtarik, Peter [1 ,2 ]
Affiliations
[1] King Abdullah Univ Sci & Technol KAUST, Thuwal, Saudi Arabia
[2] Moscow Inst Phys & Technol MIPT, Dolgoprudnyi, Russia
Keywords
Gradient descent; Relative smoothness; Coordinate descent; Stochastic gradient descent
DOI
10.1007/s10589-021-00284-5
Chinese Library Classification: C93 (Management); O22 (Operations research)
Discipline codes: 070105; 12; 1201; 1202; 120202
Abstract
Relative smoothness, a notion introduced in Birnbaum et al. (Proceedings of the 12th ACM Conference on Electronic Commerce, ACM, pp 127-136, 2011) and recently rediscovered in Bauschke et al. (Math Oper Res 330-348, 2016) and Lu et al. (Relatively-smooth convex optimization by first-order methods, and applications, 2016), generalizes the standard notion of smoothness typically used in the analysis of gradient-type methods. In this work we take ideas from the well-studied field of stochastic convex optimization and use them to obtain faster algorithms for minimizing relatively smooth functions. We propose and analyze two new algorithms: Relative Randomized Coordinate Descent (relRCD) and Relative Stochastic Gradient Descent (relSGD), both generalizing well-known algorithms from the standard smooth setting. The proposed methods can in fact be seen as particular instances of stochastic mirror descent algorithms, which have usually been analyzed under stronger assumptions: Lipschitz continuity of the objective and strong convexity of the reference function. As a consequence, one of the proposed methods, relRCD, is the first stochastic variant of mirror descent with a linear convergence rate.
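To make the mirror descent step mentioned in the abstract concrete, below is a minimal illustrative sketch (not the authors' relRCD or relSGD) of mirror descent with the negative-entropy mirror map on the probability simplex. Under this choice of reference function, the Bregman proximal step x_{k+1} = argmin_x { <grad f(x_k), x> + (1/eta) D_h(x, x_k) } has a closed-form multiplicative update. The objective, step size, and iteration count here are illustrative assumptions.

```python
import numpy as np

def mirror_descent_entropy(grad_f, x0, step, n_iters):
    """Mirror descent on the probability simplex with the entropy mirror map.

    Each iteration solves the Bregman proximal subproblem
        x_{k+1} = argmin_x <grad_f(x_k), x> + (1/step) * KL(x, x_k),
    whose closed-form solution is a multiplicative-weights update
    followed by renormalization onto the simplex.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x * np.exp(-step * grad_f(x))  # multiplicative-weights update
        x /= x.sum()                       # project back onto the simplex
    return x

# Illustrative use: minimize the linear function f(x) = <c, x> over the
# simplex; the minimizer puts all mass on the smallest coordinate of c.
c = np.array([3.0, 1.0, 2.0])
x = mirror_descent_entropy(lambda x: c, np.ones(3) / 3, step=0.5, n_iters=200)
```

With the entropy mirror map, the iterates stay strictly inside the simplex without any explicit projection step, which is the usual motivation for choosing a mirror map adapted to the constraint geometry.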
Pages: 717-766 (50 pages)
Source: Computational Optimization and Applications, 2021, 79: 717-766