Accelerated Bregman proximal gradient methods for relatively smooth convex optimization

Times Cited: 19
Authors
Hanzely, Filip [1 ,2 ]
Richtarik, Peter [1 ,3 ]
Xiao, Lin [4 ]
Affiliations
[1] King Abdullah Univ Sci & Technol KAUST, Div Comp Elect & Math Sci & Engn CEMSE, Thuwal, Saudi Arabia
[2] Toyota Technol Inst Chicago TTIC, Chicago, IL USA
[3] Moscow Inst Phys & Technol, Dolgoprudnyi, Russia
[4] Microsoft Res, Redmond, WA 98052 USA
Keywords
Convex optimization; Relative smoothness; Bregman divergence; Proximal gradient methods; Accelerated gradient methods; First-order methods; Minimization algorithm; Designs
DOI
10.1007/s10589-021-00273-8
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research]
Discipline codes
070105; 12; 1201; 1202; 120202
Abstract
We consider the problem of minimizing the sum of two convex functions: one is differentiable and relatively smooth with respect to a reference convex function, and the other can be nondifferentiable but simple to optimize. We investigate a triangle scaling property of the Bregman distance generated by the reference convex function and present accelerated Bregman proximal gradient (ABPG) methods that attain an O(k^(-γ)) convergence rate, where γ ∈ (0, 2] is the triangle scaling exponent (TSE) of the Bregman distance. For the Euclidean distance, we have γ = 2 and recover the convergence rate of Nesterov's accelerated gradient methods. For non-Euclidean Bregman distances, the TSE can be much smaller (say γ ≤ 1), but we show that a relaxed definition of intrinsic TSE is always equal to 2. We exploit the intrinsic TSE to develop adaptive ABPG methods that converge much faster in practice. Although theoretical guarantees on a fast convergence rate seem to be out of reach in general, our methods obtain empirical O(k^(-2)) rates in numerical experiments on several applications and provide a posteriori numerical certificates for the fast rates.
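To make the setting concrete, here is a minimal sketch of a single (non-accelerated) Bregman proximal gradient step, not the paper's ABPG method. It assumes the nonsmooth term Ψ is zero, the domain is the probability simplex, and the reference function h is the Shannon entropy, whose Bregman divergence D_h is the KL divergence; under these assumptions the step x+ = argmin_u { <∇f(x), u> + (1/t) D_h(u, x) } has a closed-form multiplicative (mirror-descent) update. All names and parameters below are illustrative.

```python
import numpy as np

def bregman_prox_grad_step(x, grad_f, step):
    """One Bregman proximal gradient step on the probability simplex.

    Reference function (assumed): Shannon entropy h(x) = sum_i x_i log x_i,
    so D_h(u, x) is the KL divergence and, with Psi = 0, the update
        x+ = argmin_u { <grad_f(x), u> + (1/step) * D_h(u, x) }
    reduces to the exponentiated-gradient closed form below.
    """
    y = x * np.exp(-step * grad_f(x))  # multiplicative (mirror) update
    return y / y.sum()                 # renormalize back onto the simplex

# Toy usage: minimize f(x) = 0.5 * ||A x - b||^2 over the simplex,
# where b is built so the optimum lies inside the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
x_star = np.array([0.2, 0.5, 0.3])
b = A @ x_star
grad_f = lambda x: A.T @ (A @ x - b)

x = np.ones(3) / 3                     # uniform starting point
for _ in range(2000):
    x = bregman_prox_grad_step(x, grad_f, step=0.02)
# x should approach x_star = (0.2, 0.5, 0.3)
```

The step size plays the role of the reciprocal relative-smoothness constant; the paper's accelerated variants additionally maintain extrapolation sequences and exploit the triangle scaling exponent of D_h.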
Pages: 405-440 (36 pages)
Related papers (50 records)
  • [1] Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
    Hanzely, Filip; Richtárik, Peter; Xiao, Lin
    Computational Optimization and Applications, 2021, 79: 405-440
  • [2] Approximate Bregman proximal gradient algorithm for relatively smooth nonconvex optimization
    Takahashi, Shota; Takeda, Akiko
    Computational Optimization and Applications, 2025, 90(01): 227-256
  • [3] A dual Bregman proximal gradient method for relatively-strongly convex optimization
    Liu, Jin-Zan; Liu, Xin-Wei
    Numerical Algebra, Control and Optimization, 2022, 12(04): 679-692
  • [4] Relatively accelerated stochastic gradient algorithm for a class of non-smooth convex optimization problem
    Zhang, Wenjuan; Feng, Xiangchu; Xiao, Feng; Huang, Shujuan; Li, Huan
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2024, 51(03): 147-157
  • [5] Contracting proximal methods for smooth convex optimization
    Doikov, Nikita; Nesterov, Yurii
    SIAM Journal on Optimization, 2020, 30(04): 3146-3169
  • [6] Inexact proximal gradient methods for non-convex and non-smooth optimization
    Gu, Bin; Wang, De; Huo, Zhouyuan; Huang, Heng
    Thirty-Second AAAI Conference on Artificial Intelligence, 2018: 3093-3100
  • [7] A note on the (accelerated) proximal gradient method for composite convex optimization
    Li, Qingjing; Tan, Li; Guo, Ke
    Journal of Nonlinear and Convex Analysis, 2022, 23(12): 2847-2857
  • [8] Robust accelerated gradient methods for smooth strongly convex functions
    Aybat, Necdet Serhat; Fallah, Alireza; Gurbuzbalaban, Mert; Ozdaglar, Asuman
    SIAM Journal on Optimization, 2020, 30(01): 717-751
  • [9] Interior gradient and proximal methods for convex and conic optimization
    Auslender, A.; Teboulle, M.
    SIAM Journal on Optimization, 2006, 16(03): 697-725
  • [10] Convex-concave backtracking for inertial Bregman proximal gradient algorithms in nonconvex optimization
    Mukkamala, Mahesh Chandra; Ochs, Peter; Pock, Thomas; Sabach, Shoham
    SIAM Journal on Mathematics of Data Science, 2020, 2(03): 658-682