CONVERGENCE PROPERTIES OF PROXIMAL (SUB)GRADIENT METHODS WITHOUT CONVEXITY OR SMOOTHNESS OF ANY OF THE FUNCTIONS

Times Cited: 0
Authors
Solodov, Mikhail V. [1]
Affiliations
[1] IMPA Inst Matemat Pura & Aplicada, Estrada Dona Castorina, BR-22460320 Rio de Janeiro, RJ, Brazil
Keywords
proximal gradient methods; incremental methods; nonsmooth nonconvex optimization
Keywords Plus
MINIMIZATION; OPTIMIZATION; ALGORITHMS; NONCONVEX
DOI
10.1137/23M1592158
CLC Number
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
We establish convergence properties for a framework that covers a variety of proximal subgradient methods, where none of the involved functions needs to be convex or differentiable; the functions are only assumed to be Clarke-regular. Our results cover the projected and conditional variants for the constrained case, the use of inertial/momentum terms, and incremental methods in which each function is itself a sum whose components are processed separately by the method.
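As a concrete illustration of the iteration class discussed in the abstract, here is a minimal Python/NumPy sketch of a proximal (sub)gradient step with an optional inertial/momentum term. It is an assumption-laden toy, not the paper's method: the choice g(x) = lam*||x||_1 (whose prox is soft-thresholding), the quadratic f, the step sizes, and all names (soft_threshold, proximal_subgradient, etc.) are illustrative.

```python
# Minimal illustrative sketch (not the paper's algorithm): a proximal
# (sub)gradient iteration with an optional inertial/momentum term.
# The concrete f, g, and parameters below are toy assumptions.
import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1: shrink each coordinate toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_subgradient(subgrad_f, prox_g, x0, step=1e-2, beta=0.0, iters=500):
    """Iterate x_{k+1} = prox_{step*g}(y_k - step*v_k) with v_k an element
    of the (Clarke) subdifferential of f at y_k, where
    y_k = x_k + beta*(x_k - x_{k-1}) is the inertial extrapolation;
    beta = 0 recovers the plain proximal subgradient step."""
    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = x + beta * (x - x_prev)   # inertial/momentum term
        v = subgrad_f(y)              # any element of the subdifferential
        x_prev, x = x, prox_g(y - step * v, step)
    return x

if __name__ == "__main__":
    # Toy problem: minimize 0.5*||Ax - b||^2 + lam*||x||_1.
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((20, 5)), rng.standard_normal(20), 0.1
    subgrad_f = lambda x: A.T @ (A @ x - b)           # gradient of the smooth part
    prox_g = lambda v, t: soft_threshold(v, t * lam)  # prox of t*lam*||.||_1
    x = proximal_subgradient(subgrad_f, prox_g, np.zeros(5), step=0.01, beta=0.5)
    print("approximate stationary point:", x)
```

The projected, conditional, and incremental variants covered by the paper are not shown; in the incremental case, subgrad_f would cycle through the components of a finite sum rather than evaluate the full subgradient at once.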
Pages: 28-41
Number of Pages: 14