A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications

Cited by: 216
Authors
Bauschke, Heinz H. [1 ]
Bolte, Jerome [2 ]
Teboulle, Marc [3 ]
Affiliations
[1] Univ British Columbia, Math, Kelowna, BC V1V 1V7, Canada
[2] Univ Toulouse 1 Capitole, Toulouse Sch Econ, F-31015 Toulouse, France
[3] Tel Aviv Univ, Sch Math Sci, IL-69978 Ramat Aviv, Israel
Funding
Israel Science Foundation
Keywords
first-order methods; composite nonsmooth convex minimization; descent lemma; proximal-gradient algorithms; complexity; Bregman distance; multiplicative Poisson linear inverse problems; MINIMIZATION ALGORITHM; CONVERGENCE ANALYSIS; MONOTONE-OPERATORS; WEAK-CONVERGENCE; CONVEX;
DOI
10.1287/moor.2016.0817
CLC Classification: C93 [Management Science]; O22 [Operations Research]
Subject Classification Codes: 070105; 12; 1201; 1202; 120202
Abstract
The proximal gradient method and its variants are among the most attractive first-order algorithms for minimizing the sum of two convex functions, one of which is nonsmooth. However, the method requires the differentiable part of the objective to have a Lipschitz continuous gradient, thus precluding its use in many applications. In this paper we introduce a framework that circumvents the intricate question of Lipschitz continuity of gradients by using an elegant and easy-to-check convexity condition that captures the geometry of the constraints. This condition translates into a new descent lemma, which in turn leads to a natural derivation of the proximal-gradient scheme with Bregman distances. We then identify a new notion of asymmetry measure for Bregman distances, which is central in determining the relevant step size. These novelties allow us to prove a global sublinear rate of convergence and, as a by-product, global pointwise convergence. This provides a new path to a broad spectrum of problems arising in key applications that were, until now, considered out of reach for proximal gradient methods. We illustrate this potential by showing how our results can be applied to build new and simple schemes for Poisson inverse problems.
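For the Poisson inverse application mentioned above, the Bregman proximal-gradient (NoLips) update has a closed form when the reference kernel is Burg's entropy h(x) = -Σ log x_i: the paper's relative-smoothness condition holds with L = ||b||_1, and each iterate stays strictly positive without any projection. The following is a minimal NumPy sketch, not the authors' code; the tiny synthetic problem (sizes `m`, `n`, matrix `A`, counts `b`) is an assumption made up for illustration, while the objective, step size, and update rule follow the paper's Poisson example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic Poisson linear inverse problem: minimize over x > 0
#   f(x) = sum_i [ (Ax)_i - b_i * log((Ax)_i) ]   (Poisson log-likelihood fit)
m, n = 8, 5
A = rng.uniform(0.5, 1.5, size=(m, n))      # positive measurement matrix
x_true = rng.uniform(0.5, 2.0, size=n)
b = A @ x_true                              # noiseless "counts" for a clean demo

def f(x):
    Ax = A @ x
    return np.sum(Ax - b * np.log(Ax))

def grad_f(x):
    Ax = A @ x
    return A.T @ (1.0 - b / Ax)

# Relative smoothness: L*h - f is convex for h(x) = -sum(log x_i) with
# L = ||b||_1 (as shown in the paper), so lam = 1/L is an admissible step.
lam = 1.0 / np.sum(b)

x = np.ones(n)
vals = [f(x)]
for _ in range(200):
    g = grad_f(x)
    # Closed-form NoLips update for the Burg-entropy Bregman distance:
    #   x_i <- x_i / (1 + lam * x_i * g_i)
    # The denominator stays positive for lam <= 1/L, so no projection is needed.
    x = x / (1.0 + lam * x * g)
    vals.append(f(x))

print("objective: initial", vals[0], "-> final", vals[-1])
```

The update is multiplicative, which is why positivity of the iterates is automatic; this is exactly the structural benefit the abstract attributes to replacing Lipschitz gradient continuity with the convexity condition on Lh - f.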
Pages: 330-348 (19 pages)
Related Papers (50 total)
  • [1] FIRST ORDER METHODS BEYOND CONVEXITY AND LIPSCHITZ GRADIENT CONTINUITY WITH APPLICATIONS TO QUADRATIC INVERSE PROBLEMS
    Bolte, Jerome
    Sabach, Shoham
    Teboulle, Marc
    Vaisbourd, Yakov
    SIAM JOURNAL ON OPTIMIZATION, 2018, 28 (03) : 2131 - 2151
  • [2] Adaptive First-Order Methods Revisited: Convex Optimization without Lipschitz Requirements
    Antonakopoulos, Kimon
    Mertikopoulos, Panayotis
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [3] ACCELERATED FIRST-ORDER METHODS FOR CONVEX OPTIMIZATION WITH LOCALLY LIPSCHITZ CONTINUOUS GRADIENT
    Lu, Zhaosong
    Mei, Sanyou
    SIAM JOURNAL ON OPTIMIZATION, 2023, 33 (03) : 2275 - 2310
  • [4] Gradient Descent in the Absence of Global Lipschitz Continuity of the Gradients
    Patel, Vivak
    Berahas, Albert S.
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2024, 6 (03): : 602 - 626
  • [5] First-Order and Second-Order Variants of the Gradient Descent in a Unified Framework
    Pierrot, Thomas
    Perrin-Gilbert, Nicolas
    Sigaud, Olivier
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT II, 2021, 12892 : 197 - 208
  • [6] Pareto Navigation Gradient Descent: a First-Order Algorithm for Optimization in Pareto Set
    Ye, Mao
    Liu, Qiang
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, VOL 180, 2022, 180 : 2246 - 2255
  • [7] A directional Lipschitz extension lemma, with applications to uniqueness and Lagrangianity for the continuity equation
    Caravenna, Laura
    Crippa, Gianluca
    COMMUNICATIONS IN PARTIAL DIFFERENTIAL EQUATIONS, 2021, 46 (08) : 1488 - 1520
  • [8] From error bounds to the complexity of first-order descent methods for convex functions
    Bolte, Jerome
    Trong Phong Nguyen
    Peypouquet, Juan
    Suter, Bruce W.
    MATHEMATICAL PROGRAMMING, 2017, 165 (02) : 471 - 507
  • [9] First-order conditional logic revisited
    Friedman, N
    Halpern, JY
    Koller, D
    PROCEEDINGS OF THE THIRTEENTH NATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND THE EIGHTH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE, VOLS 1 AND 2, 1996, : 1305 - 1312