Linear convergence of first order methods for non-strongly convex optimization

Cited: 0
Authors
I. Necoara
Yu. Nesterov
F. Glineur
Affiliations
[1] University Politehnica Bucharest, Automatic Control and Systems Engineering Department
[2] Université catholique de Louvain, Center for Operations Research and Econometrics
Source
Mathematical Programming | 2019, Vol. 175
Keywords
90C25; 90C06; 65K05
DOI
Not available
Abstract
The standard assumption for proving linear convergence of first order methods for smooth convex optimization is the strong convexity of the objective function, an assumption which does not hold for many practical applications. In this paper, we derive linear convergence rates of several first order methods for solving smooth non-strongly convex constrained optimization problems, i.e., problems involving an objective function with a Lipschitz continuous gradient that satisfies some relaxed strong convexity condition. In particular, in the case of smooth constrained convex optimization, we provide several relaxations of the strong convexity condition and prove that they are sufficient for obtaining linear convergence of several first order methods, such as projected gradient, fast gradient and feasible descent methods. We also provide examples of functional classes that satisfy our proposed relaxations of the strong convexity condition. Finally, we show that the proposed relaxed strong convexity conditions cover important applications, including solving linear systems, linear programming, and dual formulations of linearly constrained convex problems.
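As a concrete illustration of the behaviour the abstract describes, the sketch below runs a standard projected gradient method on a least-squares objective with a rank-deficient matrix over the nonnegative orthant: the objective is smooth and convex but not strongly convex, yet the objective gap still contracts geometrically. Quadratic objectives over polyhedral sets are a typical example of the functional classes for which relaxed conditions of this kind (e.g., quadratic growth of the objective around its optimal set) are known to hold. The problem instance, dimensions, step size and the empirical rate check below are illustrative assumptions, not material taken from the paper.

```python
# Minimal numerical sketch (not from the paper): projected gradient on a smooth
# convex but NOT strongly convex problem, where the objective gap still decays
# geometrically. Problem data, dimensions and step size are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

m, n, r = 30, 50, 10                        # r < n, so A^T A is singular: no strong convexity
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
b = rng.standard_normal(m)

def f(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2

def grad(x):
    return A.T @ (A @ x - b)

L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient

def project(x):
    return np.maximum(x, 0.0)               # projection onto the feasible set {x >= 0}

def projected_gradient(x0, iters):
    """Projected gradient with constant step 1/L; returns iterate and objective history."""
    x = x0.copy()
    vals = []
    for _ in range(iters):
        x = project(x - grad(x) / L)        # x_{k+1} = P_X(x_k - (1/L) * grad f(x_k))
        vals.append(f(x))
    return x, np.array(vals)

x0 = np.ones(n)
_, ref_vals = projected_gradient(x0, 200_000)   # long run to estimate the optimal value f*
f_star = ref_vals[-1]

_, vals = projected_gradient(x0, 2_000)
gaps = vals - f_star                        # nonnegative, since the method is monotone in f
mask = gaps[:-1] > 1e-12                    # ignore iterations already at numerical precision
ratios = gaps[1:][mask] / gaps[:-1][mask]
print(f"objective gap after 2000 iterations: {gaps[-1]:.3e}")
print(f"median per-iteration contraction factor: {np.median(ratios):.4f}")
```

A median contraction factor strictly below 1 in this run is the empirical counterpart of the linear rates the paper establishes; the exact factor depends on the random instance and is not a quantity reported in the article.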
Pages: 69-107
Number of pages: 38
Related Papers
50 records in total
  • [31] Zhou, Pan; Yuan, Xiao-Tong; Yan, Shuicheng; Feng, Jiashi. Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, 43(2): 459-472.
  • [32] Zhou, Pan; Yuan, Xiao-Tong; Feng, Jiashi. Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds. 22nd International Conference on Artificial Intelligence and Statistics, 2019, 89: 138-147.
  • [33] Beck, Amir; Shtern, Shimrit. Linearly convergent away-step conditional gradient for non-strongly convex functions. Mathematical Programming, 2017, 164: 1-27.
  • [34] Defazio, Aaron; Bach, Francis; Lacoste-Julien, Simon. SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives. Advances in Neural Information Processing Systems 27 (NIPS 2014), 2014, 27.
  • [35] Patrascu, Andrei; Necoara, Ion. On the Convergence of Inexact Projection Primal First-Order Methods for Convex Minimization. IEEE Transactions on Automatic Control, 2018, 63(10): 3317-3329.
  • [36] Kajiyama, Yuichi; Hayashi, Naoki; Takai, Shigemasa. Linear Convergence of Consensus-Based Quantized Optimization for Smooth and Strongly Convex Cost Functions. IEEE Transactions on Automatic Control, 2021, 66(3): 1254-1261.
  • [37] Tupitsa, Nazarii; Dvurechensky, Pavel; Gasnikov, Alexander; Guminov, Sergey. Alternating minimization methods for strongly convex optimization. Journal of Inverse and Ill-Posed Problems, 2021, 29(5): 721-739.
  • [38] Kegl, M.; Oblak, M. M. Optimization of mechanical systems: On non-linear first-order approximation with an additive convex term. Communications in Numerical Methods in Engineering, 1997, 13(1): 13-20.
  • [39] Devolder, Olivier; Glineur, François; Nesterov, Yurii. First-order methods of smooth convex optimization with inexact oracle. Mathematical Programming, 2014, 146(1-2): 37-75.