Global rates of convergence for nonconvex optimization on manifolds

Cited by: 136
Authors
Boumal, Nicolas [1 ,2 ]
Absil, P-A. [3 ]
Cartis, Coralia [4 ]
Affiliations
[1] Princeton Univ, Dept Math, Princeton, NJ 08544 USA
[2] Princeton Univ, PACM, Princeton, NJ 08544 USA
[3] Catholic Univ Louvain, ICTEAM Inst, Louvain La Neuve, Belgium
[4] Univ Oxford, Math Inst, Oxford, England
Funding
US National Science Foundation; UK Natural Environment Research Council
Keywords
complexity; gradient descent; trust-region method; Riemannian optimization; optimization on manifolds; CONJUGATE-GRADIENT METHOD; EVALUATION COMPLEXITY; RIEMANNIAN-MANIFOLDS; RETRACTIONS; ALGORITHMS; OPTIMALITY; NEWTONS; MODEL;
DOI
10.1093/imanum/drx080
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
We consider the minimization of a cost function f on a manifold M using Riemannian gradient descent and Riemannian trust regions (RTR). We focus on satisfying necessary optimality conditions within a tolerance ε. Specifically, we show that, under Lipschitz-type assumptions on the pullbacks of f to the tangent spaces of M, both of these algorithms produce points with Riemannian gradient smaller than ε in O(1/ε²) iterations. Furthermore, RTR returns a point where, in addition, the least eigenvalue of the Riemannian Hessian is larger than -ε, in O(1/ε³) iterations. There are no assumptions on initialization. The rates match their (sharp) unconstrained counterparts as a function of the accuracy ε (up to constants) and hence are sharp in that sense. These are the first deterministic results for global rates of convergence to approximate first- and second-order Karush-Kuhn-Tucker points on manifolds. In particular, they apply to optimization constrained to compact submanifolds of R^n, under simpler assumptions.
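To make the first-order guarantee concrete, here is a minimal sketch (not the authors' code) of Riemannian gradient descent on the unit sphere, minimizing the Rayleigh quotient f(x) = x'Ax and stopping once the Riemannian gradient norm drops below ε. The test matrix, the 1/L-type step size, and the function name are illustrative assumptions; metric projection followed by renormalization stands in for the general tangent-space pullback and retraction machinery of the paper.

import numpy as np

def riemannian_gradient_descent(A, x0, eps=1e-6, max_iter=100000):
    # Conservative fixed step of 1/L type: L ~ 2||A||_2 bounds the gradient
    # Lipschitz constant of the pullbacks here (an illustrative choice).
    step = 0.5 / np.linalg.norm(A, 2)
    x = x0 / np.linalg.norm(x0)
    for k in range(max_iter):
        egrad = 2.0 * A @ x                  # Euclidean gradient of x'Ax
        rgrad = egrad - (x @ egrad) * x      # project onto the tangent space at x
        if np.linalg.norm(rgrad) <= eps:     # ε-stationary point reached
            return x, k
        v = x - step * rgrad                 # step along the negative gradient
        x = v / np.linalg.norm(v)            # retraction: renormalize onto the sphere
    return x, max_iter

rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
A = (B + B.T) / 2                            # symmetric test matrix (assumption)
x, iters = riemannian_gradient_descent(A, rng.standard_normal(50))
print(iters, x @ A @ x)                      # x'Ax settles near an eigenvalue of A

From a generic start the iterate drifts toward the minimizer of f on the sphere, an eigenvector for the smallest eigenvalue; every eigenvector is ε-stationary, which is exactly the first-order condition that the O(1/ε²) iteration bound controls.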
Pages: 1-33
Number of pages: 33
Related Papers
50 records in total
  • [1] Global rates of convergence for nonconvex optimization on manifolds (vol 39, pg 1, 2018)
    Boumal, Nicolas
Absil, P-A.
    Cartis, Coralia
    IMA JOURNAL OF NUMERICAL ANALYSIS, 2020, 40 (04) : 2940 - 2940
  • [2] Global Convergence of ADMM in Nonconvex Nonsmooth Optimization
    Wang, Yu
    Yin, Wotao
    Zeng, Jinshan
    JOURNAL OF SCIENTIFIC COMPUTING, 2019, 78 (01) : 29 - 63
  • [3] Global Convergence of Splitting Methods for Nonconvex Composite Optimization
    Li, Guoyin
    Pong, Ting Kei
    SIAM JOURNAL ON OPTIMIZATION, 2015, 25 (04) : 2434 - 2460
  • [4] Convergence Rates of Inertial Splitting Schemes for Nonconvex Composite Optimization
    Johnstone, Patrick R.
    Moulin, Pierre
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017 : 4716 - 4720
  • [5] On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
    Li, DH
    Fukushima, M
    SIAM JOURNAL ON OPTIMIZATION, 2001, 11 (04) : 1054 - 1064
  • [6] Global Convergence of Langevin Dynamics Based Algorithms for Nonconvex Optimization
    Xu, Pan
    Chen, Jinghui
    Zou, Difan
    Gu, Quanquan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [7] On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
    Chen, Ziang
    Li, Yingzhou
    Lu, Jianfeng
    SIAM JOURNAL ON OPTIMIZATION, 2023, 33 (02) : 713 - 738
  • [8] On Global Linear Convergence in Stochastic Nonconvex Optimization for Semidefinite Programming
    Zeng, Jinshan
    Ma, Ke
    Yao, Yuan
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2019, 67 (16) : 4261 - 4275
  • [9] Global Convergence of Dislocation Hyperbolic Augmented Lagrangian Algorithm for Nonconvex Optimization
    Ramirez, Lennin Mallma
    Maculan, Nelson
    Xavier, Adilson Elias
    Xavier, Vinicius Layter
    OPTIMIZATION, 2024