First-Order Methods for Convex Optimization

Cited by: 12

Authors
Dvurechensky, Pavel [1 ,2 ,3 ]
Shtern, Shimrit [4 ]
Staudigl, Mathias [5 ,6 ]
Affiliations
[1] Weierstrass Inst Appl Anal & Stochast, Mohrenstr 39, D-10117 Berlin, Germany
[2] Inst Informat Transmiss Problems RAS, Bolshoy Karetny Per 19,Build 1, Moscow 127051, Russia
[3] Moscow Inst Phys & Technol, 9 Inst Skiy Per, Dolgoprudnyi 141701, Moscow Region, Russia
[4] Technion Israel Inst Technol, Fac Ind Engn & Management, Haifa, Israel
[5] Maastricht Univ, Dept Data Sci & Knowledge Engn DKE, Paul Henri Spaaklaan 1, NL-6229 EN Maastricht, Netherlands
[6] Maastricht Univ, Math Ctr Maastricht MCM, Paul Henri Spaaklaan 1, NL-6229 EN Maastricht, Netherlands
Keywords
Convex Optimization; Composite Optimization; First-Order Methods; Numerical Algorithms; Convergence Rate; Proximal Mapping; Proximity Operator; Bregman Divergence; STOCHASTIC COMPOSITE OPTIMIZATION; PROJECTED SUBGRADIENT METHODS; INTERMEDIATE GRADIENT-METHOD; COORDINATE DESCENT METHODS; VARIATIONAL-INEQUALITIES; MIRROR DESCENT; FRANK-WOLFE; APPROXIMATION ALGORITHMS; THRESHOLDING ALGORITHM; MINIMIZATION ALGORITHM;
DOI
10.1016/j.ejco.2021.100015
Chinese Library Classification: C93 [Management Science]; O22 [Operations Research]
Discipline classification codes: 070105; 12; 1201; 1202; 120202
Abstract
First-order methods for solving convex optimization problems have been at the forefront of mathematical optimization for the last 20 years. The rapid development of this important class of algorithms is motivated by success stories reported in various applications, most importantly machine learning, signal processing, imaging, and control theory. First-order methods can provide low-accuracy solutions at low computational cost, which makes them an attractive set of tools for large-scale optimization problems. In this survey, we cover a number of key developments in gradient-based optimization methods, including non-Euclidean extensions of the classical proximal gradient method and its accelerated versions. Additionally, we survey recent developments within the class of projection-free methods, as well as proximal versions of primal-dual schemes. We give complete proofs for various key results and highlight the unifying aspects of several optimization algorithms.
Pages: 27
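To illustrate the class of algorithms the abstract refers to, here is a minimal sketch of the classical proximal gradient method (ISTA) applied to an l1-regularized least-squares problem. The problem instance, the step-size choice, and all variable names are illustrative assumptions, not taken from the survey itself.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    step should be at most 1 / ||A^T A||_2, i.e. the inverse of the
    Lipschitz constant of the gradient of the smooth part.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                          # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)   # prox step on the l1 term
    return x

# Tiny synthetic example: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[[1, 5]] = [2.0, -3.0]
b = A @ x_true
L = np.linalg.norm(A, 2) ** 2                             # Lipschitz constant of the gradient
x_hat = proximal_gradient(A, b, lam=0.1, step=1.0 / L)
```

Replacing the Euclidean prox step with a Bregman-divergence-based one, or adding Nesterov momentum, yields the non-Euclidean and accelerated variants the survey covers.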