A Simple Nearly Optimal Restart Scheme For Speeding Up First-Order Methods

Cited by: 0
Authors
James Renegar
Benjamin Grimmer
Affiliation
[1] Cornell University, School of Operations Research and Information Engineering
Keywords
First-order method; Restarting; Convex optimization; Parallelization; Convergence rates; 90C25; 90C52;
DOI
Not available
Abstract
We present a simple scheme for restarting first-order methods for convex optimization problems. Restarts are made based only on achieving specified decreases in objective values, the specified amounts being the same for all optimization problems. Unlike existing restart schemes, the scheme makes no attempt to learn parameter values characterizing the structure of an optimization problem, nor does it require any special information that would not be available in practice (unless the first-order method chosen to be employed in the scheme itself requires special information). As immediate corollaries to the main theorems, we show that when some well-known first-order methods are employed in the scheme, the resulting complexity bounds are nearly optimal for particular—yet quite general—classes of problems.
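The abstract only describes the restart rule at a high level, so the following is a minimal Python sketch of the general idea (restart the inner first-order method each time the objective value has decreased by a prespecified amount), not the authors' actual algorithm; the paper specifies how the decrease amounts are chosen and, per the keywords, also involves parallelization. All names here (`restarted_fom`, `fom_step`, `momentum_step`, `delta`) and the choice of inner method are assumptions made purely for illustration.

```python
# Illustrative sketch only: a generic "restart on a fixed objective decrease"
# wrapper around an arbitrary first-order method. The function and parameter
# names are assumptions for this example and do not come from the paper.
import numpy as np

def restarted_fom(f, grad, x0, fom_step, delta=1.0, max_iters=10_000):
    """Run `fom_step` repeatedly, restarting whenever f has dropped by `delta`.

    f        : objective function
    grad     : (sub)gradient oracle for f
    x0       : starting point (NumPy array)
    fom_step : one iteration of the inner first-order method;
               maps (grad, x, state) -> (x_next, state)
    delta    : prespecified decrease in objective value that triggers a restart
    """
    x = x0.copy()
    target = f(x) - delta              # objective value that triggers a restart
    state = None                       # method-specific state (e.g. momentum)
    for _ in range(max_iters):
        x, state = fom_step(grad, x, state)
        if f(x) <= target:             # required decrease achieved:
            state = None               #   restart the inner method ...
            target = f(x) - delta      #   ... and set the next target
    return x

def momentum_step(grad, x, state, lr=0.1, beta=0.9):
    """A simple momentum (heavy-ball style) step; `state` is the previous iterate."""
    x_prev = x if state is None else state
    y = x + beta * (x - x_prev)        # extrapolate along the last direction
    return y - lr * grad(y), x

if __name__ == "__main__":
    A = np.diag([1.0, 10.0])                       # mildly ill-conditioned quadratic
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    x = restarted_fom(f, grad, np.array([10.0, 10.0]), momentum_step, delta=5.0)
    print(f(x))                                    # should be near the minimum value 0
```

Swapping in any other first-order method only requires supplying a different `fom_step`; the wrapper itself never needs parameters characterizing the structure of the problem, which is the property the abstract emphasizes. The near-optimal complexity guarantees, of course, come from the paper's analysis, not from this sketch.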
Pages: 211-256
Page count: 45
Related papers
50 entries in total
  • [1] A Simple Nearly Optimal Restart Scheme For Speeding Up First-Order Methods
    Renegar, James
    Grimmer, Benjamin
    FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2022, 22 (01) : 211 - 256
  • [2] Structure Exploitation of Practical MPC Formulations for Speeding up First-Order Methods
    Kufoalor, D. K. M.
    Richter, S.
    Imsland, L.
    Johansen, T. A.
    2017 IEEE 56TH ANNUAL CONFERENCE ON DECISION AND CONTROL (CDC), 2017,
  • [3] An acceleration procedure for optimal first-order methods
    Baes, Michel
    Buergisser, Michael
    OPTIMIZATION METHODS & SOFTWARE, 2014, 29 (03) : 610 - 628
  • [4] Optimal complexity and certification of Bregman first-order methods
    Dragomir, Radu-Alexandru
    Taylor, Adrien B.
    D'Aspremont, Alexandre
    Bolte, Jerome
    MATHEMATICAL PROGRAMMING, 2022, 194 (1-2) : 41 - 83
  • [5] Restarts Subject to Approximate Sharpness: A Parameter-Free and Optimal Scheme For First-Order Methods
    Adcock, Ben
    Colbrook, Matthew J.
    Neyra-Nesterenko, Maksym
    FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2025,
  • [6] Nearly Optimal First-Order Methods for Convex Optimization under Gradient Norm Measure: an Adaptive Regularization Approach
    Ito, Masaru
    Fukuda, Mituhiro
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2021, 188 (03) : 770 - 804
  • [7] On optimal universal first-order methods for minimizing heterogeneous sums
    Grimmer, Benjamin
    OPTIMIZATION LETTERS, 2024, 18 (02) : 427 - 445