Performance of noisy three-step accelerated first-order optimization algorithms for strongly convex quadratic problems

Cited by: 0
Authors
Samuelson, Samantha [1]
Mohammadi, Hesameddin [1]
Jovanovic, Mihailo R. [1]
Affiliation
[1] Univ Southern Calif, Dept Elect & Comp Engn, Los Angeles, CA 90089 USA
Keywords
Convex optimization; gradient descent; heavy-ball method; Nesterov's accelerated algorithms; noisy gradients; performance tradeoffs
DOI
10.1109/CDC49753.2023.10383581
CLC number
TP [automation technology; computer technology]
Discipline code
0812
Abstract
We study the class of first-order algorithms in which the optimization variable is updated using information from the three previous iterations. While two-step momentum algorithms akin to the heavy-ball and Nesterov's accelerated methods achieve the optimal convergence rate, it is an open question whether three-step momentum methods can offer advantages for problems in which exact gradients are not available. For strongly convex quadratic problems, we identify algorithmic parameters that achieve the optimal convergence rate and examine how the additional momentum term affects the tradeoff between acceleration and noise amplification. Our results suggest that, for parameters that optimize the convergence rate, introducing additional momentum terms does not improve variance amplification relative to standard accelerated algorithms.
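To make the algorithm class concrete, the following Python sketch runs a generic three-step momentum iteration on a strongly convex quadratic with additive white gradient noise. This is only an illustration of the update structure, not the paper's parameterization: the step size alpha, momentum weights beta1 and beta2, and noise level sigma are hypothetical placeholder values.

```python
import numpy as np

# A minimal sketch (not the paper's exact parameterization): a generic
# three-step momentum iteration on a strongly convex quadratic
# f(x) = 0.5 * x^T Q x, where the gradient Q x is corrupted by white noise.

rng = np.random.default_rng(0)
n = 10
A = rng.standard_normal((n, n))
Q = A.T @ A / n + np.eye(n)        # symmetric positive definite Hessian

L = np.linalg.eigvalsh(Q)[-1]      # largest eigenvalue (smoothness constant)
alpha = 1.0 / L                    # conservative step size (placeholder)
beta1, beta2 = 0.3, 0.05           # hypothetical momentum weights
sigma = 0.01                       # gradient-noise standard deviation

x_prev2 = x_prev1 = x = rng.standard_normal(n)
for _ in range(1000):
    grad = Q @ x + sigma * rng.standard_normal(n)   # exact gradient + noise
    # The update uses the three most recent iterates x, x_prev1, x_prev2.
    x_next = (x - alpha * grad
              + beta1 * (x - x_prev1)
              + beta2 * (x_prev1 - x_prev2))
    x_prev2, x_prev1, x = x_prev1, x, x_next

print("distance to minimizer x* = 0:", np.linalg.norm(x))
```

Setting beta2 = 0 recovers a familiar two-step heavy-ball-style recursion, which is the kind of baseline the abstract compares against.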
Pages: 1300-1305
Page count: 6
Related papers
43 items in total
  • [21] FOM - a MATLAB toolbox of first-order methods for solving convex optimization problems
    Beck, Amir
    Guttmann-Beck, Nili
    OPTIMIZATION METHODS & SOFTWARE, 2019, 34 (01): 172-193
  • [22] On Solving Large-Scale Polynomial Convex Problems by Randomized First-Order Algorithms
    Ben-Tal, Aharon
    Nemirovski, Arkadi
    MATHEMATICS OF OPERATIONS RESEARCH, 2015, 40 (02): 474-494
  • [23] First-Order Algorithms Converge Faster than O(1/k) on Convex Problems
    Lee, Ching-pei
    Wright, Stephen J.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019
  • [24] A distributed stochastic first-order method for strongly concave-convex saddle point problems
    Qureshi, Muhammad I.
    Khan, Usman A.
    2023 62ND IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2023: 4170-4175
  • [25] First-Order Algorithms for Robust Optimization Problems via Convex-Concave Saddle-Point Lagrangian Reformulation
    Postek, Krzysztof
    Shtern, Shimrit
    INFORMS JOURNAL ON COMPUTING, 2024
  • [26] High-Resolution Modeling of the Fastest First-Order Optimization Method for Strongly Convex Functions
    Sun, Boya
    George, Jemin
    Kia, Solmaz
    2020 59TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2020: 4237-4242
  • [27] Smooth strongly convex interpolation and exact worst-case performance of first-order methods
    Taylor, Adrien B.
    Hendrickx, Julien M.
    Glineur, Francois
    MATHEMATICAL PROGRAMMING, 2017, 161 (1-2): 307-345
  • [29] An accelerated first-order method for solving SOS relaxations of unconstrained polynomial optimization problems
    Bertsimas, Dimitris
    Freund, Robert M.
    Sun, Xu Andy
    OPTIMIZATION METHODS & SOFTWARE, 2013, 28 (03): 424-441
  • [30] Negative curvature obstructs acceleration for strongly geodesically convex optimization, even with exact first-order oracles
    Criscitiello, Christopher
    Boumal, Nicolas
    CONFERENCE ON LEARNING THEORY, VOL 178, 2022: 496-542