Performance of noisy Nesterov's accelerated method for strongly convex optimization problems

Cited by: 9
Authors:
Mohammadi, Hesameddin [1 ]
Razaviyayn, Meisam [2 ]
Jovanovic, Mihailo R. [1 ]
Affiliations:
[1] Univ Southern Calif, Dept Elect & Comp Engn, Los Angeles, CA 90089 USA
[2] Univ Southern Calif, Dept Ind & Syst Engn, Los Angeles, CA 90089 USA
Source:
2019 AMERICAN CONTROL CONFERENCE (ACC) | 2019
Funding:
U.S. National Science Foundation;
Keywords:
Accelerated first-order algorithms; control for optimization; convex optimization; integral quadratic constraints; linear matrix inequalities; Nesterov's method; noise amplification; second-order moments; semidefinite programming; GRADIENT; ALGORITHMS;
DOI:
10.23919/acc.2019.8814680
Chinese Library Classification (CLC) code:
TP [Automation Technology; Computer Technology];
Discipline code:
0812;
Abstract:
We study the performance of noisy gradient descent and Nesterov's accelerated methods for strongly convex objective functions with Lipschitz continuous gradients. The steady-state second-order moment of the error in the iterates is analyzed when the gradient is perturbed by additive white noise with zero mean and identity covariance. For any given condition number κ, we derive explicit upper bounds on noise amplification that depend only on κ and the problem size. We use quadratic objective functions to derive lower bounds and to demonstrate that the upper bounds are tight up to a constant factor. The established upper bound for Nesterov's accelerated method is larger than the upper bound for gradient descent by a factor of √κ. This gap identifies a fundamental tradeoff that comes with acceleration in the presence of stochastic uncertainties in the gradient evaluation.
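
A rough illustration of this setup (an exposition sketch, not code from the paper): the Python snippet below runs both methods on a diagonal strongly convex quadratic with zero-mean, identity-covariance white noise added to every gradient evaluation, and estimates the steady-state second-order moment of the error by time-averaging after a burn-in period. The step size α = 1/L and momentum β = (√κ − 1)/(√κ + 1) are standard textbook choices, assumed here rather than taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Strongly convex quadratic f(x) = 0.5 * x'Qx with strong-convexity
# constant m and Lipschitz-gradient constant L, so kappa = L/m.
n = 50                                    # problem size
m, L = 1.0, 100.0
kappa = L / m
q = np.linspace(m, L, n)                  # eigenvalues of the diagonal Q

def steady_state_moment(method, iters=100_000, burn_in=20_000):
    """Estimate E||x_k - x*||^2 in steady state (here x* = 0) when each
    gradient evaluation is corrupted by zero-mean, identity-covariance
    white noise."""
    alpha = 1.0 / L                                      # standard step size (assumed)
    beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)   # standard momentum (assumed)
    x = x_prev = np.zeros(n)
    total, count = 0.0, 0
    for k in range(iters):
        if method == "gd":
            g = q * x + rng.standard_normal(n)           # noisy gradient at x
            x = x - alpha * g
        else:                                            # Nesterov's method
            y = x + beta * (x - x_prev)
            g = q * y + rng.standard_normal(n)           # noisy gradient at y
            x_prev, x = x, y - alpha * g
        if k >= burn_in:                                 # time-average after burn-in
            total += x @ x
            count += 1
    return total / count

print("gradient descent :", steady_state_moment("gd"))
print("Nesterov         :", steady_state_moment("nesterov"))

Under these assumed parameter choices, the Nesterov estimate comes out markedly larger than the gradient-descent one, consistent with the extra √κ factor in the paper's upper bound.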
Pages: 3426 - 3431
Number of pages: 6