High probability guarantees for stochastic convex optimization

Cited by: 0
Authors
Davis, Damek [1 ]
Drusvyatskiy, Dmitriy [2 ]
Affiliations
[1] Cornell Univ, Sch ORIE, Ithaca, NY 14850 USA
[2] Univ Washington, Dept Math, Seattle, WA 98195 USA
Source
Keywords
Proximal point; robust distance estimation; stochastic approximation; empirical risk; APPROXIMATION ALGORITHMS; COMPOSITE OPTIMIZATION
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Standard results in stochastic convex optimization bound the number of samples that an algorithm needs to generate a point with small function value in expectation. More nuanced high probability guarantees are rare, and typically either rely on "light-tail" noise assumptions or exhibit worse sample complexity. In this work, we show that a wide class of stochastic optimization algorithms for strongly convex problems can be augmented with high confidence bounds at an overhead cost that is only logarithmic in the confidence level and polylogarithmic in the condition number. The procedure we propose, called proxBoost, is elementary and builds on two well-known ingredients: robust distance estimation and the proximal point method. We discuss consequences for both streaming (online) algorithms and offline algorithms based on empirical risk minimization.
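To make the abstract's recipe concrete, the sketch below is a rough illustration of how the two named ingredients can fit together; it is not the proxBoost procedure as specified in the paper. A generic stochastic solver is run several independent times on a proximally regularized subproblem centered at the current point, and the trial outputs are combined by robust distance estimation (keep the trial around which a ball of smallest radius captures a majority of the trials). All function names, parameter choices, and the toy problem are assumptions made for illustration.
```python
# Conceptual sketch only (assumed names and parameters, not the paper's code):
# robust distance estimation over independent trials, inside a proximal point loop.
import numpy as np

def robust_select(candidates):
    """Return the candidate around which a ball of smallest radius
    contains a majority of all candidates (robust distance estimation)."""
    C = np.asarray(candidates, dtype=float)           # shape (m, d)
    dists = np.linalg.norm(C[:, None, :] - C[None, :, :], axis=-1)
    k = C.shape[0] // 2 + 1                           # majority count (self included)
    radii = np.sort(dists, axis=1)[:, k - 1]          # radius capturing a majority
    return C[np.argmin(radii)]

def prox_boost_sketch(solver, x0, rounds=5, trials=9, mu=1.0, seed=None):
    """Proximal point outer loop: each round calls `solver` independently
    `trials` times on the regularized subproblem
        min_x f(x) + (mu/2) * ||x - center||^2
    and sets the next center by robust selection over the trial outputs."""
    rng = np.random.default_rng(seed)
    center = np.asarray(x0, dtype=float)
    for _ in range(rounds):
        outputs = [solver(center, mu, rng) for _ in range(trials)]
        center = robust_select(outputs)
    return center

# Toy usage: f(x) = 0.5 * ||x - x_star||^2, with a "solver" that returns the exact
# proximal subproblem minimizer corrupted by heavy-tailed (Student-t) noise.
if __name__ == "__main__":
    x_star = np.array([1.0, -2.0, 0.5])

    def noisy_solver(center, mu, rng):
        exact = (x_star + mu * center) / (1.0 + mu)   # argmin of f + (mu/2)||. - center||^2
        return exact + 0.1 * rng.standard_t(df=2, size=x_star.shape)

    print(prox_boost_sketch(noisy_solver, x0=np.zeros(3), seed=0))
```
The round and trial counts above are placeholders; per the abstract, the point of such boosting is that the overhead is only logarithmic in the confidence level and polylogarithmic in the condition number.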
Pages: 17
Related papers
50 items in total
  • [1] From Low Probability to High Confidence in Stochastic Convex Optimization
    Davis, Damek
    Drusvyatskiy, Dmitriy
    Xiao, Lin
    Zhang, Junyu
    JOURNAL OF MACHINE LEARNING RESEARCH, 2021, 22
  • [2] High-probability bounds for Non-Convex Stochastic Optimization with Heavy Tails
    Cutkosky, Ashok
    Mehta, Harsh
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [3] Stochastic optimization under time drift: iterate averaging, step decay, and high-probability guarantees
    Cutler, Joshua
    Drusvyatskiy, Dmitriy
    Harchaoui, Zaid
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021
  • [4] High Probability Guarantees for Nonconvex Stochastic Gradient Descent with Heavy Tails
    Li, Shaojie
    Liu, Yong
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [5] High Probability Guarantees for Submodular Maximization via Boosted Stochastic Greedy
    Castillo J, Andres C.
    Kaya, Ege C.
    Hashemi, Abolfazl
    FIFTY-SEVENTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2023: 602-606
  • [6] Parallel Algorithms and Probability of Large Deviation for Stochastic Convex Optimization Problems
    Dvurechensky, P. E.
    Gasnikov, A. V.
    Lagunovskaya, A. A.
    NUMERICAL ANALYSIS AND APPLICATIONS, 2018, 11 (01): 33-37
  • [7] General Procedure to Provide High-Probability Guarantees for Stochastic Saddle Point Problems
    Li, Dongyang
    Li, Haobin
    Zhang, Junyu
    JOURNAL OF SCIENTIFIC COMPUTING, 2024, 100 (01)
  • [8] High probability bounds on AdaGrad for constrained weakly convex optimization
    Hong, Yusu
    Lin, Junhong
    JOURNAL OF COMPLEXITY, 2025, 86
  • [9] High-Probability Complexity Bounds for Non-smooth Stochastic Convex Optimization with Heavy-Tailed Noise
    Gorbunov, Eduard
    Danilova, Marina
    Shibaev, Innokentiy
    Dvurechensky, Pavel
    Gasnikov, Alexander
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2024, 203 (03): 2679-2738