Convergence rate of incremental subgradient algorithms

Cited: 0
Authors: Nedic, A [1]; Bertsekas, D [1]
Affiliations: [1] MIT, Cambridge, MA 02139 USA
Keywords: nondifferentiable optimization; convex programming; incremental subgradient methods; stochastic subgradient methods
DOI: not available
Chinese Library Classification: C93 (Management Science); O22 (Operations Research)
Discipline codes: 070105; 12; 1201; 1202; 120202
Abstract
We consider a class of subgradient methods for minimizing a convex function that consists of the sum of a large number of component functions. This type of minimization arises in a dual context from Lagrangian relaxation of the coupling constraints of large scale separable problems. The idea is to perform the subgradient iteration incrementally, by sequentially taking steps along the subgradients of the component functions, with intermediate adjustment of the variables after processing each component function. This incremental approach has been very successful in solving large differentiable least squares problems, such as those arising in the training of neural networks, and it has resulted in a much better practical rate of convergence than the steepest descent method. In this paper, we present convergence results and estimates of the convergence rate of a number of variants of incremental subgradient methods, including some that use randomization. The convergence rate estimates are consistent with our computational results, and suggest that the randomized variants perform substantially better than their deterministic counterparts.
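The incremental iteration described in the abstract can be sketched as follows. This is a minimal illustration of the general idea, not the paper's exact method: the function name, the example objective (a sum of absolute values, minimized at the median), and the stepsize schedule are all assumptions chosen for demonstration. The `randomize` flag mimics the randomized variants by processing the components in a fresh random order on each pass.

```python
import numpy as np

def incremental_subgradient(subgrads, x0, steps, randomize=False, seed=0):
    """Minimize f(x) = sum_i f_i(x) by stepping along one component
    subgradient at a time, adjusting x after each component.
    (Hypothetical helper for illustration, not code from the paper.)"""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = len(subgrads)
    for alpha in steps:  # diminishing stepsizes, e.g. alpha_k = 1/k
        order = rng.permutation(m) if randomize else range(m)
        for i in order:  # one incremental pass over the m components
            x = x - alpha * subgrads[i](x)
    return x

# Example: f(x) = sum_i |x - a_i|, a nondifferentiable convex sum
# whose minimizer is the median of the a_i (here, 2.0).
a = [1.0, 2.0, 10.0]
subgrads = [lambda x, ai=ai: np.sign(x - ai) for ai in a]
steps = [1.0 / k for k in range(1, 201)]
x_star = incremental_subgradient(subgrads, np.array([0.0]), steps)
```

With the diminishing stepsizes above, the iterate oscillates around the median with shrinking amplitude, ending within a few multiples of the final stepsize of the minimizer.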
Pages: 223 - 264 (42 pages)
Related papers (50 records total)
  • [21] Average Convergence Rate of Evolutionary Algorithms
    He, Jun
    Lin, Guangming
    IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2016, 20 (02) : 316 - 321
  • [22] Rate of convergence of Thresholding Greedy Algorithms
    Temlyakov, V. N.
    SBORNIK MATHEMATICS, 2024, 215 (02) : 275 - 289
  • [23] ON THE RATE OF CONVERGENCE OF 2 MINIMAX ALGORITHMS
    WIEST, EJ
    POLAK, E
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 1991, 71 (01) : 1 - 30
  • [24] Rate of convergence of pure greedy algorithms
    Livshits, ED
    MATHEMATICAL NOTES, 2004, 76 (3-4) : 497 - 510
  • [25] On convergence properties of a subgradient method
    Konnov, IV
    OPTIMIZATION METHODS & SOFTWARE, 2003, 18 (01): : 53 - 62
  • [26] A Markovian Incremental Stochastic Subgradient Algorithm
    Massambone, Rafael
    Costa, Eduardo Fontoura
    Helou, Elias Salomao
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2023, 68 (01) : 124 - 139
  • [27] An Incremental Subgradient Method on Riemannian Manifolds
    Zhang, Peng
    Bao, Gejun
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2018, 176 : 711 - 727
  • [28] Accelerating the convergence of subgradient optimisation
    Baker, BM
    Sheasby, J
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 1999, 117 (01) : 136 - 144
  • [29] Incremental subgradient methods for nondifferentiable optimization
    Nedic, A
    Bertsekas, DP
    SIAM JOURNAL ON OPTIMIZATION, 2001, 12 (01) : 109 - 138