Convergence rate of incremental subgradient algorithms

Cited by: 0
Authors
Nedic, A [1 ]
Bertsekas, D [1 ]
Affiliations
[1] MIT, Cambridge, MA 02139 USA
Keywords
nondifferentiable optimization; convex programming; incremental subgradient methods; stochastic subgradient methods;
DOI
Not available
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research];
Discipline Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
We consider a class of subgradient methods for minimizing a convex function that consists of the sum of a large number of component functions. This type of minimization arises in a dual context from Lagrangian relaxation of the coupling constraints of large-scale separable problems. The idea is to perform the subgradient iteration incrementally, by sequentially taking steps along the subgradients of the component functions, with intermediate adjustment of the variables after processing each component function. This incremental approach has been very successful in solving large differentiable least squares problems, such as those arising in the training of neural networks, and it has resulted in a much better practical rate of convergence than the steepest descent method. In this paper, we present convergence results and estimates of the convergence rate of a number of variants of incremental subgradient methods, including some that use randomization. The convergence rate estimates are consistent with our computational results, and suggest that the randomized variants perform substantially better than their deterministic counterparts.
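To make the incremental idea concrete: instead of one full subgradient step x_{k+1} = x_k - alpha_k * g_k with g_k a subgradient of the whole sum f(x) = sum_i f_i(x), the method updates x after each component, stepping along a subgradient of a single f_i. Below is a minimal Python sketch of the cyclic and randomized variants described in the abstract; the function name `incremental_subgradient`, its interface, and the toy problem are illustrative assumptions, not the paper's own code.

```python
import numpy as np

def incremental_subgradient(subgrads, x0, steps, order="cyclic", seed=0):
    """Incremental subgradient sketch for minimizing f(x) = sum_i f_i(x).

    subgrads : list of callables; subgrads[i](x) returns a subgradient of f_i at x
    x0       : initial point (array-like)
    steps    : iterable of stepsizes alpha_k (e.g. diminishing)
    order    : "cyclic" sweeps the components in order;
               "randomized" picks one uniformly at random each step
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = len(subgrads)
    for k, alpha in enumerate(steps):
        i = k % m if order == "cyclic" else int(rng.integers(m))
        # Intermediate adjustment: x is updated after every component function,
        # not after a full pass through all m components.
        x = x - alpha * np.asarray(subgrads[i](x))
    return x

# Toy example (assumed for illustration): f(x) = |x-1| + |x-2| + |x-10|,
# whose minimizer is the median of {1, 2, 10}, i.e. 2.0.
bs = [1.0, 2.0, 10.0]
subgrads = [lambda x, b=b: np.sign(x - b) for b in bs]
x = incremental_subgradient(subgrads, x0=[0.0],
                            steps=[1.0 / (k + 1) for k in range(3000)],
                            order="randomized")
print(x)  # close to 2.0
```

With a diminishing stepsize the iterates settle near the minimizer; switching `order` between "cyclic" and "randomized" lets one observe the practical difference between the deterministic and randomized variants that the paper's rate estimates compare.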
Pages: 223 - 264
Page count: 42
Related Papers
50 records in total (10 shown)
  • [1] ON THE CONVERGENCE RATE OF INCREMENTAL AGGREGATED GRADIENT ALGORITHMS
    Gurbuzbalaban, M.
    Ozdaglar, A.
    Parrilo, P. A.
    SIAM JOURNAL ON OPTIMIZATION, 2017, 27 (02) : 1035 - 1048
  • [2] INCREMENTAL STOCHASTIC SUBGRADIENT ALGORITHMS FOR CONVEX OPTIMIZATION
    Ram, S. Sundhar
    Nedic, A.
    Veeravalli, V. V.
    SIAM JOURNAL ON OPTIMIZATION, 2009, 20 (02) : 691 - 717
  • [3] Convergence of approximate and incremental subgradient methods for convex optimization
    Kiwiel, KC
    SIAM JOURNAL ON OPTIMIZATION, 2004, 14 (03) : 807 - 840
  • [4] String-averaging incremental stochastic subgradient algorithms
    Oliveira, R. M.
    Helou, E. S.
    Costa, E. F.
    OPTIMIZATION METHODS & SOFTWARE, 2019, 34 (03) : 665 - 692
  • [5] Subgradient-Push Is of the Optimal Convergence Rate
    Lin, Yixuan
    Liu, Ji
    2022 IEEE 61ST CONFERENCE ON DECISION AND CONTROL (CDC), 2022, : 5849 - 5856
  • [6] ON THE CONVERGENCE OF BROADCAST INCREMENTAL ALGORITHMS WITH APPLICATIONS
    Liu, Liya
    Petrusel, Adrian
    Qin, Xiaolong
    Yao, Jen-Chih
    FIXED POINT THEORY, 2024, 25 (02) : 635 - 666
  • [7] Incremental subgradient algorithms with dynamic step sizes for separable convex optimizations
    Yang, Dan
    Wang, Xiangmei
    MATHEMATICAL METHODS IN THE APPLIED SCIENCES, 2023, 46 (06) : 7108 - 7124
  • [8] CONVERGENCE ANALYSIS OF INCREMENTAL AND PARALLEL LINE SEARCH SUBGRADIENT METHODS IN HILBERT SPACE
    Hishinuma, Kazuhiro
    Iiduka, Hideaki
    JOURNAL OF NONLINEAR AND CONVEX ANALYSIS, 2019, 20 (09) : 1937 - 1947
  • [9] CONVERGENCE RATE OF INCREMENTAL GRADIENT AND INCREMENTAL NEWTON METHODS
    Gurbuzbalaban, M.
    Ozdaglar, A.
    Parrilo, P. A.
    SIAM JOURNAL ON OPTIMIZATION, 2019, 29 (04) : 2542 - 2565
  • [10] Convergence Rate and Convergence of Genetic Algorithms
    LIU Feng
    LIU Guizhong
    ZHANG Zhuosheng
    JOURNAL OF SYSTEMS SCIENCE AND SYSTEMS ENGINEERING, 1999, (01) : 73 - 81