SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives

Cited: 0
Authors
Defazio, Aaron [1 ]
Bach, Francis [2 ]
Lacoste-Julien, Simon [2 ]
Affiliations
[1] Australian Natl Univ, Ambiata, Canberra, ACT, Australia
[2] Ecole Normale Super, Sierra Project Team, INRIA, Paris, France
Funding
European Research Council
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this work we introduce a new optimisation method called SAGA in the spirit of SAG, SDCA, MISO and SVRG, a set of recently proposed incremental gradient algorithms with fast linear convergence rates. SAGA improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and has support for composite objectives where a proximal operator is used on the regulariser. Unlike SDCA, SAGA supports non-strongly convex problems directly, and is adaptive to any inherent strong convexity of the problem. We give experimental results showing the effectiveness of our method.
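The abstract's description can be made concrete. Below is a minimal Python sketch of the SAGA update rule (a table of per-sample gradients, a bias-corrected variance-reduced direction, and a proximal step on the regulariser), applied to an illustrative l2-regularized least-squares problem. The problem choice, function name saga, step size, and epoch count are assumptions for illustration, not part of this record.

```python
import numpy as np

def saga(A, b, gamma, lam, n_epochs=20, seed=0):
    """Sketch of SAGA for ridge regression:
        min_x (1/(2n)) * ||A x - b||^2 + (lam/2) * ||x||^2
    with f_i(x) = 0.5 * (a_i^T x - b_i)^2 and grad f_i(x) = a_i * (a_i^T x - b_i);
    the l2 term is the composite part, handled by its proximal operator."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    # Table of the most recently evaluated gradient for every sample,
    # plus its running average (kept up to date in O(d) per step).
    grad_table = A * (A @ x - b)[:, None]   # shape (n, d)
    grad_avg = grad_table.mean(axis=0)
    for _ in range(n_epochs * n):
        j = rng.integers(n)
        g_new = A[j] * (A[j] @ x - b[j])    # fresh gradient of f_j at x
        # Unbiased, variance-reduced direction: fresh gradient minus the
        # stored one for sample j, recentred by the table average.
        v = g_new - grad_table[j] + grad_avg
        w = x - gamma * v
        # Proximal step for h(x) = (lam/2)||x||^2, which has a closed form.
        x = w / (1.0 + gamma * lam)
        # Update the running average before overwriting the table entry.
        grad_avg += (g_new - grad_table[j]) / n
        grad_table[j] = g_new
    return x

# Illustrative usage on random data (all values are assumptions):
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10))
b = rng.standard_normal(200)
L = np.max(np.sum(A**2, axis=1))            # smoothness bound for the f_i
x_hat = saga(A, b, gamma=1.0 / (3.0 * L), lam=0.1)
```

As a rough guide, the paper's analysis allows a step size of about 1/(3L) for L-smooth components without assuming strong convexity; the proximal step here is a closed-form scaling only because the regulariser is a simple quadratic, and any regulariser with a computable proximal operator fits the same template.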
Pages: 9
Related Papers (50 total)
  • [41] A DUAL BREGMAN PROXIMAL GRADIENT METHOD FOR RELATIVELY-STRONGLY CONVEX OPTIMIZATION
    Liu, Jin-Zan
    Liu, Xin-Wei
    NUMERICAL ALGEBRA, CONTROL AND OPTIMIZATION, 2022, 12 (04): 679 - 692
  • [42] Limited Memory Kelley's Method Converges for Composite Convex and Submodular Objectives
    Zhou, Song
    Gupta, Swati
    Udell, Madeleine
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [43] Adaptive Stochastic Gradient Descent Method for Convex and Non-Convex Optimization
    Chen, Ruijuan
    Tang, Xiaoquan
    Li, Xiuting
    FRACTAL AND FRACTIONAL, 2022, 6 (12)
  • [45] Fast projected gradient method for support vector machines
    Bloom, Veronica
    Griva, Igor
    Quijada, Fabio
    OPTIMIZATION AND ENGINEERING, 2016, 17 (04): 651 - 662
  • [46] Kill a Bird with Two Stones: Closing the Convergence Gaps in Non-Strongly Convex Optimization by Directly Accelerated SVRG with Double Compensation and Snapshots
    Liu, Yuanyuan
    Shang, Fanhua
    An, Weixin
    Liu, Hongying
    Lin, Zhouchen
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [47] A FAST DUAL GRADIENT METHOD FOR SEPARABLE CONVEX OPTIMIZATION VIA SMOOTHING
    Li, Jueyou
    Wu, Zhiyou
    Wu, Changzhi
    Long, Qiang
    Wang, Xiangyu
    Lee, Jae-Myung
    Jung, Kwang-Hyo
    PACIFIC JOURNAL OF OPTIMIZATION, 2016, 12 (02): 289+
  • [48] A perturbation-incremental method for strongly non-linear oscillators
    Chan, HSY
    Chung, KW
    Xu, Z
    INTERNATIONAL JOURNAL OF NON-LINEAR MECHANICS, 1996, 31 (01) : 59 - 72
  • [49] Delayed Weighted Gradient Method with simultaneous step-sizes for strongly convex optimization
    Lara, Hugo
    Aleixo, Rafael
    Oviedo, Harry
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2024, 89 (01) : 151 - 182
  • [50] MGPROX: A NONSMOOTH MULTIGRID PROXIMAL GRADIENT METHOD WITH ADAPTIVE RESTRICTION FOR STRONGLY CONVEX OPTIMIZATION
    Ang, Andersen
    de Sterck, Hans
    Vavasis, Stephen
    SIAM JOURNAL ON OPTIMIZATION, 2024, 34 (03) : 2788 - 2820