SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives

Cited by: 0
Authors
Defazio, Aaron [1 ]
Bach, Francis [2 ]
Lacoste-Julien, Simon [2 ]
Affiliations
[1] Australian Natl Univ, Ambiata, Canberra, ACT, Australia
[2] Ecole Normale Super, Sierra Project Team, INRIA, Paris, France
Funding
European Research Council;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
In this work we introduce a new optimisation method called SAGA in the spirit of SAG, SDCA, MISO and SVRG, a set of recently proposed incremental gradient algorithms with fast linear convergence rates. SAGA improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and has support for composite objectives where a proximal operator is used on the regulariser. Unlike SDCA, SAGA supports non-strongly convex problems directly, and is adaptive to any inherent strong convexity of the problem. We give experimental results showing the effectiveness of our method.
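Algorithm Sketch
To make the update rule concrete, below is a minimal sketch of SAGA applied to a lasso-style problem, minimising (1/n) * sum_i 0.5*(x_i.w - y_i)^2 + lam*||w||_1. The step structure (fresh gradient of a randomly chosen f_j, minus its stored gradient, plus the average of the stored table, followed by a proximal step on the regulariser) is the one the abstract describes; the least-squares losses, the step size gamma, and the weight lam are illustrative assumptions, not values taken from the paper.

import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def saga(X, y, gamma=0.01, lam=0.1, n_iters=10000, seed=0):
    # Sketch of SAGA for (1/n) * sum_i 0.5*(x_i.w - y_i)^2 + lam*||w||_1.
    # gamma, lam and the least-squares loss are illustrative choices.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    # Table holding the most recently evaluated gradient of each f_i.
    grads = (X @ w - y)[:, None] * X
    grad_avg = grads.mean(axis=0)
    for _ in range(n_iters):
        j = rng.integers(n)
        g_new = (X[j] @ w - y[j]) * X[j]         # fresh gradient of f_j at w
        v = g_new - grads[j] + grad_avg          # SAGA gradient estimate
        w = prox_l1(w - gamma * v, gamma * lam)  # proximal step on the regulariser
        grad_avg += (g_new - grads[j]) / n       # keep the table average current
        grads[j] = g_new
    return w

Unlike SAG's biased direction, the SAGA estimate g_new - grads[j] + grad_avg is an unbiased estimate of the full gradient, which is central to the paper's improved convergence analysis.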
Pages: 9
Related Papers
50 records in total
  • [1] Distributed Stochastic Optimization with Compression for Non-Strongly Convex Objectives
    Li, Xuanjie
    Xu, Yuedong
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2024, 139 (01): 459 - 481
  • [2] An aggressive reduction on the complexity of optimization for non-strongly convex objectives
    Luo, Zhijian
    Chen, Siyu
    Hou, Yueen
    Gao, Yanzeng
    Qian, Yuntao
    INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2023, 21 (05)
  • [3] Convergence Results of a Nested Decentralized Gradient Method for Non-strongly Convex Problems
    Choi, Woocheol
    Kim, Doheon
    Yun, Seok-Bae
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2022, 195 (01) : 172 - 204
  • [4] Detection and Isolation of Adversaries in Decentralized Optimization for Non-Strongly Convex Objectives
    Ravi, Nikhil
    Scaglione, Anna
    IFAC PAPERSONLINE, 2019, 52 (20): 381 - 386
  • [5] Accelerated proximal incremental algorithm schemes for non-strongly convex functions
    Panahi, Ashkan
    Chehreghani, Morteza Haghir
    Dubhashi, Devdatt
    THEORETICAL COMPUTER SCIENCE, 2020, 812 : 203 - 213
  • [6] Fast Distributed Coordinate Descent for Non-Strongly Convex Losses
    Fercoq, Olivier
    Qu, Zheng
    Richtarik, Peter
    Takac, Martin
    2014 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2014
  • [7] Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization
    Hien, Le Thi Khanh
    Nguyen, Cuong V.
    Xu, Huan
    Lu, Canyi
    Feng, Jiashi
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2019, 181 (02) : 541 - 566
  • [8] Linearly convergent away-step conditional gradient for non-strongly convex functions
    Beck, Amir
    Shtern, Shimrit
    MATHEMATICAL PROGRAMMING, 2017, 164 (1-2) : 1 - 27