Revisiting Optimal Convergence Rate for Smooth and Non-convex Stochastic Decentralized Optimization

Cited by: 0
Authors
Yuan, Kun [1 ,3 ]
Huang, Xinmeng [2 ]
Chen, Yiming [1 ,4 ]
Zhang, Xiaohan [2 ]
Zhang, Yingya [1 ]
Pan, Pan [1 ]
Affiliations
[1] DAMO Acad, Alibaba Grp, Beijing, Peoples R China
[2] Univ Penn, Philadelphia, PA 19104 USA
[3] Peking Univ, Beijing, Peoples R China
[4] MetaCarbon, Beijing, Peoples R China
Keywords
DISTRIBUTED OPTIMIZATION;
DOI
Not available
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Decentralized optimization is effective in reducing communication in large-scale machine learning. Although numerous algorithms have been proposed with theoretical guarantees and empirical successes, the performance limits of decentralized optimization, especially the influence of the network topology and its associated weight matrix on the optimal convergence rate, are not yet fully understood. While Lu and Sa [44] recently provided an optimal rate for non-convex stochastic decentralized optimization with weight matrices defined over linear graphs, the optimal rate with general weight matrices remains unclear. This paper revisits non-convex stochastic decentralized optimization and establishes an optimal convergence rate with general weight matrices. In addition, we establish the optimal rate when the non-convex loss functions further satisfy the Polyak-Lojasiewicz (PL) condition. These results cannot be achieved by following existing lines of analysis in the literature. Instead, we leverage the Ring-Lattice graph, which admits general weight matrices while maintaining the optimal relation between the graph diameter and the weight matrix connectivity. Lastly, we develop a new decentralized algorithm that nearly attains the above two optimal rates under additional mild conditions.
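The interplay the abstract describes between topology and weight matrix can be made concrete with a small sketch: a doubly stochastic averaging matrix is built over a ring-lattice graph, and its connectivity is summarized by the second-largest singular value (the gap to 1 governs mixing speed in decentralized averaging). This is only an illustration under assumed conventions, not the paper's construction; the function name and the choice of uniform neighbour weights are assumptions.

```python
import numpy as np

def ring_lattice_weight_matrix(n, k):
    """Averaging matrix for a ring-lattice graph: each of the n nodes
    is linked to its k nearest neighbours on each side, plus a self-loop.
    Uniform weights make W symmetric and doubly stochastic."""
    W = np.zeros((n, n))
    deg = 2 * k + 1  # neighbours on both sides plus the node itself
    for i in range(n):
        for off in range(-k, k + 1):
            W[i, (i + off) % n] = 1.0 / deg
    return W

n, k = 16, 2
W = ring_lattice_weight_matrix(n, k)

# Connectivity of the weight matrix: second-largest singular value.
# 1 - beta is the spectral gap; smaller beta means faster consensus.
beta = np.linalg.svd(W, compute_uv=False)[1]
print(round(beta, 4))  # → 0.8524
```

Growing `k` densifies the lattice, shrinking both the graph diameter and `beta`; the abstract's point is that the ring-lattice family keeps the trade-off between these two quantities at its optimal rate while still admitting general weight matrices.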
Pages: 14