Asynchronous Decentralized Accelerated Stochastic Gradient Descent

Cited by: 6
Authors
Lan G. [1 ]
Zhou Y. [2 ]
Affiliations
[1] Department of Industrial and Systems Engineering, Georgia Institute of Technology, Atlanta, GA 30332
[2] AI Security and Privacy Solution, IBM Almaden Research Center, San Jose, CA 95120
Source
IEEE Journal on Selected Areas in Information Theory (Institute of Electrical and Electronics Engineers Inc.), 2021, Vol. 2, Issue 2. Corresponding author: Zhou, Yi (yi.zhou@ibm.com)
Keywords
decentralized control; distributed algorithms; gradient methods; machine learning algorithms
DOI
10.1109/JSAIT.2021.3080256
Abstract
In this paper, we introduce an asynchronous decentralized accelerated stochastic gradient descent type of algorithm for decentralized stochastic optimization. Since communication and synchronization costs are the major bottlenecks in decentralized optimization, we attempt to reduce these costs through algorithmic design; in particular, we reduce the number of agents involved in each round of update via randomization. Our major contribution is to develop a class of accelerated randomized decentralized algorithms for solving general convex composite problems. We establish $\mathcal{O}(1/\epsilon)$ (resp., $\mathcal{O}(1/\sqrt{\epsilon})$) communication complexity and $\mathcal{O}(1/\epsilon^2)$ (resp., $\mathcal{O}(1/\epsilon)$) sampling complexity for solving general convex (resp., strongly convex) problems. It is worth mentioning that the proposed algorithm depends only sublinearly on the Lipschitz constant when a smooth component is present in the objective function. Moreover, we conduct preliminary numerical experiments to demonstrate the advantages of our proposed algorithms over a state-of-the-art synchronous decentralized algorithm. © 2020 IEEE.
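To make the randomized-update idea from the abstract concrete, here is a minimal Python sketch of gossip-style decentralized SGD in which only one randomly activated agent and its neighbors communicate per round, instead of a full network synchronization. This is an illustrative simplification under stated assumptions, not the paper's accelerated algorithm; the function decentralized_sgd, the mixing matrix W, and all parameter values are hypothetical.

import numpy as np

# Sketch: randomized (gossip-style) decentralized SGD. Each round activates
# one agent at random; only that agent and its neighbors in the mixing
# matrix W exchange information, which is what cuts per-round communication.
# This illustrates the general idea only, not the accelerated method of the
# paper; all names and parameters below are assumptions for exposition.

def decentralized_sgd(grad_fns, W, x0, steps=2000, lr=0.05, rng=None):
    """grad_fns: list of n stochastic gradient oracles, one per agent.
    W: (n, n) nonnegative doubly stochastic mixing matrix (the graph).
    x0: (d,) common starting point. Returns the average local iterate."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(grad_fns)
    x = np.tile(x0, (n, 1))                  # each agent keeps a local copy
    for _ in range(steps):
        i = rng.integers(n)                  # randomly activate one agent
        nbrs = np.nonzero(W[i])[0]           # only its neighbors take part
        # one local communication round: weighted averaging with neighbors
        x[i] = W[i, nbrs] @ x[nbrs] / W[i, nbrs].sum()
        # one local stochastic gradient step
        x[i] = x[i] - lr * grad_fns[i](x[i])
    return x.mean(axis=0)

# Toy usage: n agents, agent i holds f_i(x) = ||x - b_i||^2 / 2, so the
# network-wide minimizer is the mean of the b_i.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d = 5, 3
    b = rng.normal(size=(n, d))
    grads = [lambda x, bi=b[i]: (x - bi) + 0.01 * rng.normal(size=d)
             for i in range(n)]              # noisy gradients of f_i
    W = np.full((n, n), 1.0 / n)             # fully connected, uniform
    x_hat = decentralized_sgd(grads, W, np.zeros(d), rng=rng)
    print("estimate:", x_hat)
    print("target:  ", b.mean(axis=0))

Note the design point this sketch shares with the abstract: the cost of one round scales with the degree of the activated agent, not with the network size, which is the sense in which randomization reduces communication and synchronization overhead.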
Pages: 802-811
Page count: 9
Related Papers (50 in total)
  • [1] Asynchronous Decentralized Parallel Stochastic Gradient Descent
    Lian, Xiangru
    Zhang, Wei
    Zhang, Ce
    Liu, Ji
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [2] Asynchronous Stochastic Gradient Descent Over Decentralized Datasets
    Du, Yubo
    You, Keyou
    IEEE TRANSACTIONS ON CONTROL OF NETWORK SYSTEMS, 2021, 8 (03): 1212 - 1224
  • [3] Asynchronous Stochastic Gradient Descent over Decentralized Datasets
    Du, Yubo
    You, Keyou
    Mo, Yilin
    2020 IEEE 16TH INTERNATIONAL CONFERENCE ON CONTROL & AUTOMATION (ICCA), 2020, : 216 - 221
  • [4] Decentralized Asynchronous Stochastic Gradient Descent: Convergence Rate Analysis
    Bedi, Amrit Singh
    Pradhan, Hrusikesha
    Rajawat, Ketan
    2018 INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS (SPCOM 2018), 2018, : 402 - 406
  • [5] HogWild++: A New Mechanism for Decentralized Asynchronous Stochastic Gradient Descent
    Zhang, Huan
    Hsieh, Cho-Jui
    Akella, Venkatesh
    2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2016, : 629 - 638
  • [6] Stochastic modified equations for the asynchronous stochastic gradient descent
    An, Jing
    Lu, Jianfeng
    Ying, Lexing
    INFORMATION AND INFERENCE-A JOURNAL OF THE IMA, 2020, 9 (04) : 851 - 873
  • [7] Asynchronous Stochastic Gradient Descent with Delay Compensation
    Zheng, Shuxin
    Meng, Qi
    Wang, Taifeng
    Chen, Wei
    Yu, Nenghai
    Ma, Zhi-Ming
    Liu, Tie-Yan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [8] ASYNCHRONOUS STOCHASTIC GRADIENT DESCENT FOR DNN TRAINING
    Zhang, Shanshan
    Zhang, Ce
    You, Zhao
    Zheng, Rong
    Xu, Bo
    2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 6660 - 6663
  • [9] Practical Efficiency of Asynchronous Stochastic Gradient Descent
    Bhardwaj, Onkar
    Cong, Guojing
    PROCEEDINGS OF 2016 2ND WORKSHOP ON MACHINE LEARNING IN HPC ENVIRONMENTS (MLHPC), 2016, : 56 - 62
  • [10] A(DP)²SGD: Asynchronous Decentralized Parallel Stochastic Gradient Descent With Differential Privacy
    Xu, Jie
    Zhang, Wei
    Wang, Fei
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (11) : 8036 - 8047