Multi-Consensus Decentralized Accelerated Gradient Descent

Cited: 0
Authors
Ye, Haishan [1 ]
Luo, Luo [2 ]
Zhou, Ziang [3 ]
Zhang, Tong [4 ]
Affiliations
[1] Xi An Jiao Tong Univ, Ctr Intelligent Decis Making & Machine Learning, Sch Management, Xian, Peoples R China
[2] Fudan Univ, Sch Data Sci, Shanghai, Peoples R China
[3] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Peoples R China
[4] Hong Kong Univ Sci & Technol, Comp Sci & Math, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
consensus optimization; decentralized algorithm; accelerated gradient descent; gradient tracking; composite optimization; DISTRIBUTED OPTIMIZATION; LINEAR CONVERGENCE; ALGORITHMS; COMMUNICATION; ITERATIONS; EXTRA;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation and Computer Technology]
Discipline Classification Code
0812
Abstract
This paper considers the decentralized convex optimization problem, which has a wide range of applications in large-scale machine learning, sensor networks, and control theory. We propose novel algorithms that achieve optimal computation complexity and near-optimal communication complexity. Our theoretical results give an affirmative answer to the open problem of whether there exists an algorithm whose communication complexity (nearly) matches the lower bound depending on the global condition number rather than the local one. Furthermore, the linear convergence of our algorithms depends only on the strong convexity of the global objective and does not require the individual local functions to be convex. The design of our methods relies on a novel integration of well-known techniques, including Nesterov's acceleration, multi-consensus, and gradient tracking. Empirical studies show that our methods outperform existing approaches in machine learning applications.
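To make the three ingredients concrete, the Python sketch below combines them on a toy decentralized least-squares problem: each node takes a Nesterov-extrapolated gradient step using a gradient tracker (an estimate of the network-average gradient), and a multi-consensus subroutine applies the gossip matrix several times per iteration to tighten agreement across nodes. The ring topology, step size, momentum, and number of gossip rounds are all illustrative assumptions; this is a simplified instance of the general recipe, not the paper's exact algorithm.

import numpy as np

rng = np.random.default_rng(0)

# Toy problem: node i holds a private least-squares objective
# f_i(x) = ||A_i x - b_i||^2 / (2 m); the network minimizes their average.
n_nodes, dim, m = 5, 3, 10
A = [rng.standard_normal((m, dim)) for _ in range(n_nodes)]
b = [rng.standard_normal(m) for _ in range(n_nodes)]

def local_grad(i, x):
    return A[i].T @ (A[i] @ x - b[i]) / m

# Doubly stochastic gossip matrix W for a ring topology (hypothetical choice).
W = 0.5 * np.eye(n_nodes)
for i in range(n_nodes):
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

def multi_consensus(X, rounds=3):
    # Several gossip rounds per iteration drive every row of X
    # toward the network-wide average (the multi-consensus step).
    for _ in range(rounds):
        X = W @ X
    return X

lr, momentum, iters = 0.05, 0.6, 300
X = np.zeros((n_nodes, dim))      # local iterates, one row per node
X_prev = X.copy()
grad_prev = np.stack([local_grad(i, X[i]) for i in range(n_nodes)])
G = grad_prev.copy()              # gradient trackers

for _ in range(iters):
    Y = X + momentum * (X - X_prev)        # Nesterov-style extrapolation
    X_prev = X
    X = multi_consensus(Y - lr * G)        # local descent step, then averaging
    grad_new = np.stack([local_grad(i, X[i]) for i in range(n_nodes)])
    # Gradient tracking: mix the trackers, then correct with the
    # change in each node's local gradient.
    G = multi_consensus(G) + grad_new - grad_prev
    grad_prev = grad_new

# After the loop, all rows of X should be close to each other and to the
# global least-squares solution.

Increasing the number of gossip rounds trades extra communication for tighter agreement between nodes; intuitively, this multi-consensus step is part of what allows the complexity bounds to depend on the global rather than the local condition number.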
Pages: 50
Related Papers
50 records in total
  • [31] Gao, Hongchang; Duan, Yubin; Zhang, Yihan; Wu, Jie. Decentralized Stochastic Compositional Gradient Descent for AUPRC Maximization. Proceedings of the 2024 SIAM International Conference on Data Mining (SDM), 2024: 226-234.
  • [32] Jiang, Yiming; Kang, Helei; Liu, Jinlan; Xu, Dongpo. On the Convergence of Decentralized Stochastic Gradient Descent With Biased Gradients. IEEE Transactions on Signal Processing, 2025, 73: 549-558.
  • [33] Du, Yubo; You, Keyou. Asynchronous Stochastic Gradient Descent Over Decentralized Datasets. IEEE Transactions on Control of Network Systems, 2021, 8(3): 1212-1224.
  • [34] Du, Yubo; You, Keyou; Mo, Yilin. Asynchronous Stochastic Gradient Descent over Decentralized Datasets. 2020 IEEE 16th International Conference on Control & Automation (ICCA), 2020: 216-221.
  • [35] Guo, Shangwei; Zhang, Tianwei; Yu, Han; Xie, Xiaofei; Ma, Lei; Xiang, Tao; Liu, Yang. Byzantine-Resilient Decentralized Stochastic Gradient Descent. IEEE Transactions on Circuits and Systems for Video Technology, 2022, 32(6): 4096-4106.
  • [36] Sun, Tao; Li, Dongsheng; Wang, Bao. On the Decentralized Stochastic Gradient Descent With Markov Chain Sampling. IEEE Transactions on Signal Processing, 2023, 71: 2895-2909.
  • [37] Sun, Tao; Li, Dongsheng; Wang, Bao. Adaptive Random Walk Gradient Descent for Decentralized Optimization. International Conference on Machine Learning, Vol. 162, 2022.
  • [38] Ramaswamy, Arunselvan. DSPG: Decentralized Simultaneous Perturbations Gradient Descent Scheme. 2020 28th Euromicro International Conference on Parallel, Distributed and Network-Based Processing (PDP 2020), 2020: 54-62.
  • [39] Monaco, S.; Celsi, L. Ricciardi. Authors' reply to "Comments on 'On multi-consensus and almost equitable graph partitions'". Automatica, 2024, 167.
  • [40] Han, Guang-Song; He, Ding-Xin; Guan, Zhi-Hong; Hu, Bin; Li, Tao; Liao, Rui-Quan. Multi-consensus of multi-agent systems with various intelligences using switched impulsive protocols. Information Sciences, 2016, 349: 188-198.