SPARQ-SGD: Event-Triggered and Compressed Communication in Decentralized Optimization

Cited by: 0
Authors:
Singh, Navjot [1 ]
Data, Deepesh [1 ]
George, Jemin [2 ]
Diggavi, Suhas [1 ]
Affiliations:
[1] Univ Calif Los Angeles, Dept Elect & Comp Engn, Los Angeles, CA 90095 USA
[2] US Army, Res Lab, Adelphi, MD USA
DOI: 10.1109/cdc42340.2020.9303828
Chinese Library Classification: TP [Automation technology; computer technology]
Subject Classification Code: 0812
Abstract
In this paper, we propose and analyze SPARQ-SGD, an event-triggered and compressed algorithm for decentralized training of large-scale machine learning models over a graph. Each node can locally compute a condition (event) that triggers a communication in which quantized and sparsified local model parameters are sent. In SPARQ-SGD, each node first takes a fixed number of local gradient steps and then checks whether its model parameters have changed significantly since its last update; it communicates further-compressed model parameters only when there is a significant change, as specified by a (design) criterion. We prove that SPARQ-SGD converges at rates O(1/(nT)) and O(1/√(nT)) in the strongly convex and non-convex settings, respectively, demonstrating that aggressive compression, including event-triggered communication, model sparsification, and quantization, does not affect the overall convergence rate compared to uncompressed decentralized training, thereby theoretically yielding communication efficiency for 'free'. We evaluate SPARQ-SGD on real datasets to demonstrate significant savings in communicated bits over the state of the art.
Pages: 3449-3456 (8 pages)
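The abstract above outlines the algorithm's three ingredients: a fixed number of local SGD steps at each node, an event-triggered check on how much the local model has drifted from the copy last shared with neighbors, and a compressed (sparsified and quantized) gossip exchange when the trigger fires. The NumPy sketch below illustrates that loop on a toy problem; the ring topology, quadratic objectives, top-k/sign compressor, fixed trigger threshold, and bit accounting are all illustrative assumptions, not the paper's exact operators, trigger schedule, or guarantees.

```python
"""Minimal sketch of the SPARQ-SGD loop described in the abstract.
Everything here (toy objectives, compressor, constants) is assumed for
illustration and is not the paper's exact algorithmic specification."""
import numpy as np

rng = np.random.default_rng(0)

n, d = 8, 20                 # nodes, model dimension
H = 5                        # local SGD steps between trigger checks
eta, gamma = 0.05, 0.4       # SGD step size, consensus step size
threshold = 1e-2             # event-trigger threshold (assumed fixed here)
k = 5                        # top-k sparsification level

# Toy strongly convex local objectives: f_i(x) = 0.5 * ||A_i x - b_i||^2
A = rng.standard_normal((n, d, d)) / np.sqrt(d)
b = rng.standard_normal((n, d))

def local_grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

def compress(v, k):
    """Top-k sparsification followed by a sign/mean-magnitude quantizer (illustrative)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = np.sign(v[idx]) * np.mean(np.abs(v[idx]))
    return out

# Ring topology: each node mixes with itself and its two neighbors
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0

x = np.zeros((n, d))         # local models
x_hat = np.zeros((n, d))     # public copies last shared with neighbors
bits_sent = 0

for t in range(200):
    # 1) H local gradient steps at every node
    for _ in range(H):
        for i in range(n):
            x[i] -= eta * local_grad(i, x[i])

    # 2) Event-triggered, compressed communication
    for i in range(n):
        delta = x[i] - x_hat[i]
        if np.linalg.norm(delta) > threshold:   # trigger fires: model changed enough
            x_hat[i] += compress(delta, k)      # neighbors refresh their copy of node i
            bits_sent += k * 32                 # crude accounting of payload size

    # 3) Consensus step driven by the compressed public copies
    x += gamma * (W @ x_hat - x_hat)

print("consensus disagreement:", np.linalg.norm(x - x.mean(axis=0)))
print("approx. bits communicated:", bits_sent)
```

The public copy x_hat plays the role of the estimate that neighbors track: a node only spends communication bits to refresh it when the trigger condition detects a significant change, which is how the event-triggered and compression savings compound in this sketch.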
Related Papers (showing 10 of 50)
  • [1] SPARQ-SGD: Event-Triggered and Compressed Communication in Decentralized Optimization
    Singh, Navjot
    Data, Deepesh
    George, Jemin
    Diggavi, Suhas
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2023, 68 (02) : 721 - 736
  • [2] Decentralized ADMM with compressed and event-triggered communication
    Zhang, Zhen
    Yang, Shaofu
    Xu, Wenying
    NEURAL NETWORKS, 2023, 165 : 472 - 482
  • [3] Decentralized Online Convex Optimization With Event-Triggered Communications
    Cao, Xuanyu
    Basar, Tamer
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 284 - 299
  • [4] A Bayesian Optimization Approach to Decentralized Event-Triggered Control
    Hashimoto, Kazumune
    Kishida, Masako
    Yoshimura, Yuichi
    Ushio, Toshimitsu
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2021, E104A (02) : 447 - 454
  • [5] Distributed Nonconvex Optimization With Event-Triggered Communication
    Xu, Lei
    Yi, Xinlei
    Shi, Yang
    Johansson, Karl H.
    Chai, Tianyou
    Yang, Tao
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2024, 69 (04) : 2745 - 2752
  • [6] Decentralized Event-Triggered Federated Learning with Heterogeneous Communication Thresholds
    Zehtabi, Shahryar
    Hosseinalipour, Seyyedali
    Brinton, Christopher G.
    2022 IEEE 61ST CONFERENCE ON DECISION AND CONTROL (CDC), 2022, : 4680 - 4687
  • [7] Communication Schemes for Centralized and Decentralized Event-Triggered Control Systems
    Kartakis, Sokratis
    Fu, Anqi
    Mazo, Manuel, Jr.
    McCann, Julie A.
    IEEE TRANSACTIONS ON CONTROL SYSTEMS TECHNOLOGY, 2018, 26 (06) : 2035 - 2048
  • [8] Decentralized periodic event-triggered control with quantization and asynchronous communication
    Fu, Anqi
    Mazo, Manuel, Jr.
    AUTOMATICA, 2018, 94 : 294 - 299
  • [9] On the Accelerated Convergence of the Decentralized Event-triggered Algorithm for Convex Optimization
    Zhang, Keke
    Xiong, Jiang
    Dai, Xiangguang
    INTERNATIONAL JOURNAL ON ARTIFICIAL INTELLIGENCE TOOLS, 2021, 30 (01)
  • [10] Agent-Supervisor Coordination for Decentralized Event-Triggered Optimization
    Srivastava, Priyank
    Cavraro, Guido
    Cortes, Jorge
    IEEE CONTROL SYSTEMS LETTERS, 2022, 6 : 1970 - 1975