Communication-Efficient Distributed SGD With Compressed Sensing

Cited by: 4
Authors
Tang, Yujie [1 ]
Ramanathan, Vikram [1 ]
Zhang, Junshan [2 ]
Li, Na [1 ]
Affiliations
[1] Harvard Univ, Sch Engn & Appl Sci, Allston, MA 02134 USA
[2] Arizona State Univ, Sch Elect Comp & Energy Engn, Tempe, AZ 85287 USA
Source
IEEE CONTROL SYSTEMS LETTERS | 2022, Vol. 6
Funding
U.S. National Science Foundation (NSF);
Keywords
Servers; Compressed sensing; Sensors; Stochastic processes; Sparse matrices; Optimization; Convergence; Optimization algorithms; Large-scale systems; Distributed optimization
DOI
10.1109/LCSYS.2021.3137859
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
We consider large-scale distributed optimization over a set of edge devices connected to a central server, where the limited communication bandwidth between the server and the edge devices imposes a significant bottleneck on the optimization procedure. Inspired by recent advances in federated learning, we propose a distributed stochastic gradient descent (SGD) type algorithm that exploits the sparsity of the gradient, when possible, to reduce the communication burden. At the heart of the algorithm, compressed sensing techniques are used to compress the local stochastic gradients at the device side, and a sparse approximation of the global stochastic gradient is then recovered at the server side from the noisy aggregated compressed local gradients. We provide a theoretical analysis of the convergence of our algorithm in the presence of noise perturbations incurred by the communication channels, and corroborate its effectiveness with numerical experiments.
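
Illustration (not taken from the paper): the short Python sketch below walks through one communication round of the idea described in the abstract. It assumes a Gaussian measurement matrix shared between the devices and the server, uses orthogonal matching pursuit (OMP) as a stand-in for the sparse-recovery step, and picks arbitrary dimensions and noise levels; the paper's own compression and recovery procedures may differ.

# Illustrative sketch only (not the authors' exact algorithm or parameters):
# each device projects its local stochastic gradient with a shared Gaussian
# measurement matrix, the server averages the noisy measurements, recovers a
# sparse approximation of the global gradient via orthogonal matching pursuit
# (OMP), and takes an SGD step. All dimensions, the noise model, and the
# choice of OMP as the recovery routine are assumptions made for this sketch.
import numpy as np

rng = np.random.default_rng(0)
d = 1000          # model dimension
m = 200           # number of compressed measurements (m << d)
k = 10            # assumed sparsity level of the global gradient
num_devices = 10
lr = 0.1

# Measurement matrix shared by the devices and the server (e.g., via a common seed).
A = rng.standard_normal((m, d)) / np.sqrt(m)

def compress(local_grad):
    """Device side: compress a d-dimensional gradient into m measurements."""
    return A @ local_grad

def omp_recover(y, sparsity):
    """Server side: recover a sparse x with A @ x ~= y via OMP."""
    residual, support = y.copy(), []
    for _ in range(sparsity):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef          # update the residual
    x = np.zeros(d)
    x[support] = coef
    return x

# One communication round on synthetic data: the global gradient is k-sparse,
# and each device observes a noisy local version of it.
support = rng.choice(d, size=k, replace=False)
g_global = np.zeros(d)
g_global[support] = rng.standard_normal(k)
local_grads = [g_global + 0.01 * rng.standard_normal(d) for _ in range(num_devices)]

y_agg = np.mean([compress(g) for g in local_grads], axis=0)
y_agg += 0.01 * rng.standard_normal(m)               # channel noise seen by the server
g_hat = omp_recover(y_agg, k)                        # sparse approx. of global gradient

x_model = rng.standard_normal(d)                     # current model iterate
x_model = x_model - lr * g_hat                       # SGD update broadcast back to devices
print("relative recovery error:", np.linalg.norm(g_hat - g_global) / np.linalg.norm(g_global))
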
Pages: 2054-2059
Number of pages: 6
Related Papers
50 records in total
  • [41] Communication-Efficient Distributed Mining of Association Rules
    Assaf Schuster
    Ran Wolff
    Data Mining and Knowledge Discovery, 2004, 8 : 171 - 196
  • [42] Communication-Efficient Distributed PCA by Riemannian Optimization
    Huang, Long-Kai
    Pan, Sinno Jialin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [43] Double Quantization for Communication-Efficient Distributed Optimization
    Huang, Longbo
     PROCEEDINGS OF THE 13TH EAI INTERNATIONAL CONFERENCE ON PERFORMANCE EVALUATION METHODOLOGIES AND TOOLS (VALUETOOLS 2020), 2020: 2 - 2
  • [44] Communication-Efficient Distributed Dual Coordinate Ascent
    Jaggi, Martin
    Smith, Virginia
    Takac, Martin
    Terhorst, Jonathan
    Krishnan, Sanjay
    Hofmann, Thomas
     Jordan, Michael I.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [45] Communication-Efficient Distributed Optimization with Quantized Preconditioners
    Alimisis, Foivos
    Davies, Peter
    Alistarh, Dan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [46] Communication-Efficient Δ-Stepping for Distributed Computing Systems
    Zhang, Haomeng
    Xie, Junfei
    Zhang, Xinyu
    2023 19TH INTERNATIONAL CONFERENCE ON WIRELESS AND MOBILE COMPUTING, NETWORKING AND COMMUNICATIONS, WIMOB, 2023, : 369 - 374
  • [47] Communication-efficient distributed mining of association rules
    Schuster, A
    Wolff, R
    SIGMOD RECORD, 2001, 30 (02) : 473 - 484
  • [48] Communication-efficient Massively Distributed Connected Components
    Lamm, Sebastian
    Sanders, Peter
    2022 IEEE 36TH INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM (IPDPS 2022), 2022, : 302 - 312
  • [49] More communication-efficient distributed sparse learning
    Zhou, Xingcai
    Yang, Guang
    INFORMATION SCIENCES, 2024, 668
  • [50] Communication-efficient Conformal Prediction for Distributed Datasets
    Riquelme-Granada, Nery
    Luo, Zhiyuan
    Khuong An Nguyen
    CONFORMAL AND PROBABILISTIC PREDICTION WITH APPLICATIONS, VOL 179, 2022, 179