Communication-Efficient Federated DNN Training: Convert, Compress, Correct

Cited: 0
Authors
Chen, Zhong-Jing [1 ]
Hernandez, Eduin E. [2 ]
Huang, Yu-Chih [1 ]
Rini, Stefano [1 ]
Affiliations
[1] Natl Yang Ming Chiao Tung Univ, Inst Commun Engn, Hsinchu 30010, Taiwan
[2] Natl Yang Ming Chiao Tung Univ, Dept Elect & Elect Engn, Hsinchu 30010, Taiwan
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, No. 24
Keywords
Training; Quantization (signal); Artificial neural networks; Convergence; Vectors; Stochastic processes; Optimization; Deep neural network (DNN) training; error feedback (EF); federated learning (FL); gradient compression; gradient modeling
DOI
10.1109/JIOT.2024.3456857
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In the federated training of a deep neural network (DNN), model updates are transmitted from the remote users to the parameter server (PS). In many scenarios of practical relevance, one is interested in reducing the communication overhead to enhance training efficiency. To address this challenge, we introduce CO3. CO3 takes its name from the three processing steps applied to reduce the communication load when transmitting the local DNN gradients from the remote users to the PS, namely: 1) gradient quantization through floating-point conversion; 2) lossless compression of the quantized gradient; and 3) correction of the quantization error. We carefully design each of the steps above to ensure good training performance under a constraint on the communication rate. In particular, in steps 1) and 2), we adopt the assumption that DNN gradients are distributed according to a generalized normal distribution, which is validated numerically in this article. For step 3), we utilize error feedback with a memory decay mechanism to correct the quantization error introduced in step 1). We argue that the memory decay coefficient, similar to the learning rate, can be optimally tuned to improve convergence. A rigorous convergence analysis of the proposed CO3 with stochastic gradient descent (SGD) is provided. Moreover, with extensive simulations, we show that CO3 offers improved performance compared with existing gradient compression schemes in the literature that employ sketching and nonuniform quantization of the local gradients.
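The three steps in the abstract can be illustrated with a minimal Python sketch. Everything below is an illustrative assumption rather than the paper's implementation: the hypothetical float_quantize rounds mantissas via np.frexp as a stand-in for the floating-point conversion of step 1); zlib stands in for the lossless code that CO3 matches to the assumed generalized normal gradient statistics in step 2); and placing the decay coefficient lam on the residual carried between rounds is one plausible reading of the memory decay error feedback of step 3). The Laplace draws in the usage loop mimic heavy-tailed gradients (a generalized normal distribution with shape parameter 1).

```python
import zlib
import numpy as np

def float_quantize(x: np.ndarray, man_bits: int = 3) -> np.ndarray:
    """Round each entry to a float with `man_bits` mantissa bits.

    Stand-in for step 1 (floating-point conversion); a full
    implementation would also bound the exponent range.
    """
    m, e = np.frexp(x)                       # x = m * 2**e, 0.5 <= |m| < 1
    m_q = np.round(m * 2.0**man_bits) / 2.0**man_bits
    return np.ldexp(m_q, e)

def lossless_compress(q: np.ndarray) -> bytes:
    """Stand-in for step 2: generic lossless coding of the quantized gradient.

    CO3 designs the code around the generalized normal gradient model;
    zlib is used here only as a placeholder lossless coder.
    """
    return zlib.compress(q.astype(np.float16).tobytes())

def co3_round(grad, mem, lam=0.9, man_bits=3):
    """One round of convert / compress / correct at a remote user.

    `lam` is the memory decay coefficient applied to the quantization
    residual carried into the next round (tuned like a learning rate).
    """
    corrected = grad + mem                   # step 3: apply error feedback
    q = float_quantize(corrected, man_bits)  # step 1: convert
    payload = lossless_compress(q)           # step 2: compress for uplink
    mem = lam * (corrected - q)              # step 3: decayed residual memory
    return q, payload, mem

# Toy usage with heavy-tailed synthetic gradients.
rng = np.random.default_rng(0)
mem = np.zeros(10_000)
for t in range(3):
    grad = rng.laplace(scale=1e-2, size=10_000)
    q, payload, mem = co3_round(grad, mem)
    print(f"round {t}: {len(payload)} compressed bytes, "
          f"residual norm {np.linalg.norm(mem):.4f}")
```

In an actual federated round, grad would be the user's local gradient and the PS would decompress and dequantize the payload before aggregation; the sketch only tracks the uplink payload size and the residual memory.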
Pages: 40431-40447
Page count: 17
Related Papers
50 in total
  • [31] Communication-Efficient Wireless Traffic Prediction with Federated Learning
    Gao, Fuwei
    Zhang, Chuanting
    Qiao, Jingping
    Li, Kaiqiang
    Cao, Yi
    MATHEMATICS, 2024, 12 (16)
  • [32] Communication-Efficient Consensus Mechanism for Federated Reinforcement Learning
    Xu, Xing
    Li, Rongpeng
    Zhao, Zhifeng
    Zhang, Honggang
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022: 80 - 85
  • [33] Communication-Efficient Federated Learning With Binary Neural Networks
    Yang, Yuzhi
    Zhang, Zhaoyang
    Yang, Qianqian
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2021, 39 (12) : 3836 - 3850
  • [34] Communication-efficient Federated Learning with Cooperative Filter Selection
    Yang, Zhao
    Sun, Qingshuang
    2022 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 22), 2022, : 2172 - 2176
  • [35] Communication-Efficient Federated Learning with Adaptive Consensus ADMM
    He, Siyi
    Zheng, Jiali
    Feng, Minyu
    Chen, Yixin
    APPLIED SCIENCES-BASEL, 2023, 13 (09)
  • [36] Communication-Efficient Federated Learning With Gradual Layer Freezing
    Malan, Erich
    Peluso, Valentino
    Calimera, Andrea
    Macii, Enrico
    IEEE EMBEDDED SYSTEMS LETTERS, 2023, 15 (01) : 25 - 28
  • [37] Communication-Efficient Federated Learning Based on Compressed Sensing
    Li, Chengxi
    Li, Gang
    Varshney, Pramod K.
    IEEE INTERNET OF THINGS JOURNAL, 2021, 8 (20) : 15531 - 15541
  • [38] Communication-Efficient Semihierarchical Federated Analytics in IoT Networks
    Zhao, Liang
    Valero, Maria
    Pouriyeh, Seyedamin
    Li, Lei
    Sheng, Quan Z.
    IEEE INTERNET OF THINGS JOURNAL, 2021, 9 (14) : 12614 - 12627
  • [39] On the Convergence of Communication-Efficient Local SGD for Federated Learning
    Gao, Hongchang
    Xu, An
    Huang, Heng
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 7510 - 7518
  • [40] A Cooperative Analysis to Incentivize Communication-Efficient Federated Learning
    Li, Youqi
    Li, Fan
    Yang, Song
    Zhang, Chuan
    Zhu, Liehuang
    Wang, Yu
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (10) : 10175 - 10190