Communication-Efficient Federated DNN Training: Convert, Compress, Correct

Cited: 0
Authors:
Chen, Zhong-Jing [1 ]
Hernandez, Eduin E. [2 ]
Huang, Yu-Chih [1 ]
Rini, Stefano [1 ]
Affiliations:
[1] Natl Yang Ming Chiao Tung Univ, Inst Commun Engn, Hsinchu 30010, Taiwan
[2] Natl Yang Ming Chiao Tung Univ, Dept Elect & Elect Engn, Hsinchu 30010, Taiwan
Source:
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, No. 24
Keywords:
Training; Quantization (signal); Artificial neural networks; Convergence; Vectors; Stochastic processes; Optimization; Deep neural network (DNN) training; error feedback (EF); federated learning (FL); gradient compression; gradient modeling
DOI:
10.1109/JIOT.2024.3456857
Chinese Library Classification (CLC):
TP [automation technology; computer technology]
Subject Classification Code:
0812
Abstract:
In the federated training of a deep neural network (DNN), model updates are transmitted from the remote users to the parameter server (PS). In many scenarios of practical relevance, one is interested in reducing the communication overhead to enhance training efficiency. To address this challenge, we introduce CO3. CO3 takes its name from the three processing steps applied to reduce the communication load when transmitting the local DNN gradients from the remote users to the PS, namely: 1) gradient quantization through floating-point conversion; 2) lossless compression of the quantized gradient; and 3) correction of the quantization error. We carefully design each of the steps above to ensure good training performance under a constraint on the communication rate. In particular, in steps 1) and 2), we adopt the assumption that DNN gradients are distributed according to a generalized normal distribution, which is validated numerically in this article. For step 3), we utilize error feedback with a memory decay mechanism to correct the quantization error introduced in step 1). We argue that the memory decay coefficient, similar to the learning rate, can be optimally tuned to improve convergence. A rigorous convergence analysis of the proposed CO3 with stochastic gradient descent (SGD) is provided. Moreover, with extensive simulations, we show that CO3 offers improved performance compared with existing gradient compression schemes proposed in the literature that employ sketching and nonuniform quantization of the local gradients.
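The gradient model named in the abstract presumably refers to the standard generalized normal (generalized Gaussian) family; the exact parameterization used in the paper is not given in this record, but for reference the usual density with location $\mu$, scale $\alpha > 0$, and shape $\beta > 0$ is

$$ f(x) = \frac{\beta}{2\alpha\,\Gamma(1/\beta)} \exp\!\left(-\left(\frac{|x-\mu|}{\alpha}\right)^{\beta}\right), $$

which recovers the Laplace distribution at $\beta = 1$ and the Gaussian at $\beta = 2$.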
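Since only the abstract is available here, the following Python sketch merely illustrates the three steps it describes for a single user and round. The function name co3_round, the float16 conversion, the zlib coder, and the decay rule are all illustrative assumptions, not the authors' implementation; in particular, the paper designs an entropy code under the generalized normal gradient model, for which zlib stands in as a generic lossless coder.

```python
# Minimal sketch of the three CO3 steps described in the abstract:
# (1) convert: quantize the gradient via low-precision floating-point
#     conversion; (2) compress: losslessly compress the quantized values;
# (3) correct: error feedback with memory decay.
import zlib
import numpy as np

def co3_round(grad: np.ndarray, memory: np.ndarray, decay: float = 0.9):
    """One communication round for a single user (hypothetical API)."""
    # Step 3 (correct): add the decayed quantization-error memory.
    corrected = grad + decay * memory

    # Step 1 (convert): quantize by casting to a low-precision float.
    quantized = corrected.astype(np.float16)

    # Step 2 (compress): losslessly compress the quantized bytes.
    payload = zlib.compress(quantized.tobytes())

    # Update the error memory with what quantization discarded.
    new_memory = corrected - quantized.astype(np.float32)
    return payload, new_memory

# Usage: the PS would decompress and de-quantize each payload.
rng = np.random.default_rng(0)
g = rng.standard_normal(1000).astype(np.float32)
mem = np.zeros_like(g)
payload, mem = co3_round(g, mem)
decoded = np.frombuffer(zlib.decompress(payload), dtype=np.float16)
print(len(payload), float(np.abs(decoded.astype(np.float32) - g).mean()))
```

Per the abstract, the decay coefficient plays a role analogous to a learning rate and can be tuned for convergence; the value 0.9 above is only a placeholder.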
Pages: 40431-40447
Page count: 17