SAFARI: Sparsity-Enabled Federated Learning With Limited and Unreliable Communications

Cited by: 10
Authors
Mao, Yuzhu [1 ]
Zhao, Zihao [1 ]
Yang, Meilin [1 ]
Liang, Le [2 ,3 ]
Liu, Yang [4 ,5 ]
Ding, Wenbo [1 ,5 ,6 ]
Lan, Tian [7 ]
Zhang, Xiao-Ping [1 ,8 ]
Affiliations
[1] Tsinghua Univ, Tsinghua Shenzhen Int Grad Sch, Tsinghua Berkeley Shenzhen Inst, Beijing 100190, Peoples R China
[2] Southeast Univ, Natl Mobile Commun Res Lab, Nanjing, Peoples R China
[3] Purple Mt Labs, Nanjing 211111, Peoples R China
[4] Tsinghua Univ, Inst AI Ind Res AIR, Beijing 100190, Peoples R China
[5] Shanghai AI Lab, Shanghai 200241, Peoples R China
[6] RISC V Int Open Source Lab, Shenzhen 518055, Peoples R China
[7] George Washington Univ, Dept Elect & Comp Engn, Washington, DC 20052 USA
[8] Ryerson Univ, Dept Elect Comp & Biomed Engn, Toronto, ON M5B 2K3, Canada
Funding
National Key Research and Development Program of China;
Keywords
Training; Reliability; Computational modeling; Servers; Convergence; Data models; Federated learning; Distributed networks; federated learning; unreliable communication; model sparsification;
DOI
10.1109/TMC.2023.3296624
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Federated learning (FL) enables edge devices to collaboratively learn a model in a distributed fashion. Much existing research has focused on improving the communication efficiency of high-dimensional models and on addressing the bias caused by local updates. However, most FL algorithms either assume reliable communications or assume fixed and known unreliability characteristics. In practice, networks may suffer from dynamic channel conditions and non-deterministic disruptions with time-varying and unknown characteristics. To this end, this paper proposes a sparsity-enabled FL framework, termed SAFARI, that both improves communication efficiency and reduces bias. SAFARI exploits the similarity among client models to rectify and compensate for the bias that results from unreliable communications. More precisely, sparse learning is implemented on local clients to reduce communication overhead, and a similarity-based compensation method provides surrogates for model updates lost to unreliable communications. We analyze the convergence of SAFARI for sparse models under a bounded-dissimilarity assumption and show that, even with unreliable communications, it is guaranteed to converge at the same rate as standard FedAvg with perfect communications. Implementations and evaluations on the CIFAR-10 dataset validate the effectiveness of SAFARI: it matches the convergence speed and accuracy of FedAvg with perfect communications even when up to 60% of the model weights are pruned and a high percentage of client updates is missing in each round.
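The two mechanisms the abstract describes (client-side sparsification and similarity-based surrogates for lost updates) can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the paper's algorithm: the pruning rule (magnitude pruning), the similarity measure (cosine similarity over last-round updates), and all function names here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune(update, sparsity=0.6):
    """Magnitude pruning: zero out the smallest-magnitude fraction of weights."""
    flat = np.abs(update).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return update
    thresh = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    return np.where(np.abs(update) <= thresh, 0.0, update)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def aggregate(updates, received, history):
    """Average client updates; for each missing client, substitute the update
    of the most similar received client (similarity from last-round updates)."""
    n = len(updates)
    surrogates = {}
    for i in range(n):
        if received[i]:
            surrogates[i] = updates[i]
        else:
            best = max((j for j in range(n) if received[j]),
                       key=lambda j: cosine(history[i], history[j]))
            surrogates[i] = updates[best]
    return np.mean([surrogates[i] for i in range(n)], axis=0)

# Toy round: 4 clients with 8-dimensional "model" updates.
history = [rng.normal(size=8) for _ in range(4)]        # last-round updates
updates = [prune(h + 0.1 * rng.normal(size=8)) for h in history]
received = [True, True, False, True]                    # client 2's update was lost
global_update = aggregate(updates, received, history)
print(global_update.shape)  # (8,)
```

With `sparsity=0.6`, at least 60% of each transmitted update is zeroed (reducing uplink traffic), and the dropped client still contributes to the round through its nearest-neighbor surrogate rather than silently biasing the average toward the received clients.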
Pages: 4819-4831
Page count: 13
Related Papers
50 records in total
  • [41] Blockchain-Enabled Federated Learning With Mechanism Design
    Toyoda, Kentaroh
    Zhao, Jun
    Zhang, Allan Neng Sheng
    Mathiopoulos, P. Takis
    IEEE ACCESS, 2020, 8 : 219744 - 219756
  • [42] V2V Communications Using Blockchain-Enabled 6G Technology and Federated Learning
    Ahmed, Tahir H.
    Tiang, Jun Jiat
    Mahmud, Azwan
    Dinh-Thuan Do
    Truong Tran
    Mumtaz, Shahid
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 1302 - 1307
  • [43] On Model Transmission Strategies in Federated Learning With Lossy Communications
    Su, Xiaoxin
    Zhou, Yipeng
    Cui, Laizhong
    Liu, Jiangchuan
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2023, 34 (04) : 1173 - 1185
  • [44] Design and Analysis of Uplink and Downlink Communications for Federated Learning
    Zheng, Sihui
    Shen, Cong
    Chen, Xiang
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2021), 2021,
  • [45] Design and Analysis of Uplink and Downlink Communications for Federated Learning
    Zheng, Sihui
    Shen, Cong
    Chen, Xiang
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2021, 39 (07) : 2150 - 2167
  • [46] Optimal Rate Adaption in Federated Learning with Compressed Communications
    Cui, Laizhong
    Su, Xiaoxin
    Zhou, Yipeng
    Liu, Jiangchuan
    IEEE CONFERENCE ON COMPUTER COMMUNICATIONS (IEEE INFOCOM 2022), 2022, : 1459 - 1468
  • [47] Joint Optimization of Communications and Federated Learning Over the Air
    Fan, Xin
    Wang, Yue
    Huo, Yan
    Tian, Zhi
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2022, 21 (06) : 4434 - 4449
  • [48] Trustworthy Semantic Communications for the Metaverse Relying on Federated Learning
    Chen, Jianrui
    Wang, Jingjing
    Jiang, Chunxiao
    Ren, Yong
    Hanzo, Lajos
    IEEE WIRELESS COMMUNICATIONS, 2023, 30 (04) : 18 - 25
  • [49] A state-of-the-art on federated learning for vehicular communications
    Maroua, Drissi
    VEHICULAR COMMUNICATIONS, 2024, 45
  • [50] Towards efficient communications in federated learning: A contemporary survey
    Zhao, Zihao
    Mao, Yuzhu
    Liu, Yang
    Song, Linqi
    Ouyang, Ye
    Chen, Xinlei
    Ding, Wenbo
    JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 2023, 360 (12): : 8669 - 8703