Fed2Com: Towards Efficient Compression in Federated Learning

Cited by: 0
Authors
Zhang, Yu [1 ]
Lin, Wei [1 ]
Chen, Sisi [1 ]
Song, Qingyu [1 ]
Lu, Jiaxun [2 ]
Shao, Yunfeng [2 ]
Yu, Bei [1 ]
Xu, Hong [1 ]
Affiliations
[1] The Chinese University of Hong Kong, Department of Computer Science & Engineering, Hong Kong, People's Republic of China
[2] Huawei Technologies, Noah's Ark Lab, Shenzhen, People's Republic of China
Source
2024 International Conference on Computing, Networking and Communications (ICNC), 2024
Keywords
Federated learning; Communication compression; Non-i.i.d. data
DOI
10.1109/CNC59896.2024.10556165
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Federated learning (FL) is a distributed machine learning paradigm that enables multiple clients to collaboratively train a model without sacrificing data privacy. In recent years, various biased compression techniques have been proposed to alleviate the communication bottleneck in FL. However, these approaches rely on an ideal setting in which all clients participate in every round and continuously send their local compression errors to the cloud server. In this paper, we design a communication-efficient algorithmic framework called Fed2Com for FL with non-i.i.d. datasets. Fed2Com has a two-level structure: on the client side, it leverages unbiased compression methods, e.g., rand-k sparsification, to compress the upload communication, so that no compression error needs to be stored at the client; on the server side, it applies biased compressors, e.g., top-k sparsification, with error correction to compress the download communication while stabilizing the training process. Fed2Com achieves a high compression ratio while remaining robust to data heterogeneity. We conduct extensive experiments on the MNIST, CIFAR10, Sentiment140, and PersonaChat datasets, and the evaluation results demonstrate the effectiveness of Fed2Com.
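A minimal Python/NumPy sketch of the two-level compression scheme described in the abstract: unbiased rand-k sparsification on the client-to-server (upload) link, and biased top-k sparsification with an error-feedback buffer on the server-to-client (download) link. The names below (rand_k, TopKErrorFeedback) are illustrative and not taken from the paper, and the sketch assumes model updates are flattened 1-D vectors.

    import numpy as np

    def rand_k(update, k, rng):
        # Unbiased rand-k sparsification (client/upload side): keep k uniformly
        # random coordinates and rescale by d/k so the expectation equals `update`.
        d = update.size
        idx = rng.choice(d, size=k, replace=False)
        out = np.zeros_like(update)
        out[idx] = update[idx] * (d / k)
        return out

    class TopKErrorFeedback:
        # Biased top-k sparsification with error correction (server/download side):
        # coordinates that are not transmitted accumulate in an error buffer and
        # are added back before the next compression step.
        def __init__(self, dim, k):
            self.k = k
            self.error = np.zeros(dim)

        def compress(self, update):
            corrected = update + self.error                 # re-inject past error
            idx = np.argsort(np.abs(corrected))[-self.k:]   # k largest-magnitude entries
            out = np.zeros_like(corrected)
            out[idx] = corrected[idx]
            self.error = corrected - out                    # remember what was dropped
            return out

    # Toy round: one client uplink message, one server downlink message.
    rng = np.random.default_rng(0)
    grad = rng.standard_normal(1000)            # flattened local update
    uplink = rand_k(grad, k=100, rng=rng)       # unbiased, no error left at the client
    server = TopKErrorFeedback(dim=1000, k=100)
    downlink = server.compress(uplink)          # biased, error kept at the server

Keeping the error buffer only at the server reflects the abstract's motivation: clients may not participate in every round, whereas the server is always available to carry the error state forward.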
Pages: 560 - 566
Page count: 7