Fed2Com: Towards Efficient Compression in Federated Learning

Cited: 0
Authors
Zhang, Yu [1 ]
Lin, Wei [1 ]
Chen, Sisi [1 ]
Song, Qingyu [1 ]
Lu, Jiaxun [2 ]
Shao, Yunfeng [2 ]
Yu, Bei [1 ]
Xu, Hong [1 ]
Affiliations
[1] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Peoples R China
[2] Huawei Technol, Noah's Ark Lab, Shenzhen, Peoples R China
Keywords
Federated learning; Communication compression; Non-i.i.d. data;
DOI
10.1109/CNC59896.2024.10556165
CLC Number
TP [Automation Technology & Computer Technology];
Discipline Code
0812;
Abstract
Federated learning (FL) is a distributed machine learning paradigm that enables multiple clients to collaboratively train a model without sacrificing data privacy. In recent years, various biased compression techniques have been proposed to alleviate the communication bottleneck in FL. However, these approaches rely on an ideal setting in which all clients participate in every round and continuously send their local errors to the cloud server. In this paper, we design a communication-efficient algorithmic framework called Fed2Com for FL with non-i.i.d. datasets. In particular, Fed2Com has a two-level structure: on the client side, it leverages unbiased compression methods, e.g., rand-k sparsification, to compress the upload communication, avoiding leaving residual errors at the client. On the server side, Fed2Com applies biased compressors, e.g., top-k sparsification, with error correction to compress the download communication while stabilizing the training process. Fed2Com achieves a high compression ratio while maintaining robust performance under data heterogeneity. We conduct extensive experiments on the MNIST, CIFAR10, Sentiment140, and PersonaChat datasets, and the evaluation results demonstrate the effectiveness of Fed2Com.
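The two compression primitives named in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: rand-k keeps k uniformly random coordinates and rescales by d/k so the compressor is unbiased in expectation (hence no client-side error memory is needed), while top-k keeps the k largest-magnitude coordinates and, being biased, is paired with an error-feedback residual carried across rounds. Function names and the NumPy formulation are illustrative assumptions.

```python
import numpy as np

def rand_k(x, k, rng):
    """Unbiased rand-k sparsification: keep k uniformly random
    coordinates, scaled by d/k so E[rand_k(x)] == x."""
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)
    return out

def top_k_with_error_feedback(x, k, error):
    """Biased top-k sparsification with error correction: compress the
    error-corrected vector and keep the residual for the next round."""
    corrected = x + error
    idx = np.argpartition(np.abs(corrected), -k)[-k:]  # k largest |entries|
    out = np.zeros_like(corrected)
    out[idx] = corrected[idx]
    return out, corrected - out  # compressed vector, new residual
```

In this sketch a client would upload `rand_k(update, k, rng)`, while the server would broadcast `top_k_with_error_feedback(aggregate, k, residual)` and retain the returned residual for the next round; the residual guarantees that `out + new_error` always equals the error-corrected input.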
Pages: 560 - 566
Page count: 7
Related Papers
50 records
  • [1] Towards Efficient Decentralized Federated Learning
    Pappas, Christodoulos
    Papadopoulos, Dimitrios
    Chatzopoulos, Dimitris
    Panagou, Eleni
    Lalis, Spyros
    Vavalis, Manolis
    2022 IEEE 42ND INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS WORKSHOPS (ICDCSW), 2022, : 79 - 85
  • [2] Model Compression for Communication Efficient Federated Learning
    Shah, Suhail Mohmad
    Lau, Vincent K. N.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (09) : 5937 - 5951
  • [3] Towards Efficient Replay in Federated Incremental Learning
    Li, Yichen
    Li, Qunwei
    Wang, Haozhao
    Li, Ruixuan
    Zhong, Wenliang
    Zhang, Guannan
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2024, : 12820 - 12829
  • [4] Intrinsic Gradient Compression for Scalable and Efficient Federated Learning
    Melas-Kyriazi, Luke
    Wang, Franklyn
    PROCEEDINGS OF THE FIRST WORKSHOP ON FEDERATED LEARNING FOR NATURAL LANGUAGE PROCESSING (FL4NLP 2022), 2022, : 27 - 41
  • [5] Efficient Client Sampling with Compression in Heterogeneous Federated Learning
    Marnissi, Ouiame
    El Hammouti, Hajar
    Bergou, El Houcine
    IEEE INFOCOM 2024-IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS, INFOCOM WKSHPS 2024, 2024,
  • [6] Ternary Compression for Communication-Efficient Federated Learning
    Xu, Jinjin
    Du, Wenli
    Jin, Yaochu
    He, Wangli
    Cheng, Ran
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (03) : 1162 - 1176
  • [7] Marvel: Towards Efficient Federated Learning on IoT Devices
    Liu, Libin
    Xu, Xiuting
    COMPUTER NETWORKS, 2024, 245
  • [8] Towards efficient communications in federated learning: A contemporary survey
    Zhao, Zihao
    Mao, Yuzhu
    Liu, Yang
    Song, Linqi
    Ouyang, Ye
    Chen, Xinlei
    Ding, Wenbo
    JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 2023, 360 (12): : 8669 - 8703
  • [9] Dual Adaptive Compression for Efficient Communication in Heterogeneous Federated Learning
    Mao, Yingchi
    Wang, Zibo
    Li, Chenxin
    Zhang, Jiakai
    Xu, Shufang
    Wu, Jie
    2024 IEEE 24TH INTERNATIONAL SYMPOSIUM ON CLUSTER, CLOUD AND INTERNET COMPUTING, CCGRID 2024, 2024, : 236 - 244
  • [10] Towards Efficient and Privacy-preserving Federated Deep Learning
    Hao, Meng
    Li, Hongwei
    Xu, Guowen
    Liu, Sen
    Yang, Haomiao
    ICC 2019 - 2019 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), 2019,