Fed2Com: Towards Efficient Compression in Federated Learning

Cited by: 0
Authors
Zhang, Yu [1 ]
Lin, Wei [1 ]
Chen, Sisi [1 ]
Song, Qingyu [1 ]
Lu, Jiaxun [2 ]
Shao, Yunfeng [2 ]
Yu, Bei [1 ]
Xu, Hong [1 ]
Affiliations
[1] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Peoples R China
[2] Huawei Technol, Noahs Ark Lab, Shenzhen, Peoples R China
Source
2024 INTERNATIONAL CONFERENCE ON COMPUTING, NETWORKING AND COMMUNICATIONS, ICNC | 2024
Keywords
Federated learning; Communication compression; Non-i.i.d data
DOI
10.1109/CNC59896.2024.10556165
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Federated learning (FL) is a distributed machine learning paradigm that enables multiple clients to collaboratively train a model without sacrificing data privacy. In recent years, various biased compression techniques have been proposed to alleviate the communication bottleneck in FL. However, these approaches rely on an idealized setting in which all clients participate and continuously send their local compression errors to the cloud server. In this paper, we design a communication-efficient algorithmic framework called Fed2Com for FL with non-i.i.d. datasets. Fed2Com has a two-level structure: on the client side, it leverages unbiased compressors, e.g., rand-k sparsification, to compress the uplink communication, avoiding the need to store error state at the clients. On the server side, Fed2Com applies biased compressors, e.g., top-k sparsification, with error correction to compress the downlink communication while stabilizing the training process. Fed2Com achieves a high compression ratio while remaining robust to data heterogeneity. We conduct extensive experiments on the MNIST, CIFAR10, Sentiment140, and PersonaChat datasets, and the evaluation results demonstrate the effectiveness of Fed2Com.
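The Python sketch below illustrates, under our own assumptions rather than the paper's released code, the two compressor families the abstract refers to: an unbiased rand-k sparsifier for the uplink (no error state kept at the client) and a biased top-k sparsifier with server-side error correction for the downlink. The function and class names (rand_k, top_k, ServerErrorFeedback) are hypothetical.

import numpy as np

def rand_k(x, k, rng):
    # Unbiased rand-k sparsification (client/uplink): keep k uniformly random
    # coordinates and rescale by d/k so that E[rand_k(x)] = x.
    d = x.size
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = x[idx] * (d / k)
    return out

def top_k(x, k):
    # Biased top-k sparsification (server/downlink): keep the k largest-magnitude
    # coordinates and zero out the rest.
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out

class ServerErrorFeedback:
    # Server-side error correction: the residual left by the biased compressor
    # is carried over and added to the next aggregated update before compression.
    def __init__(self, dim, k):
        self.k = k
        self.err = np.zeros(dim)

    def compress_broadcast(self, aggregated_update):
        corrected = aggregated_update + self.err
        compressed = top_k(corrected, self.k)
        self.err = corrected - compressed
        return compressed

In one round of this sketch, each client would upload rand_k(local_update, k, rng) without storing any residual, the server would average the sparse uploads, and ServerErrorFeedback.compress_broadcast would produce the compressed downlink message.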
Pages: 560-566
Page count: 7