Two-stage model fusion scheme based on knowledge distillation for stragglers in federated learning

Cited: 0
Authors
Xu, Jiuyun [1 ]
Li, Xiaowen [1 ]
Zhu, Kongshang [1 ]
Zhou, Liang [1 ]
Zhao, Yingzhi [1 ]
Affiliation
[1] China Univ Petr East China, Qingdao Inst Software, Coll Comp Sci & Technol, 66 Changjiang West Rd, Qingdao 266580, Peoples R China
Keywords
Federated learning; Straggler problem; Knowledge distillation; Heterogeneity; Training efficiency
DOI
10.1007/s13042-024-02436-5
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning (FL), as an emerging distributed learning paradigm, enables devices (also called clients) that store local data to participate collaboratively in a training task without the data ever leaving the devices, aiming to integrate multiparty data while meeting privacy-protection requirements. However, in real-world environments the participating clients are autonomous entities with heterogeneous capabilities and unstable networks, so FL is plagued by stragglers when intermediate training results are exchanged synchronously. To this end, this paper proposes a new FL scheme, FedTd, with a two-stage fusion process based on knowledge distillation, which transfers the knowledge of straggler models into the global model without slowing down training, thus balancing efficiency and model performance. We evaluated the proposed algorithm on three popular datasets. The experimental results show that, under heterogeneous conditions, FedTd improves training efficiency and maintains good model accuracy compared with baseline methods, exhibiting strong robustness against stragglers. With our approach, the running time can be accelerated by 1.97-3.32× under scenarios with a higher level of data heterogeneity.
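The abstract does not give implementation details, but the two-stage fusion it describes can be sketched as follows: stage one averages the models of the on-time clients, and stage two distills the knowledge of the late (straggler) client models into the global model, so slow clients still contribute without stalling the synchronous round. The PyTorch sketch below is an illustrative assumption, not the paper's actual code; names such as average_weights, distill_stragglers, proxy_loader, and temperature are hypothetical.

```python
# Hedged sketch of a two-stage fusion in the spirit of the abstract.
# Assumptions (not from the paper): stage 1 is FedAvg-style weight averaging
# of on-time clients; stage 2 distills an ensemble of straggler models into
# the global model via KL divergence on soft logits over a small proxy set.
import copy
import torch
import torch.nn.functional as F


def average_weights(state_dicts):
    """Stage 1: element-wise average of the on-time clients' model weights."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(dim=0)
    return avg


def distill_stragglers(global_model, straggler_models, proxy_loader,
                       temperature=2.0, lr=1e-3, epochs=1):
    """Stage 2: transfer straggler knowledge into the global model.

    The averaged logits of the straggler models act as the teacher; the
    global model is the student and is updated with a KL-divergence loss,
    so stragglers contribute without delaying synchronous aggregation.
    """
    optimizer = torch.optim.SGD(global_model.parameters(), lr=lr)
    global_model.train()
    for model in straggler_models:
        model.eval()

    for _ in range(epochs):
        for x, _ in proxy_loader:            # labels unused: pure distillation
            with torch.no_grad():
                teacher_logits = torch.stack(
                    [m(x) for m in straggler_models]).mean(dim=0)
            student_logits = global_model(x)
            loss = F.kl_div(
                F.log_softmax(student_logits / temperature, dim=1),
                F.softmax(teacher_logits / temperature, dim=1),
                reduction="batchmean") * temperature ** 2
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return global_model
```

Under these assumptions, a round would load the stage-1 average into the global model and then call distill_stragglers with whichever client models arrived late, keeping the round's wall-clock time bounded by the on-time clients.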
Pages: 17
Related papers (50 in total)
  • [31] Jiang, Yingrui; Zhao, Xuejian; Li, Hao; Xue, Yu. A Personalized Federated Learning Method Based on Knowledge Distillation and Differential Privacy. ELECTRONICS, 2024, 13 (17).
  • [32] Chen, Leiming; Zhang, Weishan; Dong, Cihao; Zhao, Dehai; Zeng, Xingjie; Qiao, Sibo; Zhu, Yichang; Tan, Chee Wei. FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation. ENTROPY, 2024, 26 (01).
  • [33] Sun, Y.; Shi, Y.; Li, M.; Yang, R.; Si, P. Personalized Federated Learning Method Based on Collation Game and Knowledge Distillation. Dianzi Yu Xinxi Xuebao / Journal of Electronics and Information Technology, 2023, 45 (10): 3702-3709.
  • [34] Chen, Yu-Wen; Ke, Bo-Hsu; Chen, Bo-Zhong; Chiu, Si-Rong; Tu, Chun-Wei; Kuo, Jian-Jhih. Knowledge Distillation Based Defense for Audio Trigger Backdoor in Federated Learning. IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023: 4271-4276.
  • [35] Xu, Jiuyun; Zhou, Liang; Zhao, Yingzhi; Li, Xiaowen; Zhu, Kongshang; Xu, Xiangrui; Duan, Qiang; Zhang, Ruru. A two-stage federated learning method for personalization via selective collaboration. COMPUTER COMMUNICATIONS, 2025, 232.
  • [36] Lyu, Feng; Tang, Cheng; Deng, Yongheng; Liu, Tong; Zhang, Yongmin; Zhang, Yaoxue. A Prototype-Based Knowledge Distillation Framework for Heterogeneous Federated Learning. 2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023: 37-47.
  • [37] Psaltis, Athanasios; Chatzikonstantinou, Christos; Patrikakis, Charalampos Z.; Daras, Petros. FedRCIL: Federated Knowledge Distillation for Representation based Contrastive Incremental Learning. 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW, 2023: 3455-3464.
  • [38] Yang, Ze; Shou, Linjun; Gong, Ming; Lin, Wutao; Jiang, Daxin. Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System. PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020: 690-698.
  • [39] Ge, Huanhuan; Pokhrel, Shiva Raj; Liu, Zhenyu; Wang, Jinlong; Li, Gang. PFL-DKD: Modeling decoupled knowledge fusion with distillation for improving personalized federated learning. COMPUTER NETWORKS, 2024, 254.
  • [40] Chen, Zheyi; Tian, Pu; Liao, Weixian; Chen, Xuhui; Xu, Guobin; Yu, Wei. Resource-Aware Knowledge Distillation for Federated Learning. IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2023, 11 (03): 706-719.