Two-stage model fusion scheme based on knowledge distillation for stragglers in federated learning

Cited: 0
Authors
Xu, Jiuyun [1 ]
Li, Xiaowen [1 ]
Zhu, Kongshang [1 ]
Zhou, Liang [1 ]
Zhao, Yingzhi [1 ]
Affiliations
[1] China Univ Petr East China, Qingdao Inst Software, Coll Comp Sci & Technol, 66 Changjiang West Rd, Qingdao 266580, Peoples R China
Keywords
Federated learning; Straggler problem; Knowledge distillation; Heterogeneity; Training efficiency;
DOI
10.1007/s13042-024-02436-5
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning (FL), as an emerging distributed learning paradigm, enables devices (also called clients) storing local data to participate collaboratively in a training task without the data leaving the devices, thereby integrating multiparty data while meeting privacy protection requirements. However, in real-world environments the participating clients are autonomous entities with heterogeneous capabilities and unstable network connections, so FL is plagued by stragglers whenever intermediate training results are exchanged synchronously. To this end, this paper proposes a new FL scheme with a two-stage fusion process based on knowledge distillation, which transfers the knowledge of straggler models to the global model without delaying training, thus balancing efficiency and model performance. We have evaluated the proposed algorithm, FedTd, on three popular datasets. The experimental results show that FedTd improves training efficiency and maintains good model accuracy compared to baseline methods under heterogeneous conditions, exhibiting strong robustness against stragglers. With our approach, running time is accelerated by 1.97-3.32× under scenarios with a higher level of data heterogeneity.
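The abstract does not give implementation details, but the two-stage idea it describes can be illustrated with a short PyTorch sketch. This is a minimal sketch under our own assumptions: the function names (`fuse_on_time`, `distill_straggler`), the FedAvg-style first stage, the proxy dataset, and the KL-divergence distillation loss are illustrative choices, not necessarily the paper's exact method.

```python
# Hypothetical sketch (PyTorch): stage one averages the on-time clients,
# stage two distills a late straggler's model into the fused global model
# so its knowledge is not discarded. Names and losses are assumptions,
# not the paper's published algorithm.
import copy
import torch
import torch.nn.functional as F

def fuse_on_time(global_model, client_states, weights):
    """Stage one: FedAvg-style weighted average of the clients that
    reported in time (weights are assumed to sum to 1)."""
    fused = copy.deepcopy(client_states[0])
    for key in fused:
        fused[key] = sum(w * s[key] for w, s in zip(weights, client_states))
    global_model.load_state_dict(fused)
    return global_model

def distill_straggler(global_model, straggler_model, proxy_loader,
                      temperature=2.0, lr=1e-3):
    """Stage two: soft-label distillation of a straggler (teacher) into
    the global model (student) on a proxy dataset, so the synchronous
    round never waits for slow clients."""
    optimizer = torch.optim.SGD(global_model.parameters(), lr=lr)
    straggler_model.eval()
    for inputs, _ in proxy_loader:
        with torch.no_grad():
            teacher = F.softmax(straggler_model(inputs) / temperature, dim=1)
        student = F.log_softmax(global_model(inputs) / temperature, dim=1)
        # Standard Hinton-style KD loss, scaled by T^2.
        loss = F.kl_div(student, teacher, reduction="batchmean") * temperature**2
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return global_model
```

The intended benefit, per the abstract, is that aggregation proceeds as soon as the on-time clients report, while a straggler that arrives late still contributes its knowledge through soft labels rather than stale parameters.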
Pages: 17
Related papers
50 records in total
  • [1] Decentralized Two-Stage Federated Learning with Knowledge Transfer
    Jin, Tong
    Chen, Siguang
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 3181 - 3186
  • [2] A Two-Stage Differential Privacy Scheme for Federated Learning Based on Edge Intelligence
    Zhang, Li
    Xu, Jianbo
    Sivaraman, Audithan
    Lazarus, Jegatha Deborah
    Sharma, Pradip Kumar
    Pandi, Vijayakumar
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2024, 28 (06) : 3349 - 3360
  • [3] Two-stage personalized federated learning based on sparse pretraining
    Liu, Tong
    Xie, Kaixuan
    Kong, Yi
    Chen, Guojun
    Xu, Yinfei
    Xin, Lun
    Yu, Fei
    ELECTRONICS LETTERS, 2023, 59 (17)
  • [4] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020, : 163 - 167
  • [5] Two-Stage Client Selection Scheme for Blockchain-Enabled Federated Learning in IoT
    Jin, Xiaojun
    Ma, Chao
    Luo, Song
    Zeng, Pengyi
    Wei, Yifei
    CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 81 (02): : 2317 - 2336
  • [6] A novel staged training strategy leveraging knowledge distillation and model fusion for heterogeneous federated learning
    Wang, Debao
    Guan, Shaopeng
    Sun, Ruikang
    JOURNAL OF NETWORK AND COMPUTER APPLICATIONS, 2025, 236
  • [7] Two-stage Federated Phenotyping and Patient Representation Learning
    Liu, Dianbo
    Dligach, Dmitriy
    Miller, Timothy
    SIGBIOMED WORKSHOP ON BIOMEDICAL NATURAL LANGUAGE PROCESSING (BIONLP 2019), 2019, : 283 - 291
  • [8] FedTSA: A Cluster-Based Two-Stage Aggregation Method for Model-Heterogeneous Federated Learning
    Fan, Boyu
    Wu, Chenrui
    Su, Xiang
    Hui, Pan
    COMPUTER VISION - ECCV 2024, PT LXXXIII, 2025, 15141 : 370 - 389
  • [9] Two-Stage Approach for Targeted Knowledge Transfer in Self-Knowledge Distillation
    Yin, Zimo
    Pu, Jian
    Zhou, Yijie
    Xue, Xiangyang
    IEEE-CAA JOURNAL OF AUTOMATICA SINICA, 2024, 11 (11) : 2270 - 2283
  • [10] A Decentralized Federated Learning Based on Node Selection and Knowledge Distillation
    Zhou, Zhongchang
    Sun, Fenggang
    Chen, Xiangyu
    Zhang, Dongxu
    Han, Tianzhen
    Lan, Peng
    MATHEMATICS, 2023, 11 (14)