Two-stage model fusion scheme based on knowledge distillation for stragglers in federated learning

Cited: 0
Authors
Xu, Jiuyun [1 ]
Li, Xiaowen [1 ]
Zhu, Kongshang [1 ]
Zhou, Liang [1 ]
Zhao, Yingzhi [1 ]
Affiliations
[1] China Univ Petr East China, Qingdao Inst Software, Coll Comp Sci & Technol, 66 Changjiang West Rd, Qingdao 266580, Peoples R China
Keywords
Federated learning; Straggler problem; Knowledge distillation; Heterogeneity; Training efficiency;
DOI
10.1007/s13042-024-02436-5
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning (FL), an emerging distributed learning paradigm, enables devices (also called clients) that store local data to participate collaboratively in a training task without the data leaving the devices, integrating multiparty data while meeting privacy-protection requirements. However, in real-world environments the participating clients are autonomous entities with heterogeneous capabilities and unstable network connections, so FL is plagued by stragglers whenever intermediate training results are exchanged synchronously. To this end, this paper proposes FedTd, a new FL scheme with a two-stage fusion process based on knowledge distillation, which transfers the knowledge of straggler models to the global model without slowing down training, thus balancing efficiency and model performance. We evaluated the proposed algorithm on three popular datasets. The experimental results show that, compared to baseline methods under heterogeneous conditions, FedTd improves training efficiency while maintaining good model accuracy, exhibiting strong robustness against stragglers. With our approach, the running time can be accelerated by 1.97–3.32× under scenarios with a higher level of data heterogeneity.
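The record's abstract outlines the idea but not the algorithm. The following is a minimal, hypothetical PyTorch sketch of the core distillation step the abstract describes: treating a late-arriving straggler model as a teacher and softly transferring its knowledge into the global model on a small proxy dataset, so the synchronous round need not wait for it. All names (distill_straggler, proxy_loader, temperature) are illustrative assumptions, not the paper's actual FedTd procedure.

```python
# Hypothetical sketch: fold a straggler model's knowledge into the global
# model via temperature-scaled knowledge distillation on proxy data.
import torch
import torch.nn.functional as F

def distill_straggler(global_model, straggler_model, proxy_loader,
                      temperature=2.0, lr=1e-3, steps=100):
    """Use the straggler as a teacher so its knowledge reaches the global
    (student) model without delaying the synchronous aggregation round."""
    straggler_model.eval()
    global_model.train()
    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    it = iter(proxy_loader)
    for _ in range(steps):
        try:
            x, _ = next(it)          # proxy batch; labels are unused
        except StopIteration:
            it = iter(proxy_loader)  # restart the loader when exhausted
            x, _ = next(it)
        with torch.no_grad():
            teacher_logits = straggler_model(x)
        student_logits = global_model(x)
        # KL divergence between temperature-softened distributions,
        # scaled by T^2 to keep gradient magnitudes comparable.
        loss = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2
        opt.zero_grad()
        loss.backward()
        opt.step()
    return global_model
```

A higher temperature softens the teacher's output distribution, exposing relative class similarities rather than only the top prediction; T around 2-4 is a common choice in the distillation literature.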
Pages: 17
Related Papers
50 records in total
  • [21] Personalized Decentralized Federated Learning with Knowledge Distillation
    Jeong, Eunjeong
    Kountouris, Marios
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1982 - 1987
  • [22] FedDKD: Federated learning with decentralized knowledge distillation
    Li, Xinjia
    Chen, Boyu
    Lu, Wenlian
    APPLIED INTELLIGENCE, 2023, 53 (15) : 18547 - 18563
  • [24] A TWO-STAGE MODEL BASED ITERATIVE LEARNING CONTROL SCHEME FOR A CLASS OF MIMO MISMATCHED LINEAR SYSTEMS
    Chen, Wenjie
    Tomizuka, Masayoshi
    PROCEEDINGS OF THE ASME/ISCIE INTERNATIONAL SYMPOSIUM ON FLEXIBLE AUTOMATION, ISFA 2012, 2013, : 159 - 166
  • [25] Two-Stage Edge-Side Fault Diagnosis Method Based on Double Knowledge Distillation
    Yang, Yang
    Long, Yuhan
    Lin, Yijing
    Gao, Zhipeng
    Rui, Lanlan
    Yu, Peng
CMC-COMPUTERS MATERIALS & CONTINUA, 2023, 76 (03): 3623 - 3651
  • [26] TDCF: A two-stage deep learning based recommendation model
    Wang R.
    Cheng H.K.
    Jiang Y.
    Lou J.
    Wang, Ruiqin (wrq@zjhu.edu.cn), 1600, Elsevier Ltd (145):
  • [27] An Integrated Recommendation Model Based on Two-stage Deep Learning
    Wang R.
    Wu Z.
    Jiang Y.
    Lou J.
Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2019, 56 (08): 1661 - 1669
  • [28] Heterogeneous Federated Learning Framework for IIoT Based on Selective Knowledge Distillation
    Guo, Sheng
    Chen, Hui
    Liu, Yang
    Yang, Chengyi
    Li, Zengxiang
    Jin, Cheng Hao
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2025, 21 (02) : 1078 - 1089
  • [29] A Genetic-Algorithm-based Two-Stage Learning Scheme for Neural Networks
    Wang, Shuo
    Zhang, Xiaomeng
    Zheng, Xuanyan
    Yuan, Bingzhi
    2010 INTERNATIONAL CONFERENCE ON E-EDUCATION, E-BUSINESS, E-MANAGEMENT AND E-LEARNING: IC4E 2010, PROCEEDINGS, 2010, : 391 - 394
  • [30] A Federated Domain Adaptation Algorithm Based on Knowledge Distillation and Contrastive Learning
    HUANG Fang
    FANG Zhijun
    SHI Zhicai
    ZHUANG Lehui
    LI Xingchen
    HUANG Bo
Wuhan University Journal of Natural Sciences, 2022, 27 (06): 499 - 507