DECENTRALIZED FEDERATED LEARNING VIA MUTUAL KNOWLEDGE DISTILLATION

Cited by: 2
|
Authors
Huang, Yue [1 ]
Kong, Lanju [1 ,2 ]
Li, Qingzhong [1 ,2 ]
Zhang, Baochen [1 ]
Affiliations
[1] Shandong Univ, Sch Software, Jinan, Shandong, Peoples R China
[2] Dareway Software Co, Jinan, Peoples R China
Keywords
Federated learning; mutual knowledge distillation; decentralized;
DOI
10.1109/ICME55011.2023.00066
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning (FL), an emerging decentralized machine learning paradigm, supports collaborative modeling without compromising data privacy. In practical applications, the heterogeneity of FL participants poses a significant challenge. First, clients sometimes need to design custom models for various scenarios and tasks. Second, client drift leads to slow convergence of the global model. Recently, knowledge distillation has emerged to address this problem by using knowledge from heterogeneous clients to improve the model's performance. However, this approach requires the construction of a proxy dataset. Moreover, FL is usually performed with the assistance of a central server, which can easily lead to trust issues and communication bottlenecks. To address these issues, this paper proposes a knowledge distillation-based FL scheme called FedDCM. Specifically, each participant maintains two models: a private model and a public model. The two models distill knowledge from each other, so there is no need to build proxy datasets to train teacher models. The approach allows for model heterogeneity, and each participant can have a private model of any architecture. The direct and efficient exchange of information between participants through the public model is more conducive to improving the participants' private models than a centralized server. Experimental results demonstrate the effectiveness of FedDCM, which offers better performance than the most advanced methods.
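The mutual distillation described in the abstract — each participant's private and public models teaching each other, so no proxy dataset is needed to train a teacher — can be sketched with a standard two-way loss: each model combines a hard-label cross-entropy term with a KL term matching the other model's softened predictions. This is an illustrative sketch under those assumptions; the function names, temperature `T`, and weight `alpha` are not the paper's notation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    # KL(p || q), averaged over the batch.
    return float(np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)))

def mutual_distillation_losses(logits_private, logits_public, labels, T=2.0, alpha=0.5):
    """Per-participant losses for one mutual-distillation step between
    the private and public models (illustrative; not FedDCM's exact loss)."""
    n = len(labels)
    # Softened predictions used as distillation targets.
    p_priv = softmax(logits_private, T)
    p_pub = softmax(logits_public, T)
    # Hard-label cross-entropy for each model.
    ce_priv = float(-np.mean(np.log(softmax(logits_private)[np.arange(n), labels] + 1e-12)))
    ce_pub = float(-np.mean(np.log(softmax(logits_public)[np.arange(n), labels] + 1e-12)))
    # Each model also matches the other's softened predictions.
    loss_priv = alpha * ce_priv + (1 - alpha) * kl(p_pub, p_priv)
    loss_pub = alpha * ce_pub + (1 - alpha) * kl(p_priv, p_pub)
    return loss_priv, loss_pub
```

Under this scheme only the public model's parameters would be exchanged between participants, while the private model (of any architecture) stays local — which is how the abstract avoids both a central server and a proxy dataset.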
Pages: 342-347
Page count: 6
Related Papers
50 records in total
  • [21] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020, : 163 - 167
  • [22] FMDL: Federated Mutual Distillation Learning for Defending Backdoor Attacks
    Sun, Hanqi
    Zhu, Wanquan
    Sun, Ziyu
    Cao, Mingsheng
    Liu, Wenbin
    ELECTRONICS, 2023, 12 (23)
  • [23] Heterogeneous Federated Learning via Generative Model-Aided Knowledge Distillation in the Edge
    Sun, Chuanneng
    Jiang, Tingcong
    Pompili, Dario
    IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (05): 5589 - 5599
  • [24] A Semi-Supervised Federated Learning Scheme via Knowledge Distillation for Intrusion Detection
    Zhao, Ruijie
    Yang, Linbo
    Wang, Yijun
    Xue, Zhi
    Gui, Guan
    Ohtsuki, Tomoaki
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 2688 - 2693
  • [25] Resource-Aware Knowledge Distillation for Federated Learning
    Chen, Zheyi
    Tian, Pu
    Liao, Weixian
    Chen, Xuhui
    Xu, Guobin
    Yu, Wei
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2023, 11 (03) : 706 - 719
  • [26] Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation
    Mohammed, Malik Naik
    Zhang, Xinyue
    Valero, Maria
    Xie, Ying
    2023 IEEE/ACM CONFERENCE ON CONNECTED HEALTH: APPLICATIONS, SYSTEMS AND ENGINEERING TECHNOLOGIES, CHASE, 2023, : 207 - 208
  • [27] FedX: Unsupervised Federated Learning with Cross Knowledge Distillation
    Han, Sungwon
    Park, Sungwon
    Wu, Fangzhao
    Kim, Sundong
    Wu, Chuhan
    Xie, Xing
    Cha, Meeyoung
    COMPUTER VISION - ECCV 2022, PT XXX, 2022, 13690 : 691 - 707
  • [28] Mutual Knowledge-Distillation-Based Federated Learning for Short-Term Forecasting in Electric IoT Systems
    Tong, Cheng
    Zhang, Linghua
    Ding, Yin
    Yue, Dong
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (19): 31190 - 31205
  • [29] Decentralized Federated Graph Learning via Surrogate Model
    Zhang, Bolin
    Gu, Ruichun
    Liu, Haiying
    CMC-COMPUTERS MATERIALS & CONTINUA, 2025, 82 (02): 2521 - 2535
  • [30] FedCD: Personalized Federated Learning via Collaborative Distillation
    Ahmad, Sabtain
    Aral, Atakan
    2022 IEEE/ACM 15TH INTERNATIONAL CONFERENCE ON UTILITY AND CLOUD COMPUTING, UCC, 2022, : 189 - 194