DECENTRALIZED FEDERATED LEARNING VIA MUTUAL KNOWLEDGE DISTILLATION

Cited by: 2
Authors
Huang, Yue [1]
Kong, Lanju [1,2]
Li, Qingzhong [1,2]
Zhang, Baochen [1]
Affiliations
[1] Shandong Univ, Sch Software, Jinan, Shandong, Peoples R China
[2] Dareway Software Co, Jinan, Peoples R China
Keywords
Federated learning; mutual knowledge distillation; decentralized
DOI
10.1109/ICME55011.2023.00066
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL), an emerging decentralized machine learning paradigm, supports collaborative modeling without compromising data privacy. In practical applications, the heterogeneity of FL participants poses a significant challenge. First, clients sometimes need to design custom models for different scenarios and tasks. Second, client drift leads to slow convergence of the global model. Recently, knowledge distillation has been used to address these problems by transferring knowledge from heterogeneous clients to improve model performance. However, this approach requires the construction of a proxy dataset. Moreover, FL is usually performed with the assistance of a central server, which can easily lead to trust issues and communication bottlenecks. To address these issues, this paper proposes a knowledge distillation-based FL scheme called FedDCM. Specifically, each participant maintains two models, a private model and a public model. The two models distill knowledge from each other, so there is no need to build a proxy dataset to train a teacher model. The approach allows model heterogeneity: each participant can use a private model of any architecture. Exchanging information directly and efficiently between participants through the public model improves the participants' private models more effectively than relying on a centralized server. Experimental results demonstrate the effectiveness of FedDCM, which outperforms state-of-the-art methods.
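The following is a minimal sketch, not the authors' implementation, of the local mutual-distillation step described in the abstract: each participant holds a private model (any architecture) and a public model, and the two learn from each other's softened predictions on local data, so no proxy dataset or central teacher is required. It assumes standard PyTorch; the function and parameter names (mutual_distillation_step, T, alpha) are illustrative.

    import torch
    import torch.nn.functional as F

    def mutual_distillation_step(private_model, public_model, x, y,
                                 opt_private, opt_public, T=2.0, alpha=0.5):
        """One local step: each model fits the local labels and mimics the
        other model's temperature-softened predictions (bidirectional KL)."""
        logits_priv = private_model(x)
        logits_pub = public_model(x)

        # Supervised losses on the participant's own labels.
        ce_priv = F.cross_entropy(logits_priv, y)
        ce_pub = F.cross_entropy(logits_pub, y)

        # Distillation losses: each model matches the other's softened output
        # (detached, so each backward pass only updates one model).
        kl_priv = F.kl_div(F.log_softmax(logits_priv / T, dim=1),
                           F.softmax(logits_pub.detach() / T, dim=1),
                           reduction="batchmean") * (T * T)
        kl_pub = F.kl_div(F.log_softmax(logits_pub / T, dim=1),
                          F.softmax(logits_priv.detach() / T, dim=1),
                          reduction="batchmean") * (T * T)

        loss_priv = (1 - alpha) * ce_priv + alpha * kl_priv
        loss_pub = (1 - alpha) * ce_pub + alpha * kl_pub

        opt_private.zero_grad()
        loss_priv.backward()
        opt_private.step()

        opt_public.zero_grad()
        loss_pub.backward()
        opt_public.step()
        return loss_priv.item(), loss_pub.item()

Under this reading, only the public models would be exchanged between participants, which is what removes the need for a central aggregation server.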
Pages: 342-347
Page count: 6