DECENTRALIZED FEDERATED LEARNING VIA MUTUAL KNOWLEDGE DISTILLATION

Cited: 2
Authors
Huang, Yue [1 ]
Kong, Lanju [1 ,2 ]
Li, Qingzhong [1 ,2 ]
Zhang, Baochen [1 ]
Affiliations
[1] Shandong Univ, Sch Software, Jinan, Shandong, Peoples R China
[2] Dareway Software Co, Jinan, Peoples R China
Keywords
Federated learning; mutual knowledge distillation; decentralized;
DOI
10.1109/ICME55011.2023.00066
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL), an emerging decentralized machine learning paradigm, supports collaborative modeling without compromising data privacy. In practical applications, the heterogeneity of FL participants poses a significant challenge. First, clients sometimes need to design custom models for different scenarios and tasks. Second, client drift leads to slow convergence of the global model. Recently, knowledge distillation has been adopted to address this problem by using knowledge from heterogeneous clients to improve the model's performance. However, this approach requires the construction of a proxy dataset, and FL is usually performed with the assistance of a central server, which can easily lead to trust issues and communication bottlenecks. To address these issues, this paper proposes a knowledge distillation-based FL scheme called FedDCM. Specifically, each participant maintains two models, a private model and a public model. The two models distill knowledge from each other (mutual distillation), so there is no need to build proxy datasets to train teacher models. The approach allows for model heterogeneity, and each participant can have a private model of any architecture. The direct and efficient exchange of information between participants through the public model is more conducive to improving the participants' private models than a centralized server. Experimental results demonstrate the effectiveness of FedDCM, which offers better performance than state-of-the-art methods.
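To make the mutual-distillation idea in the abstract concrete, the following is a minimal sketch of one local training step in which a private model and a public model teach each other, assuming PyTorch; the function name, temperature, and loss weighting `alpha` are illustrative assumptions, not the authors' exact FedDCM configuration.

```python
# Minimal sketch of one local step of mutual knowledge distillation between a
# client's private model and its public model (hypothetical hyperparameters).
import torch
import torch.nn.functional as F

def mutual_distillation_step(private_model, public_model, x, y,
                             opt_private, opt_public,
                             temperature=2.0, alpha=0.5):
    """Each model learns from the labels and from the other's softened outputs."""
    logits_private = private_model(x)
    logits_public = public_model(x)

    # Softened target distributions; detached so each model treats the other's
    # output as a fixed teacher signal during this step.
    soft_private = F.softmax(logits_private.detach() / temperature, dim=1)
    soft_public = F.softmax(logits_public.detach() / temperature, dim=1)

    # Private model: supervised loss + KL divergence toward the public model.
    loss_private = (1 - alpha) * F.cross_entropy(logits_private, y) + \
        alpha * (temperature ** 2) * F.kl_div(
            F.log_softmax(logits_private / temperature, dim=1),
            soft_public, reduction="batchmean")

    # Public model: supervised loss + KL divergence toward the private model.
    loss_public = (1 - alpha) * F.cross_entropy(logits_public, y) + \
        alpha * (temperature ** 2) * F.kl_div(
            F.log_softmax(logits_public / temperature, dim=1),
            soft_private, reduction="batchmean")

    opt_private.zero_grad(); loss_private.backward(); opt_private.step()
    opt_public.zero_grad(); loss_public.backward(); opt_public.step()
    return loss_private.item(), loss_public.item()
```

Because only the public model would be exchanged between participants, each client's private model can use an arbitrary architecture, which is the model-heterogeneity property the abstract describes.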
Pages: 342-347
Page count: 6