Personalized Decentralized Federated Learning with Knowledge Distillation

Times Cited: 2
Authors
Jeong, Eunjeong [1]
Kountouris, Marios [1]
Affiliations
[1] EURECOM, Commun Syst Dept, F-06410 Sophia Antipolis, France
Keywords
decentralized federated learning; personalization; knowledge distillation
DOI
10.1109/ICC45041.2023.10279714
Chinese Library Classification (CLC) Number
TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0809
Abstract
Personalization in federated learning (FL) acts as a coordinating mechanism for clients whose data or behavior varies widely. The convergence of these clients' models depends on how closely each user collaborates with others that share similar patterns or preferences. However, quantifying this similarity is generally difficult in a decentralized network, where each user has only limited knowledge of other users' models. To address this issue, we propose a personalized, fully decentralized FL algorithm that leverages knowledge distillation to enable each device to estimate statistical distances between local models. Without sharing any local data, each client estimates the similarity between its own model and a received model by comparing the intermediate outputs the two models produce on its local samples, as in knowledge distillation. Our empirical studies show that the proposed algorithm improves clients' test accuracy in fewer iterations under highly non-independent and identically distributed (non-i.i.d.) data and benefits agents with small datasets, all without requiring a central server.
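
As a rough illustration of the idea summarized in the abstract, the PyTorch sketch below shows how a client might estimate a statistical distance between its own model and a received neighbor model from soft outputs on its local samples, and then mix parameters with similarity-dependent weights. The use of KL divergence over temperature-softened soft-max outputs, the soft-max weighting over negative distances, and all function and variable names are illustrative assumptions, not the paper's exact formulation.

# Hedged sketch of similarity-weighted decentralized aggregation via knowledge distillation.
# Assumptions (not taken from the paper): KL divergence between temperature-softened
# soft-max outputs serves as the "statistical distance", and mixing weights come from a
# soft-max over negative distances. Models are assumed to be already on `device`, and the
# local loader is assumed to yield (input, label) batches; labels are not used.

import copy
import torch
import torch.nn.functional as F

def distillation_distance(own_model, neighbor_model, local_loader, temperature=2.0, device="cpu"):
    """Average KL divergence between the two models' softened outputs on local samples."""
    own_model.eval()
    neighbor_model.eval()
    total, n_batches = 0.0, 0
    with torch.no_grad():
        for x, _ in local_loader:
            x = x.to(device)
            log_p_own = F.log_softmax(own_model(x) / temperature, dim=1)
            p_nbr = F.softmax(neighbor_model(x) / temperature, dim=1)
            total += F.kl_div(log_p_own, p_nbr, reduction="batchmean").item()
            n_batches += 1
    return total / max(n_batches, 1)

def similarity_weighted_mixing(own_model, neighbor_models, local_loader, device="cpu"):
    """Mix own and neighbor parameters; weights decay with the distillation distance."""
    candidates = [own_model] + list(neighbor_models)
    dists = torch.tensor(
        [0.0] + [distillation_distance(own_model, m, local_loader, device=device)
                 for m in neighbor_models]
    )
    weights = F.softmax(-dists, dim=0)  # closer (more similar) models get larger weights
    mixed = copy.deepcopy(own_model)
    mixed_state = mixed.state_dict()
    for key in mixed_state:
        blended = sum(w.item() * m.state_dict()[key].to(device).float()
                      for w, m in zip(weights, candidates))
        mixed_state[key] = blended.to(mixed_state[key].dtype)
    mixed.load_state_dict(mixed_state)
    return mixed, weights

In the fully decentralized setting described above, each device would repeat this estimation and mixing step in every communication round with whichever neighbors it can reach, so no central server is involved; the actual distance measure and weighting rule used in the paper may differ from this sketch.
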
Pages: 1982 - 1987
Page count: 6
Related Papers
50 records in total
  • [41] Mitigation of Membership Inference Attack by Knowledge Distillation on Federated Learning
    Ueda, Rei
    Nakai, Tsunato
    Yoshida, Kota
    Fujino, Takeshi
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2025, E108A (03) : 267 - 279
  • [42] Communication-efficient federated learning via knowledge distillation
    Wu, Chuhan
    Wu, Fangzhao
    Lyu, Lingjuan
    Huang, Yongfeng
    Xie, Xing
    NATURE COMMUNICATIONS, 2022, 13 (01)
  • [43] Preservation of the Global Knowledge by Not-True Distillation in Federated Learning
    Lee, Gihun
    Jeong, Minchan
    Shin, Yongjin
    Bae, Sangmin
    Yun, Se-Young
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [44] Resource Allocation for Federated Knowledge Distillation Learning in Internet of Drones
    Yao, Jingjing
    Cal, Semih
    Sun, Xiang
    IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (07): : 8064 - 8074
  • [45] Data-Free Knowledge Distillation for Heterogeneous Federated Learning
    Zhu, Zhuangdi
    Hong, Junyuan
    Zhou, Jiayu
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [46] Communication-Efficient Personalized Federated Edge Learning for Decentralized Sensing in ISAC
    Zhu, Yonghui
    Zhang, Ronghui
    Cui, Yuanhao
    Wu, Sheng
    Jiang, Chunxiao
    Jing, Xiaojun
    2023 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS, ICC WORKSHOPS, 2023, : 207 - 212
  • [47] Knowledge-Aware Parameter Coaching for Personalized Federated Learning
    Zhi, Mingjian
    Bi, Yuanguo
    Xu, Wenchao
    Wang, Haozhao
    Xiang, Tianao
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 15, 2024, : 17069 - 17077
  • [48] Privacy-Preserving Heterogeneous Personalized Federated Learning With Knowledge
    Pan, Yanghe
    Su, Zhou
    Ni, Jianbing
    Wang, Yuntao
    Zhou, Jinhao
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (06): : 5969 - 5982
  • [49] Robust Multi-model Personalized Federated Learning via Model Distillation
    Muhammad, Adil
    Lin, Kai
    Gao, Jian
    Chen, Bincai
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2021, PT III, 2022, 13157 : 432 - 446
  • [50] A federated learning framework based on transfer learning and knowledge distillation for targeted advertising
    Su, Caiyu
    Wei, Jinri
    Lei, Yuan
    Li, Jiahui
    PEERJ COMPUTER SCIENCE, 2023, 9