Personalized Decentralized Federated Learning with Knowledge Distillation

Cited by: 2
Authors
Jeong, Eunjeong [1 ]
Kountouris, Marios [1 ]
Affiliations
[1] EURECOM, Commun Syst Dept, F-06410 Sophia Antipolis, France
Keywords
decentralized federated learning; personalization; knowledge distillation
DOI
10.1109/ICC45041.2023.10279714
Chinese Library Classification (CLC)
TN [Electronic technology, communication technology]
Discipline code
0809
Abstract
Personalization in federated learning (FL) acts as a coordinator for clients whose data or behavior varies widely. The convergence of these clients' models depends on how closely each user collaborates with peers who have similar patterns or preferences. However, quantifying this similarity is generally difficult in a decentralized network, where each user has only limited knowledge of other users' models. To cope with this issue, we propose a personalized, fully decentralized FL algorithm that leverages knowledge distillation to let each device discern the statistical distance between local models. Each client device can enhance its performance without sharing local data, by estimating the similarity of two models from the intermediate outputs they produce when fed local samples, as in knowledge distillation. Our empirical studies demonstrate that the proposed algorithm improves clients' test accuracy in fewer iterations under highly non-independent and identically distributed (non-i.i.d.) data distributions and benefits agents with small datasets, all without the need for a central server.
Pages: 1982-1987
Page count: 6
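
The abstract sketches a mechanism in which each client scores its neighbors by comparing the outputs that its own model and a neighbor's model produce on the client's local samples, in the spirit of knowledge distillation, and then favors statistically close peers during aggregation. The following PyTorch sketch is one way to realize that idea; it is not the authors' code, and the KL divergence over temperature-softened outputs, the exponential divergence-to-weight mapping, and names such as estimate_similarity and self_weight are illustrative assumptions.

import torch
import torch.nn.functional as F

def estimate_similarity(local_model, neighbor_model, samples, temperature=3.0):
    # Feed the same local batch through both models and compare their
    # temperature-softened outputs, as in knowledge distillation; raw
    # data never leaves the client, only model outputs are compared.
    local_model.eval()
    neighbor_model.eval()
    with torch.no_grad():
        log_p = F.log_softmax(local_model(samples) / temperature, dim=1)
        q = F.softmax(neighbor_model(samples) / temperature, dim=1)
        # F.kl_div(input=log-probs, target=probs) computes KL(target || input).
        divergence = F.kl_div(log_p, q, reduction="batchmean")
    # Map the statistical distance to a similarity weight in (0, 1]:
    # a small divergence (similar models) yields a weight near 1.
    return torch.exp(-divergence).item()

def personalized_aggregate(local_model, neighbor_models, samples, self_weight=1.0):
    # Mix neighbor parameters into the local model in proportion to their
    # estimated similarity; self_weight for the local model is an assumption.
    weights = [estimate_similarity(local_model, m, samples) for m in neighbor_models]
    total = self_weight + sum(weights)
    neighbor_params = [list(m.parameters()) for m in neighbor_models]
    with torch.no_grad():
        for i, p in enumerate(local_model.parameters()):
            mixed = p * (self_weight / total)
            for w, params in zip(weights, neighbor_params):
                mixed = mixed + (w / total) * params[i]
            p.copy_(mixed)

In a fully decentralized round, each client would exchange model parameters (or just output logits) with its one-hop neighbors, run this weighting step on a batch of its own samples, and resume local training, so collaboration naturally concentrates on statistically similar peers without any raw data exchange or central server.
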
Related papers
50 records in total
  • [21] Personalized Federated Learning on long-tailed data via knowledge distillation and generated features
    Lv, Fengling
    Qian, Pinxin
    Lu, Yang
    Wang, Hanzi
    PATTERN RECOGNITION LETTERS, 2024, 186: 178-183
  • [22] Personalized federated learning via decoupling self-knowledge distillation and global adaptive aggregation
    Tang, Zhiwei
    Xu, Shuwei
    Jin, Haozhe
    Liu, Shichong
    Zhai, Rui
    Lu, Ke
    MULTIMEDIA SYSTEMS, 2025, 31 (02)
  • [23] PFL-DKD: Modeling decoupled knowledge fusion with distillation for improving personalized federated learning
    Ge, Huanhuan
    Pokhrel, Shiva Raj
    Liu, Zhenyu
    Wang, Jinlong
    Li, Gang
    COMPUTER NETWORKS, 2024, 254
  • [24] When Federated Learning Meets Knowledge Distillation
    Pang, Xiaoyi
    Hu, Jiahui
    Sun, Peng
    Ren, Ju
    Wang, Zhibo
    IEEE WIRELESS COMMUNICATIONS, 2024, 31 (05): 208-214
  • [25] Knowledge Distillation in Federated Learning: A Practical Guide
    Mora, Alessio
    Tenison, Irene
    Bellavista, Paolo
    Rish, Irina
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024: 8188-8196
  • [26] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020: 163-167
  • [27] Parameterized Knowledge Transfer for Personalized Federated Learning
    Zhang, Jie
    Guo, Song
    Ma, Xiaosong
    Wang, Haozhao
    Xu, Wenchao
    Wu, Feijie
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [28] Personalized Edge Intelligence via Federated Self-Knowledge Distillation
    Jin, Hai
    Bai, Dongshan
    Yao, Dezhong
    Dai, Yutong
    Gu, Lin
    Yu, Chen
    Sun, Lichao
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2023, 34 (02): 567-580
  • [29] Decentralized Federated Learning via Mutual Knowledge Transfer
    Li, Chengxi
    Li, Gang
    Varshney, Pramod K.
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (02): 1136-1147
  • [30] Resource-Aware Knowledge Distillation for Federated Learning
    Chen, Zheyi
    Tian, Pu
    Liao, Weixian
    Chen, Xuhui
    Xu, Guobin
    Yu, Wei
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2023, 11 (03): 706-719