Personalized Decentralized Federated Learning with Knowledge Distillation

Cited by: 2
Authors
Jeong, Eunjeong [1 ]
Kountouris, Marios [1 ]
Affiliations
[1] EURECOM, Commun Syst Dept, F-06410 Sophia Antipolis, France
Keywords
decentralized federated learning; personalization; knowledge distillation;
DOI
10.1109/ICC45041.2023.10279714
Chinese Library Classification (CLC)
TN [Electronic technology; communication technology];
Subject classification code
0809;
Abstract
Personalization in federated learning (FL) functions as a coordinator for clients with high variance in data or behavior. Ensuring the convergence of these clients' models relies on how closely users collaborate with those who have similar patterns or preferences. However, quantifying similarity is generally challenging when users in a decentralized network are given only limited knowledge about other users' models. To cope with this issue, we propose a personalized and fully decentralized FL algorithm that leverages knowledge distillation to enable each device to discern statistical distances between local models. Each client device can enhance its performance without sharing local data by estimating the similarity between the intermediate outputs that two models produce on local samples, as in knowledge distillation. Our empirical studies demonstrate that the proposed algorithm improves clients' test accuracy in fewer iterations under highly non-independent and identically distributed (non-i.i.d.) data distributions and benefits agents with small datasets, even without a central server.
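The abstract's central mechanism, judging how close two clients' models are by comparing their softened outputs on one client's local samples, can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration rather than the paper's implementation: the function name model_similarity, the temperature value, and the use of KL divergence between temperature-scaled softmax outputs are hypothetical choices made here in the spirit of knowledge distillation.

```python
import torch
import torch.nn.functional as F

def model_similarity(model_a, model_b, local_loader, temperature=3.0, device="cpu"):
    """Estimate a statistical distance between two clients' models by comparing
    their softened predictions on one client's local samples (distillation-style
    soft targets). A lower average KL divergence is read as higher similarity.
    This is an illustrative proxy, not the paper's exact metric."""
    model_a.eval()
    model_b.eval()
    total_kl, n_batches = 0.0, 0
    with torch.no_grad():
        for x, _ in local_loader:
            x = x.to(device)
            # Temperature-scaled output distributions, as in knowledge distillation.
            log_p_a = F.log_softmax(model_a(x) / temperature, dim=1)
            p_b = F.softmax(model_b(x) / temperature, dim=1)
            # KL(p_b || p_a), averaged over the batch.
            total_kl += F.kl_div(log_p_a, p_b, reduction="batchmean").item()
            n_batches += 1
    return total_kl / max(n_batches, 1)
```

Under this sketch, a client could weight a neighbor's model inversely to this distance when mixing received models, so that neighbors with similar data patterns contribute more to its personalized model; the actual weighting rule used by the paper is not reproduced here.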
Pages: 1982-1987
Page count: 6
Related papers
50 items in total
  • [31] Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation. Mohammed, Malik Naik; Zhang, Xinyue; Valero, Maria; Xie, Ying. 2023 IEEE/ACM Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE), 2023: 207-208.
  • [32] Federated Split Learning via Mutual Knowledge Distillation. Luo, Linjun; Zhang, Xinglin. IEEE Transactions on Network Science and Engineering, 2024, 11(03): 2729-2741.
  • [33] FedX: Unsupervised Federated Learning with Cross Knowledge Distillation. Han, Sungwon; Park, Sungwon; Wu, Fangzhao; Kim, Sundong; Wu, Chuhan; Xie, Xing; Cha, Meeyoung. Computer Vision - ECCV 2022, Pt XXX, 2022, 13690: 691-707.
  • [34] pFedKT: Personalized federated learning with dual knowledge transfer. Yi, Liping; Shi, Xiaorong; Wang, Nan; Wang, Gang; Liu, Xiaoguang; Shi, Zhuan; Yu, Han. Knowledge-Based Systems, 2024, 292.
  • [35] Rethinking Personalized Federated Learning from Knowledge Perspective. Yao, Dezhong; Zhu, Ziquan; Liu, Tongtong; Xu, Zhiqiang; Jin, Hai. 53rd International Conference on Parallel Processing (ICPP 2024), 2024: 991-1000.
  • [36] Like Attracts Like: Personalized Federated Learning in Decentralized Edge Computing. Ma, Zhenguo; Xu, Yang; Xu, Hongli; Liu, Jianchun; Xue, Yinxing. IEEE Transactions on Mobile Computing, 2024, 23(02): 1080-1096.
  • [37] DKD-pFed: A novel framework for personalized federated learning via decoupling knowledge distillation and feature decorrelation. Su, Liwei; Wang, Donghao; Zhu, Jinghua. Expert Systems with Applications, 2025, 259.
  • [38] Decentralized Two-Stage Federated Learning with Knowledge Transfer. Jin, Tong; Chen, Siguang. ICC 2023 - IEEE International Conference on Communications, 2023: 3181-3186.
  • [39] Efficient Federated Learning for AIoT Applications Using Knowledge Distillation. Liu, Tian; Xia, Jun; Ling, Zhiwei; Fu, Xin; Yu, Shui; Chen, Mingsong. IEEE Internet of Things Journal, 2023, 10(08): 7229-7243.
  • [40] Fedadkd: Heterogeneous federated learning via adaptive knowledge distillation. Song, Yalin; Liu, Hang; Zhao, Shuai; Jin, Haozhe; Yu, Junyang; Liu, Yanhong; Zhai, Rui; Wang, Longge. Pattern Analysis and Applications, 2024, 27(04).