FedDK: Improving Cyclic Knowledge Distillation for Personalized Healthcare Federated Learning

Cited by: 3
Authors
Xu, Yikai [1 ]
Fan, Hongbo [2 ]
Affiliations
[1] Kunming Univ Sci & Technol, Sch Informat Engn & Automat, Kunming 650500, Peoples R China
[2] Kunming Univ Sci & Technol, Sch Fac Modern Agr Engn, Kunming 650500, Peoples R China
Source
IEEE ACCESS | 2023, Vol. 11
Keywords
Federated learning; knowledge distillation; personalization; transfer learning; healthcare;
DOI
10.1109/ACCESS.2023.3294812
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Discipline Code
0812;
Abstract
For most healthcare organizations, a major challenge today is predicting diseases from incomplete data, which often remains isolated in silos. Federated learning (FL) addresses the data-silo problem by enabling remote local machines to collaboratively train a globally optimal model without sharing data. In this work, we present FedDK, a serverless framework that obtains a personalized model for each federation by training convolutional neural networks (CNNs) on local data via FL. In our approach, CNNs accumulate common knowledge, which is transferred through knowledge distillation to prevent it from being forgotten. The missing common knowledge is then filled in cyclically across federations, culminating in a personalized model for each group. This design combines federated, deep, and integrated learning methods to produce more accurate machine-learning models. Our federated model outperforms both locally trained models and baseline FL methods by significant margins.
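The cyclic knowledge transfer described in the abstract rests on a standard distillation loss: each federation's model (student) matches the temperature-softened outputs of its predecessor's model (teacher) as it passes around the ring. The abstract does not give the exact loss, so the sketch below is a minimal illustration using the classic Hinton-style KL-divergence formulation; the function names and temperature value are assumptions, not the paper's notation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T produces softer targets.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened outputs,
    scaled by T^2 as in the classic distillation formulation."""
    p = softmax(teacher_logits, T)  # soft targets from the previous federation's model
    q = softmax(student_logits, T)  # current federation's predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# Cyclic transfer sketch: federation k distills from federation k-1's model,
# so common knowledge circulates around the ring of federations.
teacher_logits = [2.0, 0.5, 0.1]
student_logits = [1.8, 0.7, 0.2]
loss = distillation_loss(student_logits, teacher_logits)
```

In a serverless ring topology like the one FedDK describes, this loss would be added to each federation's local training objective so that personalization does not erase the shared knowledge received from the previous peer.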
Pages: 72409-72417 (9 pages)
Related Papers
50 records in total
  • [31] FedX: Unsupervised Federated Learning with Cross Knowledge Distillation
    Han, Sungwon
    Park, Sungwon
    Wu, Fangzhao
    Kim, Sundong
    Wu, Chuhan
    Xie, Xing
    Cha, Meeyoung
    COMPUTER VISION - ECCV 2022, PT XXX, 2022, 13690 : 691 - 707
  • [32] pFedKT: Personalized federated learning with dual knowledge transfer
    Yi, Liping
    Shi, Xiaorong
    Wang, Nan
    Wang, Gang
    Liu, Xiaoguang
    Shi, Zhuan
    Yu, Han
    KNOWLEDGE-BASED SYSTEMS, 2024, 292
  • [33] Rethinking Personalized Federated Learning from Knowledge Perspective
    Yao, Dezhong
    Zhu, Ziquan
    Liu, Tongtong
    Xu, Zhiqiang
    Jin, Hai
    53RD INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING, ICPP 2024, 2024, : 991 - 1000
  • [34] DKD-pFed: A novel framework for personalized federated learning via decoupling knowledge distillation and feature decorrelation
    Su, Liwei
    Wang, Donghao
    Zhu, Jinghua
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 259
  • [35] SCAN: A HealthCare Personalized ChatBot with Federated Learning Based GPT
    Puppala, Sai
    Hossain, Ismail
    Alam, Md Jahangir
    Talukder, Sajedul
    2024 IEEE 48TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE, COMPSAC 2024, 2024, : 1945 - 1951
  • [36] Efficient Federated Learning for AIoT Applications Using Knowledge Distillation
    Liu, Tian
    Xia, Jun
    Ling, Zhiwei
    Fu, Xin
    Yu, Shui
    Chen, Mingsong
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (08) : 7229 - 7243
  • [37] A Decentralized Federated Learning Based on Node Selection and Knowledge Distillation
    Zhou, Zhongchang
    Sun, Fenggang
    Chen, Xiangyu
    Zhang, Dongxu
    Han, Tianzhen
    Lan, Peng
    MATHEMATICS, 2023, 11 (14)
  • [38] Fedadkd: heterogeneous federated learning via adaptive knowledge distillation
    Song, Yalin
    Liu, Hang
    Zhao, Shuai
    Jin, Haozhe
    Yu, Junyang
    Liu, Yanhong
    Zhai, Rui
    Wang, Longge
    PATTERN ANALYSIS AND APPLICATIONS, 2024, 27 (04)
  • [39] Mitigation of Membership Inference Attack by Knowledge Distillation on Federated Learning
    Ueda, Rei
    Nakai, Tsunato
    Yoshida, Kota
    Fujino, Takeshi
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2025, E108A (03) : 267 - 279
  • [40] Communication-efficient federated learning via knowledge distillation
    Wu, Chuhan
    Wu, Fangzhao
    Lyu, Lingjuan
    Huang, Yongfeng
    Xie, Xing
    NATURE COMMUNICATIONS, 2022, 13 (01)