FedDK: Improving Cyclic Knowledge Distillation for Personalized Healthcare Federated Learning

Cited by: 3
Authors
Xu, Yikai [1 ]
Fan, Hongbo [2 ]
Affiliations
[1] Kunming Univ Sci & Technol, Sch Informat Engn & Automat, Kunming 650500, Peoples R China
[2] Kunming Univ Sci & Technol, Fac Modern Agr Engn, Kunming 650500, Peoples R China
Source
IEEE ACCESS | 2023, Vol. 11
Keywords
Federated learning; knowledge distillation; personalization; transfer learning; healthcare;
DOI
10.1109/ACCESS.2023.3294812
CLC Classification
TP [Automation Technology; Computer Technology];
Discipline Code
0812;
Abstract
For most healthcare organizations, a significant challenge today is predicting disease from incomplete data, a limitation that often leaves institutions working in isolation. Federated learning (FL) addresses the problem of data silos by enabling remote local machines to collaboratively train a globally optimal model without sharing data. In this work, we present FedDK, a serverless framework that trains a personalized model for each federation from its local data, using convolutional neural networks (CNNs) trained via FL. The CNNs accumulate common knowledge, which is transferred between federations by knowledge distillation to prevent it from being forgotten; missing common knowledge is then filled in cyclically across the federations, culminating in a personalized model for each group. This design combines federated, deep, and integrated learning methods to produce more accurate machine-learning models. Our federated models significantly outperform both locally trained models and baseline FL methods.
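To illustrate the cyclic distillation idea sketched in the abstract, the following is a minimal, hypothetical PyTorch sketch: each federation's model distills knowledge from the previous federation in a ring, using only its own local data, so common knowledge circulates without any raw data leaving a federation. The function names (distill_step, cyclic_distillation), the temperature T, the mixing weight alpha, and the ring schedule are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def distill_step(student, teacher, x, y, optimizer, T=2.0, alpha=0.5):
    # One distillation step: the student fits its local labels while
    # matching the teacher's softened predictions (the "common knowledge").
    teacher.eval()
    with torch.no_grad():
        t_logits = teacher(x)
    s_logits = student(x)
    hard = F.cross_entropy(s_logits, y)                  # local supervision
    soft = F.kl_div(                                     # knowledge transfer
        F.log_softmax(s_logits / T, dim=1),
        F.softmax(t_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    loss = alpha * soft + (1 - alpha) * hard
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def cyclic_distillation(models, loaders, rounds=3, lr=1e-3):
    # Pass knowledge around the ring of federations: federation i distills
    # from federation i-1 on its own data only, so each personalized model
    # gradually absorbs what the others have learned, without data sharing.
    n = len(models)
    opts = [torch.optim.Adam(m.parameters(), lr=lr) for m in models]
    for _ in range(rounds):
        for i in range(n):
            teacher, student = models[(i - 1) % n], models[i]
            for x, y in loaders[i]:          # only federation i's local data
                distill_step(student, teacher, x, y, opts[i])
    return models
```

In this sketch, after enough rounds every model has indirectly seen the soft predictions of all the others, which is one plausible reading of "filling in missing common knowledge cyclically"; the actual FedDK schedule may differ.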
Pages: 72409 - 72417 (9 pages)
Related Papers (50 total)
  • [1] MetaFed: Federated Learning Among Federations With Cyclic Knowledge Distillation for Personalized Healthcare
    Chen, Yiqiang
    Lu, Wang
    Qin, Xin
    Wang, Jindong
    Xie, Xing
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (11) : 16671 - 16682
  • [2] Personalized Decentralized Federated Learning with Knowledge Distillation
    Jeong, Eunjeong
    Kountouris, Marios
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1982 - 1987
  • [3] PFL-DKD: Modeling decoupled knowledge fusion with distillation for improving personalized federated learning
    Ge, Huanhuan
    Pokhrel, Shiva Raj
    Liu, Zhenyu
    Wang, Jinlong
    Li, Gang
    COMPUTER NETWORKS, 2024, 254
  • [4] A Personalized Federated Learning Method Based on Clustering and Knowledge Distillation
    Zhang, Jianfei
    Shi, Yongqiang
    ELECTRONICS, 2024, 13 (05)
  • [5] A Personalized Federated Learning Algorithm Based on Meta-Learning and Knowledge Distillation
    Sun Y.
    Shi Y.
    Wang Z.
    Li M.
    Si P.
    Beijing Youdian Daxue Xuebao/Journal of Beijing University of Posts and Telecommunications, 2023, 46 (01): : 12 - 18
  • [6] A Personalized Federated Learning Method Based on Knowledge Distillation and Differential Privacy
    Jiang, Yingrui
    Zhao, Xuejian
    Li, Hao
    Xue, Yu
    ELECTRONICS, 2024, 13 (17)
  • [7] Personalized Federated Learning Method Based on Collation Game and Knowledge Distillation
    Sun Y.
    Shi Y.
    Li M.
    Yang R.
    Si P.
    Dianzi Yu Xinxi Xuebao/Journal of Electronics and Information Technology, 2023, 45 (10): : 3702 - 3709
  • [8] Personalized Federated Learning with Semisupervised Distillation
    Li, Xianxian
    Gong, Yanxia
    Liang, Yuan
    Wang, Li-e
    SECURITY AND COMMUNICATION NETWORKS, 2021, 2021
  • [9] Personalized and privacy-enhanced federated learning framework via knowledge distillation
    Yu, Fangchao
    Wang, Lina
    Zeng, Bo
    Zhao, Kai
    Yu, Rongwei
    NEUROCOMPUTING, 2024, 575