FedDK: Improving Cyclic Knowledge Distillation for Personalized Healthcare Federated Learning

Cited by: 3
Authors
Xu, Yikai [1 ]
Fan, Hongbo [2 ]
Affiliations
[1] Kunming Univ Sci & Technol, Sch Informat Engn & Automat, Kunming 650500, Peoples R China
[2] Kunming Univ Sci & Technol, Fac Modern Agr Engn, Kunming 650500, Peoples R China
Source
IEEE ACCESS | 2023, Vol. 11
Keywords
Federated learning; knowledge distillation; personalization; transfer learning; healthcare;
DOI
10.1109/ACCESS.2023.3294812
CLC Number
TP [Automation technology, computer technology];
Discipline Code
0812 ;
Abstract
For most healthcare organizations, a significant challenge today is predicting disease from incomplete data, since patient records are typically isolated in institutional silos. Federated learning (FL) addresses the data-silo problem by enabling remote local machines to collaboratively train a globally optimal model without sharing data. In this research, we present FedDK, a serverless framework that obtains a personalized model for each federation by training convolutional neural networks (CNNs) on local federation data through FL. Our approach uses the CNNs to accumulate common knowledge and transfers it via knowledge distillation, which helps prevent the forgetting of common knowledge. Additionally, missing common knowledge is filled in cyclically across federations, culminating in a personalized model for each group. This novel design combines federated, deep, and integrated learning methods to produce more accurate machine-learning models. Our federated model outperforms both locally trained and baseline FL methods by a significant margin.
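The core mechanism described in the abstract, knowledge distillation passed cyclically between federations, can be illustrated with a minimal sketch. This is not the paper's implementation: the temperature value, the ring ordering of clients, and the function names below are assumptions for illustration; in FedDK each federation trains a CNN locally, while here clients are represented only by their output logits.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax (numerically stabilized)."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Hinton-style distillation: KL(teacher || student) on softened
    outputs, scaled by T^2 to keep gradient magnitudes comparable."""
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's soft predictions
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))) * T * T)

def cyclic_distillation_round(logits_per_client):
    """One cyclic pass: client i distills from its predecessor
    (i - 1) mod n, so common knowledge circulates around the ring."""
    n = len(logits_per_client)
    return [
        distillation_loss(logits_per_client[i], logits_per_client[(i - 1) % n])
        for i in range(n)
    ]
```

In a full training loop, each client would add its distillation loss to its local supervised loss, so that personalization (local fit) and common knowledge (teacher agreement) are balanced each round.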
Pages: 72409 - 72417
Page count: 9
Related Papers
50 in total
  • [41] Preservation of the Global Knowledge by Not-True Distillation in Federated Learning
    Lee, Gihun
    Jeong, Minchan
    Shin, Yongjin
    Bae, Sangmin
    Yun, Se-Young
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [42] Resource Allocation for Federated Knowledge Distillation Learning in Internet of Drones
    Yao, Jingjing
    Cal, Semih
    Sun, Xiang
    IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (07): : 8064 - 8074
  • [43] Data-Free Knowledge Distillation for Heterogeneous Federated Learning
    Zhu, Zhuangdi
    Hong, Junyuan
    Zhou, Jiayu
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [44] Knowledge-Aware Parameter Coaching for Personalized Federated Learning
    Zhi, Mingjian
    Bi, Yuanguo
    Xu, Wenchao
    Wang, Haozhao
    Xiang, Tianao
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 15, 2024, : 17069 - 17077
  • [45] Privacy-Preserving Heterogeneous Personalized Federated Learning With Knowledge
    Pan, Yanghe
    Su, Zhou
    Ni, Jianbing
    Wang, Yuntao
    Zhou, Jinhao
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (06): : 5969 - 5982
  • [46] Robust Multi-model Personalized Federated Learning via Model Distillation
    Muhammad, Adil
    Lin, Kai
    Gao, Jian
    Chen, Bincai
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2021, PT III, 2022, 13157 : 432 - 446
  • [47] A federated learning framework based on transfer learning and knowledge distillation for targeted advertising
    Su, Caiyu
    Wei, Jinri
    Lei, Yuan
    Li, Jiahui
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [48] Cross-Training with Prototypical Distillation for improving the generalization of Federated Learning
    Liu, Tianhan
    Qi, Zhuang
    Chen, Zitan
    Meng, Xiangxu
    Meng, Lei
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 648 - 653
  • [49] Heterogeneous Collaborative Learning for Personalized Healthcare Analytics via Messenger Distillation
    Ye, Guanhua
    Chen, Tong
    Li, Yawen
    Cui, Lizhen
    Nguyen, Quoc Viet Hung
    Yin, Hongzhi
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2023, 27 (11) : 5249 - 5259
  • [50] A Network Resource Aware Federated Learning Approach using Knowledge Distillation
    Mishra, Rahul
    Gupta, Hari Prabhat
    Dutta, Tanima
    IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (IEEE INFOCOM WKSHPS 2021), 2021,