FedDK: Improving Cyclic Knowledge Distillation for Personalized Healthcare Federated Learning

Cited by: 3
Authors
Xu, Yikai [1 ]
Fan, Hongbo [2 ]
Affiliations
[1] Kunming Univ Sci & Technol, Sch Informat Engn & Automat, Kunming 650500, Peoples R China
[2] Kunming Univ Sci & Technol, Fac Modern Agr Engn, Kunming 650500, Peoples R China
Source
IEEE ACCESS, 2023, Vol. 11
Keywords
Federated learning; knowledge distillation; personalization; transfer learning; healthcare;
DOI
10.1109/ACCESS.2023.3294812
Chinese Library Classification
TP [automation technology, computer technology];
Subject Classification Code
0812
Abstract
For most healthcare organizations, a significant challenge today is predicting disease from incomplete data, because patient records are typically isolated in institutional silos. Federated learning (FL) resolves the data-silo problem by enabling remote local machines to collaboratively train a globally optimal model without sharing data. In this research, we present FedDK, a serverless framework that trains convolutional neural networks (CNNs) via FL to obtain a personalized model for each federation from its local data. FedDK uses the CNNs to accumulate common knowledge and transfers it by knowledge distillation, which helps prevent the common knowledge from being forgotten. In addition, missing common knowledge is filled in cyclically across the federations, culminating in a personalized model for each group. This design combines federated, deep, and ensemble learning methods to produce more accurate machine-learning models. Our federated model outperforms both locally trained models and baseline FL methods by a significant margin.
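
The abstract describes the cyclic mechanism only at a high level. As a minimal sketch of the idea (not the authors' released code), one serverless round could arrange the federations in a ring and have each one distill from its predecessor's model while training on its own data; the ring ordering, the loss weighting alpha, the temperature T, and the function names below are illustrative assumptions, not the paper's published algorithm.

    import copy
    import torch
    import torch.nn.functional as F

    def distill_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Standard knowledge-distillation objective: hard-label cross-entropy
        # plus a temperature-softened KL term (T and alpha are assumed values).
        ce = F.cross_entropy(student_logits, labels)
        kd = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        return alpha * ce + (1.0 - alpha) * kd

    def cyclic_round(models, loaders, lr=1e-3, device="cpu"):
        # One serverless round: federation i distills from its ring predecessor,
        # so common knowledge circulates around the ring while each model keeps
        # training on its own local data and therefore stays personalized.
        for i, (model, loader) in enumerate(zip(models, loaders)):
            teacher = copy.deepcopy(models[(i - 1) % len(models)]).to(device)
            teacher.eval()
            model.to(device).train()
            opt = torch.optim.Adam(model.parameters(), lr=lr)
            for x, y in loader:
                x, y = x.to(device), y.to(device)
                with torch.no_grad():
                    teacher_logits = teacher(x)
                loss = distill_loss(model(x), teacher_logits, y)
                opt.zero_grad()
                loss.backward()
                opt.step()
        return models

Note that only model outputs (logits) move around the ring in this sketch, so raw patient data never leaves its federation, which is the privacy property the abstract relies on.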
Pages: 72409-72417 (9 pages)
Related Papers
50 records in total
  • [21] FedDKD: Federated learning with decentralized knowledge distillation
    Li, Xinjia
    Chen, Boyu
    Lu, Wenlian
    APPLIED INTELLIGENCE, 2023, 53 (15) : 18547 - 18563
  • [22] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020, : 163 - 167
  • [24] Parameterized Knowledge Transfer for Personalized Federated Learning
    Zhang, Jie
    Guo, Song
    Ma, Xiaosong
    Wang, Haozhao
    Xu, Wenchao
    Wu, Feijie
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [25] Personalized Edge Intelligence via Federated Self-Knowledge Distillation
    Jin, Hai
    Bai, Dongshan
    Yao, Dezhong
    Dai, Yutong
    Gu, Lin
    Yu, Chen
    Sun, Lichao
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2023, 34 (02) : 567 - 580
  • [26] Cyclic Federated Learning Method Based on Distribution Information Sharing and Knowledge Distillation for Medical Data
    Yu, Liang
    Huang, Jianjun
    ELECTRONICS, 2022, 11 (23)
  • [27] Resource-Aware Knowledge Distillation for Federated Learning
    Chen, Zheyi
    Tian, Pu
    Liao, Weixian
    Chen, Xuhui
    Xu, Guobin
    Yu, Wei
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2023, 11 (03) : 706 - 719
  • [28] Decentralized Federated Learning via Mutual Knowledge Distillation
    Huang, Yue
    Kong, Lanju
    Li, Qingzhong
    Zhang, Baochen
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 342 - 347
  • [29] Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation
    Mohammed, Malik Naik
    Zhang, Xinyue
    Valero, Maria
    Xie, Ying
    2023 IEEE/ACM CONFERENCE ON CONNECTED HEALTH: APPLICATIONS, SYSTEMS AND ENGINEERING TECHNOLOGIES, CHASE, 2023, : 207 - 208
  • [30] Federated Split Learning via Mutual Knowledge Distillation
    Luo, Linjun
    Zhang, Xinglin
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (03): : 2729 - 2741