MetaFed: Federated Learning Among Federations With Cyclic Knowledge Distillation for Personalized Healthcare

Cited by: 0
Authors:
Chen, Yiqiang [1 ]
Lu, Wang [1 ]
Qin, Xin [1 ]
Wang, Jindong [2 ]
Xie, Xing [2 ]
Affiliations:
[1] Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
[2] Microsoft Res Asia, Beijing 100080, Peoples R China
Keywords:
Servers; Data models; Training; Adaptation models; Data privacy; Machine learning; Costs; Federated learning (FL); healthcare; knowledge distillation (KD); personalization; transfer learning
DOI: not available
Chinese Library Classification: TP18 [Theory of artificial intelligence]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
Federated learning (FL) has attracted increasing attention as a way to build models without accessing raw user data, especially in healthcare. In real applications, different federations can seldom work together, for reasons such as data heterogeneity and distrust of, or the absence of, a central server. In this article, we propose a novel framework called MetaFed to facilitate trustworthy FL between different federations. MetaFed obtains a personalized model for each federation without a central server via the proposed cyclic knowledge distillation. Specifically, MetaFed treats each federation as a meta distribution and aggregates the knowledge of each federation in a cyclic manner. The training is split into two parts: common knowledge accumulation and personalization. Comprehensive experiments on seven benchmarks demonstrate that MetaFed, without a server, achieves better accuracy than state-of-the-art methods [e.g., a 10%+ accuracy improvement over the baseline on the physical activity monitoring dataset (PAMAP2)] at lower communication cost. More importantly, MetaFed shows remarkable performance in real healthcare-related applications.
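The cyclic scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: a scalar linear model stands in for each federation's network, a proximal pull toward the predecessor's parameters stands in for logit-level knowledge distillation, and the names `local_update` and `metafed_cycle` are hypothetical.

```python
def local_update(weights, data, teacher, lam, lr=0.05, steps=200):
    """One federation's local phase: fit the local data while a proximal
    term pulls the parameters toward the preceding federation's model
    (a stand-in for distilling from that model)."""
    w, b = weights
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in data:
            err = w * x + b - y
            gw += 2.0 * err * x
            gb += 2.0 * err
        # full-batch gradient of MSE plus the distillation-style penalty
        gw = gw / len(data) + lam * (w - teacher[0])
        gb = gb / len(data) + lam * (b - teacher[1])
        w -= lr * gw
        b -= lr * gb
    return (w, b)

def metafed_cycle(federation_data, rounds=5, lam=0.5):
    """Common-knowledge accumulation: federations form a ring and, in each
    round, federation i distils from federation i-1's latest model."""
    models = [(0.0, 0.0) for _ in federation_data]
    for _ in range(rounds):
        for i, data in enumerate(federation_data):
            teacher = models[i - 1]  # ring: federation 0 follows the last
            models[i] = local_update(models[i], data, teacher, lam)
    return models
```

The paper's second stage, personalization, is omitted here; in that stage the distillation weight (`lam` above) would be adapted per federation, e.g. relaxed when the accumulated common knowledge underperforms on a federation's local validation data.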
Pages: 16671-16682 (12 pages)
Related papers (50 records):
  • [1] MetaFed: Federated Learning Among Federations With Cyclic Knowledge Distillation for Personalized Healthcare
    Chen, Yiqiang
    Lu, Wang
    Qin, Xin
    Wang, Jindong
    Xie, Xing
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, : 1 - 12
  • [2] FedDK: Improving Cyclic Knowledge Distillation for Personalized Healthcare Federated Learning
    Xu, Yikai
    Fan, Hongbo
    IEEE ACCESS, 2023, 11 : 72409 - 72417
  • [3] Personalized Decentralized Federated Learning with Knowledge Distillation
    Jeong, Eunjeong
    Kountouris, Marios
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1982 - 1987
  • [4] A Personalized Federated Learning Method Based on Clustering and Knowledge Distillation
    Zhang, Jianfei
    Shi, Yongqiang
    ELECTRONICS, 2024, 13 (05)
  • [5] MiniPFL: Mini federations for hierarchical personalized federated learning
    Fan, Yuwei
    Xi, Wei
    Zhu, Hengyi
    Zhao, Jizhong
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2024, 157 : 41 - 50
  • [6] A Personalized Federated Learning Algorithm Based on Meta-Learning and Knowledge Distillation
    Sun Y.
    Shi Y.
    Wang Z.
    Li M.
    Si P.
    Beijing Youdian Daxue Xuebao/Journal of Beijing University of Posts and Telecommunications, 2023, 46 (01): : 12 - 18
  • [7] A Personalized Federated Learning Method Based on Knowledge Distillation and Differential Privacy
    Jiang, Yingrui
    Zhao, Xuejian
    Li, Hao
    Xue, Yu
    ELECTRONICS, 2024, 13 (17)
  • [8] Personalized Federated Learning Method Based on Collation Game and Knowledge Distillation
    Sun Y.
    Shi Y.
    Li M.
    Yang R.
    Si P.
    Dianzi Yu Xinxi Xuebao/Journal of Electronics and Information Technology, 2023, 45 (10): : 3702 - 3709
  • [9] Personalized Federated Learning with Semisupervised Distillation
    Li, Xianxian
    Gong, Yanxia
    Liang, Yuan
    Wang, Li-e
    SECURITY AND COMMUNICATION NETWORKS, 2021, 2021
  • [10] Personalized and privacy-enhanced federated learning framework via knowledge distillation
    Yu, Fangchao
    Wang, Lina
    Zeng, Bo
    Zhao, Kai
    Yu, Rongwei
    NEUROCOMPUTING, 2024, 575