MetaFed: Federated Learning Among Federations With Cyclic Knowledge Distillation for Personalized Healthcare

Cited by: 0
Authors
Chen, Yiqiang [1 ]
Lu, Wang [1 ]
Qin, Xin [1 ]
Wang, Jindong [2 ]
Xie, Xing [2 ]
Affiliations
[1] Chinese Academy of Sciences, Institute of Computing Technology, Beijing 100190, People's Republic of China
[2] Microsoft Research Asia, Beijing 100080, People's Republic of China
Keywords
Servers; Data models; Training; Adaptation models; Data privacy; Machine learning; Costs; Federated learning (FL); healthcare; knowledge distillation (KD); personalization; transfer learning
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) has attracted increasing attention as a way to build models without accessing raw user data, especially in healthcare. In real applications, however, different federations can seldom work together, owing to factors such as data heterogeneity and distrust of, or the absence of, a central server. In this article, we propose a novel framework called MetaFed to facilitate trustworthy FL between different federations. MetaFed obtains a personalized model for each federation without a central server via the proposed cyclic knowledge distillation. Specifically, MetaFed treats each federation as a meta distribution and aggregates the knowledge of each federation in a cyclic manner. The training is split into two parts: common knowledge accumulation and personalization. Comprehensive experiments on seven benchmarks demonstrate that MetaFed, without a server, achieves better accuracy than state-of-the-art methods (e.g., an accuracy improvement of more than 10% over the baseline on the physical activity monitoring dataset, PAMAP2) at a lower communication cost. More importantly, MetaFed shows remarkable performance in real healthcare-related applications.
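To make the mechanism described in the abstract concrete, the following is a minimal PyTorch sketch of cyclic knowledge distillation among federations arranged in a ring: each federation trains on its private data while distilling from the model handed over by its predecessor, with a larger distillation weight during common knowledge accumulation and a smaller one during personalization. All names (Federation, kd_loss, train_one_round, cyclic_metafed) and the fixed weights lam_acc/lam_per are illustrative assumptions, not the authors' code; the actual MetaFed additionally gates and weights distillation using each federation's validation performance, which this sketch omits.

```python
# Minimal sketch (assumption-labeled): cyclic knowledge distillation among
# federations in a ring, without a central server. Federation, kd_loss,
# train_one_round, and cyclic_metafed are hypothetical names; the real
# MetaFed also gates/weights distillation by validation accuracy.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class Federation:
    """One federation: its own model and its own private data loader."""
    def __init__(self, model, loader):
        self.model = model
        self.loader = loader

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Soft-label distillation: KL divergence between temperature-softened
    # student and teacher outputs.
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

def train_one_round(fed, teacher, lam, epochs=1, lr=1e-2):
    # Local training: cross-entropy on private data plus a KD term that
    # pulls the model toward the one handed over by the previous
    # federation in the ring; lam weighs the KD term.
    opt = torch.optim.SGD(fed.model.parameters(), lr=lr)
    teacher.eval()
    for _ in range(epochs):
        for x, y in fed.loader:
            opt.zero_grad()
            student_out = fed.model(x)
            with torch.no_grad():
                teacher_out = teacher(x)
            loss = F.cross_entropy(student_out, y) + lam * kd_loss(student_out, teacher_out)
            loss.backward()
            opt.step()

def cyclic_metafed(feds, acc_rounds=3, per_rounds=2, lam_acc=1.0, lam_per=0.3):
    # Phase 1 (common knowledge accumulation): knowledge circulates around
    # the ring; each federation distills strongly from its predecessor.
    for _ in range(acc_rounds):
        for i, fed in enumerate(feds):
            teacher = copy.deepcopy(feds[i - 1].model)  # i-1 wraps around: ring topology
            train_one_round(fed, teacher, lam=lam_acc)
    # Phase 2 (personalization): a smaller KD weight preserves the common
    # knowledge while letting each model fit its own local distribution.
    for _ in range(per_rounds):
        for i, fed in enumerate(feds):
            teacher = copy.deepcopy(feds[i - 1].model)
            train_one_round(fed, teacher, lam=lam_per)
    return [fed.model for fed in feds]  # one personalized model per federation

if __name__ == "__main__":
    # Toy setup: three federations with random 10-d, 2-class data.
    torch.manual_seed(0)
    feds = []
    for _ in range(3):
        x, y = torch.randn(64, 10), torch.randint(0, 2, (64,))
        ds = torch.utils.data.TensorDataset(x, y)
        feds.append(Federation(nn.Linear(10, 2),
                               torch.utils.data.DataLoader(ds, batch_size=16)))
    models = cyclic_metafed(feds)
    print(f"trained {len(models)} personalized models without a server")
```

Note that no central server appears anywhere in the sketch: models move only around the ring, matching the serverless setting the abstract emphasizes, and raw data never leaves its federation.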
Pages: 16671-16682
Page count: 12
Related Papers
50 items in total
  • [21] Knowledge Distillation in Federated Learning: A Practical Guide
    Mora, Alessio
    Tenison, Irene
    Bellavista, Paolo
    Rish, Irina
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024, : 8188 - 8196
  • [22] FedDKD: Federated learning with decentralized knowledge distillation
    Li, Xinjia
    Chen, Boyu
    Lu, Wenlian
    APPLIED INTELLIGENCE, 2023, 53 (15) : 18547 - 18563
  • [23] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020, : 163 - 167
  • [25] Parameterized Knowledge Transfer for Personalized Federated Learning
    Zhang, Jie
    Guo, Song
    Ma, Xiaosong
    Wang, Haozhao
Xu, Wenchao
    Wu, Feijie
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [26] Personalized Edge Intelligence via Federated Self-Knowledge Distillation
    Jin, Hai
    Bai, Dongshan
    Yao, Dezhong
    Dai, Yutong
    Gu, Lin
    Yu, Chen
    Sun, Lichao
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2023, 34 (02) : 567 - 580
  • [27] Cyclic Federated Learning Method Based on Distribution Information Sharing and Knowledge Distillation for Medical Data
    Yu, Liang
    Huang, Jianjun
    ELECTRONICS, 2022, 11 (23)
  • [28] Resource-Aware Knowledge Distillation for Federated Learning
    Chen, Zheyi
    Tian, Pu
    Liao, Weixian
    Chen, Xuhui
    Xu, Guobin
    Yu, Wei
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2023, 11 (03) : 706 - 719
  • [29] DECENTRALIZED FEDERATED LEARNING VIA MUTUAL KNOWLEDGE DISTILLATION
    Huang, Yue
    Kong, Lanju
    Li, Qingzhong
    Zhang, Baochen
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 342 - 347
  • [30] Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation
    Mohammed, Malik Naik
    Zhang, Xinyue
    Valero, Maria
    Xie, Ying
    2023 IEEE/ACM CONFERENCE ON CONNECTED HEALTH: APPLICATIONS, SYSTEMS AND ENGINEERING TECHNOLOGIES, CHASE, 2023, : 207 - 208