Selective knowledge sharing for privacy-preserving federated distillation without a good teacher

Cited by: 13
Authors
Shao, Jiawei [1 ]
Wu, Fangzhao [2 ]
Zhang, Jun [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Hong Kong, Peoples R China
[2] Microsoft Res Asia, Beijing, Peoples R China
Keywords
DOI
10.1038/s41467-023-44383-9
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline codes
07; 0710; 09
Abstract
While federated learning (FL) is promising for efficient collaborative learning without revealing local data, it remains vulnerable to white-box privacy attacks, suffers from high communication overhead, and struggles to adapt to heterogeneous models. Federated distillation (FD) emerges as an alternative paradigm that tackles these challenges by transferring knowledge among clients instead of model parameters. Nevertheless, challenges arise from variations in local data distributions and the absence of a well-trained teacher model, which lead to misleading and ambiguous knowledge sharing that significantly degrades model performance. To address these issues, this paper proposes a selective knowledge sharing mechanism for FD, termed Selective-FD, to identify accurate and precise knowledge from local and ensemble predictions, respectively. Empirical studies, backed by theoretical insights, demonstrate that our approach enhances the generalization capabilities of the FD framework and consistently outperforms baseline methods. We anticipate our study to enable a privacy-preserving, communication-efficient, and heterogeneity-adaptive federated training framework. Here, the authors show a federated distillation method that tackles these challenges by leveraging the strengths of knowledge distillation in a federated learning setting.
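To make the mechanism concrete, below is a minimal, illustrative Python sketch of selective knowledge sharing in federated distillation: clients share soft predictions on a shared unlabeled proxy set rather than model parameters, a client-side selector keeps only confident local predictions, and a server-side selector keeps only unambiguous ensemble predictions as distillation targets. The confidence and entropy thresholds, the random stand-in predictions, and the selector criteria are simplifying assumptions for illustration, not the paper's exact selector design.

```python
# Illustrative sketch of selective knowledge sharing in federated distillation.
# The selectors below (confidence and entropy thresholds) are simplified
# placeholders, not the exact mechanism proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)
NUM_CLIENTS, NUM_PROXY, NUM_CLASSES = 5, 100, 10
CONF_THRESH = 0.6     # client-side selector: share only confident local predictions
ENTROPY_THRESH = 1.0  # server-side selector: drop ambiguous ensemble predictions

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Stand-in for each client's model outputs on a shared unlabeled proxy set.
client_logits = rng.normal(size=(NUM_CLIENTS, NUM_PROXY, NUM_CLASSES))
client_probs = softmax(client_logits)

# Client-side selection: a client only shares predictions it is confident about,
# filtering knowledge that would likely be misleading under non-IID local data.
mask = client_probs.max(axis=-1) >= CONF_THRESH            # (clients, proxy)

# Server-side aggregation over the shared predictions only.
weights = mask[..., None].astype(float)                    # (clients, proxy, 1)
ensemble = (client_probs * weights).sum(axis=0) / np.clip(
    weights.sum(axis=0), 1e-8, None)                       # (proxy, classes)

# Server-side selection: keep only unambiguous (low-entropy) ensemble
# predictions as soft labels for local distillation.
entropy = -(ensemble * np.log(ensemble + 1e-12)).sum(axis=-1)
keep = mask.any(axis=0) & (entropy <= ENTROPY_THRESH)
soft_labels = ensemble[keep]
print(f"{keep.sum()} of {NUM_PROXY} proxy samples yield distillation targets")
```

In this sketch, each client would then distill from `soft_labels` on the retained proxy samples with its own architecture, which is what makes the scheme communication-efficient and model-heterogeneity-friendly.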
Pages: 11
Related papers
50 records in total
  • [1] Selective knowledge sharing for privacy-preserving federated distillation without a good teacher
    Jiawei Shao
    Fangzhao Wu
    Jun Zhang
    Nature Communications, 15
  • [2] Federated Knowledge Recycling: Privacy-preserving synthetic data sharing
    Lomurno, Eugenio
    Matteucci, Matteo
    PATTERN RECOGNITION LETTERS, 2025, 191 : 124 - 130
  • [3] Privacy-Preserving Federated Data Sharing
    Fioretto, Ferdinando
    Van Hentenryck, Pascal
    AAMAS '19: PROCEEDINGS OF THE 18TH INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS AND MULTIAGENT SYSTEMS, 2019, : 638 - 646
  • [4] FedGKD: Federated Graph Knowledge Distillation for privacy-preserving rumor detection
    Zheng, Peng
    Dou, Yong
    Yan, Yeqing
    KNOWLEDGE-BASED SYSTEMS, 2024, 304
  • [5] Ensemble Attention Distillation for Privacy-Preserving Federated Learning
    Gong, Xuan
    Sharma, Abhishek
    Karanam, Srikrishna
    Wu, Ziyan
    Chen, Terrence
    Doermann, David
    Innanje, Arun
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 15056 - 15066
  • [6] Federated Learning With Privacy-Preserving Ensemble Attention Distillation
    Gong, Xuan
    Song, Liangchen
    Vedula, Rishi
    Sharma, Abhishek
    Zheng, Meng
    Planche, Benjamin
    Innanje, Arun
    Chen, Terrence
    Yuan, Junsong
    Doermann, David
    Wu, Ziyan
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2023, 42 (07) : 2057 - 2067
  • [7] Complementary Knowledge Distillation for Robust and Privacy-Preserving Model Serving in Vertical Federated Learning
    Gao, Dashan
    Wan, Sheng
    Fan, Lixin
    Yao, Xin
    Yang, Qiang
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 18, 2024, : 19832 - 19839
  • [8] Privacy-Preserving Federated Distillation GAN for CIDSs in Industrial CPSs
    Liang, Junwei
    Sadiq, Muhammad
    Cai, Tie
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 5512 - 5517
  • [9] Privacy-Preserving Heterogeneous Personalized Federated Learning With Knowledge
    Pan, Yanghe
    Su, Zhou
    Ni, Jianbing
    Wang, Yuntao
    Zhou, Jinhao
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (06): : 5969 - 5982
  • [10] PFDP: privacy-preserving federated distillation method for pretrained language models
    Chen, Chaomeng
    Su, Sen
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2025,