FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation

Cited: 6
Authors
Chen, Leiming [1 ]
Zhang, Weishan [1 ]
Dong, Cihao [1 ]
Zhao, Dehai [2 ]
Zeng, Xingjie [3 ]
Qiao, Sibo [4 ]
Zhu, Yichang [1 ]
Tan, Chee Wei [5 ]
Affiliations
[1] China Univ Petr East China, Sch Comp Sci & Technol, Qingdao 266580, Peoples R China
[2] CSIRO Data61, Sydney 2015, Australia
[3] Southwest Petr Univ, Sch Comp Sci, Chengdu 610500, Peoples R China
[4] Tiangong Univ, Sch Software, Tianjin 300387, Peoples R China
[5] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
Keywords
heterogeneous federated learning; adaptive knowledge distillation; malicious client identification; trustworthy knowledge aggregation
DOI
10.3390/e26010096
Chinese Library Classification
O4 [Physics]
Subject Classification Code
0702
Abstract
Federated learning allows multiple parties to train models jointly while protecting user privacy. However, traditional federated learning requires every client to share the same model structure so that a global model can be fused. In real-world scenarios, each client may need a personalized model suited to its own environment, which makes federated learning difficult in a heterogeneous-model setting. Knowledge distillation methods address heterogeneous model fusion to some extent, but they assume that every client is trustworthy: some clients may contribute malicious or low-quality knowledge, making it difficult to aggregate trustworthy knowledge in a heterogeneous environment. To address these challenges, we propose a trustworthy heterogeneous federated learning framework (FedTKD) that achieves malicious client identification and trustworthy knowledge fusion. First, we propose a malicious client identification method based on client logit features, which excludes malicious information when fusing the global logits. Second, we propose a selective knowledge fusion method to compute high-quality global logits. Additionally, we propose an adaptive knowledge distillation method that improves the accuracy of knowledge transfer from the server side to the client side. Finally, we design different attack and data-distribution scenarios to validate our method. The experiments show that our method outperforms the baseline methods, performing stably across all attack scenarios and improving accuracy by 2% to 3% under different data distributions.
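Since the abstract only outlines the three server-side steps, the following is a minimal NumPy sketch of one plausible reading of them, assuming clients report logits on a shared public proxy set. The function names (identify_benign_clients, fuse_global_logits, adaptive_kd_loss), the cosine-to-median filtering rule, and the agreement-based adaptation weight are illustrative assumptions, not FedTKD's published algorithm.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax along the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def identify_benign_clients(client_logits, sim_threshold=0.2):
    """Keep clients whose logit profile on the shared proxy set stays close
    (by cosine similarity) to the element-wise median profile of the cohort.

    client_logits: (n_clients, n_samples, n_classes) logits that each client
    reports on the same public proxy samples.
    Returns (indices of clients treated as benign, all similarity scores).
    """
    ref = np.median(client_logits, axis=0).reshape(-1)   # robust reference
    sims = np.array([
        v @ ref / (np.linalg.norm(v) * np.linalg.norm(ref) + 1e-12)
        for v in client_logits.reshape(len(client_logits), -1)
    ])
    return np.where(sims >= sim_threshold)[0], sims      # threshold is tunable

def fuse_global_logits(client_logits, benign_idx, sims):
    """Selective fusion: similarity-weighted average over benign clients only."""
    w = sims[benign_idx] / sims[benign_idx].sum()
    return np.tensordot(w, client_logits[benign_idx], axes=1)

def adaptive_kd_loss(student_logits, global_logits, T=3.0):
    """Distillation loss KL(teacher || student) whose weight shrinks as the
    student's predictions start agreeing with the fused global logits
    (the adaptation rule here is illustrative, not the paper's exact one)."""
    p_t = softmax(global_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1).mean()
    agreement = (p_t.argmax(-1) == p_s.argmax(-1)).mean()
    return (1.0 - agreement) * (T ** 2) * kl

# Toy round: 5 clients, 32 proxy samples, 10 classes; client 4 is malicious.
rng = np.random.default_rng(0)
logits = rng.normal(size=(5, 32, 10))
logits[4] *= -5.0                                 # adversarially flipped logits
benign, sims = identify_benign_clients(logits)
global_logits = fuse_global_logits(logits, benign, sims)
loss = adaptive_kd_loss(logits[0], global_logits)
```

The element-wise median gives a reference profile that a single adversary cannot shift much, so a client that uploads flipped or rescaled logits receives a near-zero similarity and is excluded before fusion; honest clients then contribute in proportion to their agreement with the cohort.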
Pages: 31
Related Papers
50 records in total
  • [21] Personalized Federated Learning for Heterogeneous Edge Device: Self-Knowledge Distillation Approach
    Singh, Neha
    Rupchandani, Jatin
    Adhikari, Mainak
    IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2024, 70 (01) : 4625 - 4632
  • [22] TrustAL: Trustworthy Active Learning Using Knowledge Distillation
    Kwak, Beong-woo
    Kim, Youngwook
    Kim, Yu Jin
    Hwang, Seung-won
    Yeo, Jinyoung
THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 7263 - 7271
  • [23] A federated learning framework based on transfer learning and knowledge distillation for targeted advertising
    Su, Caiyu
    Wei, Jinri
    Lei, Yuan
    Li, Jiahui
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [24] A Personalized Federated Learning Algorithm Based on Meta-Learning and Knowledge Distillation
    Sun Y.
    Shi Y.
    Wang Z.
    Li M.
    Si P.
Beijing Youdian Daxue Xuebao/Journal of Beijing University of Posts and Telecommunications, 2023, 46 (01) : 12 - 18
  • [25] WHEN FEDERATED LEARNING MEETS KNOWLEDGE DISTILLATION
    Pang, Xiaoyi
    Hu, Jiahui
    Sun, Peng
    Ren, Ju
    Wang, Zhibo
    IEEE WIRELESS COMMUNICATIONS, 2024, 31 (05) : 208 - 214
  • [26] Knowledge Distillation in Federated Learning: A Practical Guide
    Mora, Alessio
    Tenison, Irene
    Bellavista, Paolo
    Rish, Irina
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024, : 8188 - 8196
  • [27] Personalized Decentralized Federated Learning with Knowledge Distillation
    Jeong, Eunjeong
    Kountouris, Marios
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1982 - 1987
  • [28] FedDKD: Federated learning with decentralized knowledge distillation
    Li, Xinjia
    Chen, Boyu
    Lu, Wenlian
    APPLIED INTELLIGENCE, 2023, 53 (15) : 18547 - 18563
  • [30] A Federated Domain Adaptation Algorithm Based on Knowledge Distillation and Contrastive Learning
    HUANG Fang
    FANG Zhijun
    SHI Zhicai
    ZHUANG Lehui
    LI Xingchen
    HUANG Bo
Wuhan University Journal of Natural Sciences, 2022, 27 (06) : 499 - 507