50 entries in total
- [42] Preservation of the Global Knowledge by Not-True Distillation in Federated Learning. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
- [43] Resource Allocation for Federated Knowledge Distillation Learning in Internet of Drones. IEEE Internet of Things Journal, 2025, 12(7): 8064-8074.
- [44] Data-Free Knowledge Distillation for Heterogeneous Federated Learning. International Conference on Machine Learning (ICML), Vol. 139, 2021.
- [45] Knowledge-Aware Parameter Coaching for Personalized Federated Learning. Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol. 38, No. 15, 2024: 17069-17077.
- [46] Privacy-Preserving Heterogeneous Personalized Federated Learning With Knowledge. IEEE Transactions on Network Science and Engineering, 2024, 11(6): 5969-5982.
- [47] Robust Multi-model Personalized Federated Learning via Model Distillation. Algorithms and Architectures for Parallel Processing (ICA3PP 2021), Part III, 2022, 13157: 432-446.
- [50] A Network Resource Aware Federated Learning Approach using Knowledge Distillation. IEEE Conference on Computer Communications Workshops (IEEE INFOCOM WKSHPS 2021), 2021.