- [43] Preservation of the Global Knowledge by Not-True Distillation in Federated Learning. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022.
- [44] Resource Allocation for Federated Knowledge Distillation Learning in Internet of Drones. IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (07): 8064-8074.
- [45] Data-Free Knowledge Distillation for Heterogeneous Federated Learning. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021.
- [46] Communication-Efficient Personalized Federated Edge Learning for Decentralized Sensing in ISAC. 2023 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS (ICC WORKSHOPS), 2023: 207-212.
- [47] Knowledge-Aware Parameter Coaching for Personalized Federated Learning. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38, NO 15, 2024: 17069-17077.
- [48] Privacy-Preserving Heterogeneous Personalized Federated Learning With Knowledge. IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (06): 5969-5982.
- [49] Robust Multi-model Personalized Federated Learning via Model Distillation. ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING (ICA3PP 2021), PT III, 2022, 13157: 432-446.