Total: 50 entries
- [1] Efficient Knowledge Distillation from an Ensemble of Teachers. 18th Annual Conference of the International Speech Communication Association (INTERSPEECH 2017), 2017: 3697-3701
- [2] Communication-efficient Federated Learning for UAV Networks with Knowledge Distillation and Transfer Learning. IEEE Global Communications Conference (GLOBECOM), 2023: 5739-5744
- [4] Periodic Intra-ensemble Knowledge Distillation for Reinforcement Learning. Machine Learning and Knowledge Discovery in Databases, 2021, 12975: 87-103
- [6] Feature-Level Ensemble Knowledge Distillation for Aggregating Knowledge from Multiple Networks. ECAI 2020: 24th European Conference on Artificial Intelligence, 2020, 325: 1411-1418
- [7] Improved knowledge distillation method with curriculum learning paradigm. Jisuanji Jicheng Zhizao Xitong/Computer Integrated Manufacturing Systems (CIMS), 2022, 28(07): 2075-2082
- [10] Learning Efficient Object Detection Models with Knowledge Distillation. Advances in Neural Information Processing Systems 30 (NIPS 2017), 2017, 30