50 records in total
- [1] Elastic Deep Learning Using Knowledge Distillation with Heterogeneous Computing Resources. Euro-Par 2021: Parallel Processing Workshops, 2022, 13098: 116-128
- [2] Training Heterogeneous Client Models Using Knowledge Distillation in Serverless Federated Learning. 39th Annual ACM Symposium on Applied Computing (SAC 2024), 2024: 997-1006
- [4] Data-Free Knowledge Distillation for Heterogeneous Federated Learning. International Conference on Machine Learning (ICML), 2021, 139
- [5] Heterogeneous Knowledge Distillation Using Information Flow Modeling. 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020: 2336-2345
- [9] A Prototype-Based Knowledge Distillation Framework for Heterogeneous Federated Learning. 2023 IEEE 43rd International Conference on Distributed Computing Systems (ICDCS), 2023: 37-47
- [10] Keyword Spotting with Synthetic Data Using Heterogeneous Knowledge Distillation. Interspeech 2022, 2022: 1397-1401