50 records in total
- [22] Communication Traffic Prediction with Continual Knowledge Distillation. IEEE International Conference on Communications (ICC 2022), 2022: 5481-5486
- [23] Embracing the Dark Knowledge: Domain Generalization Using Regularized Knowledge Distillation. Proceedings of the 29th ACM International Conference on Multimedia (MM 2021), 2021: 2595-2604
- [24] Knowledge Distillation for Semi-supervised Domain Adaptation. OR 2.0 Context-Aware Operating Theaters and Machine Learning in Clinical Neuroimaging, 2019, 11796: 68-76
- [25] MSTNet-KD: Multilevel Transfer Networks Using Knowledge Distillation for the Dense Prediction of Remote-Sensing Images. IEEE Transactions on Geoscience and Remote Sensing, 2024, 62: 1-12
- [27] Representation Learning and Knowledge Distillation for Lightweight Domain Adaptation. 2024 IEEE Conference on Artificial Intelligence (CAI 2024), 2024: 1202-1207
- [28] DaFKD: Domain-aware Federated Knowledge Distillation. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 20412-20421
- [30] Joint Progressive Knowledge Distillation and Unsupervised Domain Adaptation. 2020 International Joint Conference on Neural Networks (IJCNN), 2020