- [31] Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer. Knowledge Science, Engineering and Management, Pt I, 2021, 12815: 553-565
- [34] Augmenting Knowledge Distillation with Peer-to-Peer Mutual Learning for Model Compression. 2022 IEEE International Symposium on Biomedical Imaging (IEEE ISBI 2022), 2022
- [35] Model Compression by Iterative Pruning with Knowledge Distillation and Its Application to Speech Enhancement. Interspeech 2022, 2022: 941-945
- [36] Accumulation Knowledge Distillation for Conditional GAN Compression. 2023 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 2023: 1294-1303
- [37] Adaptive Contrastive Knowledge Distillation for BERT Compression. Findings of the Association for Computational Linguistics (ACL 2023), 2023: 8941-8953
- [39] Attention-Fused CNN Model Compression with Knowledge Distillation for Brain Tumor Segmentation. Medical Image Understanding and Analysis, MIUA 2022, 2022, 13413: 328-338
- [40] Compression of Time Series Classification Model MC-MHLF Using Knowledge Distillation. 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2021: 22-27