共 50 条
- [21] DISCOVER THE EFFECTIVE STRATEGY FOR FACE RECOGNITION MODEL COMPRESSION BY IMPROVED KNOWLEDGE DISTILLATION 2018 25TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2018, : 2416 - 2420
- [22] Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, 2021, 12815 : 553 - 565
- [24] Knowledge Distillation for Sequence Model 19TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2018), VOLS 1-6: SPEECH RESEARCH FOR EMERGING MARKETS IN MULTILINGUAL SOCIETIES, 2018, : 3703 - 3707
- [26] AUGMENTING KNOWLEDGE DISTILLATION WITH PEER-TO-PEER MUTUAL LEARNING FOR MODEL COMPRESSION 2022 IEEE INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (IEEE ISBI 2022), 2022,
- [27] Model compression via pruning and knowledge distillation for person re-identification Journal of Ambient Intelligence and Humanized Computing, 2021, 12 : 2149 - 2161
- [28] Model Compression by Iterative Pruning with Knowledge Distillation and Its Application to Speech Enhancement INTERSPEECH 2022, 2022, : 941 - 945
- [29] Attention-Fused CNN Model Compression with Knowledge Distillation for Brain Tumor Segmentation MEDICAL IMAGE UNDERSTANDING AND ANALYSIS, MIUA 2022, 2022, 13413 : 328 - 338
- [30] Compression of Time Series Classification Model MC-MHLF using Knowledge Distillation 2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2021, : 22 - 27