50 items in total
- [41] Heterogeneous Knowledge Distillation Using Information Flow Modeling. 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020: 2336-2345
- [43] Bag-of-Features-Based Knowledge Distillation for Lightweight Convolutional Neural Networks. 2022 IEEE International Conference on Image Processing (ICIP), 2022: 1541-1545
- [44] Conditional Response Augmentation for Dialogue Using Knowledge Distillation. INTERSPEECH 2020, 2020: 3890-3894
- [45] Improving the Accuracy of Pruned Network Using Knowledge Distillation. Pattern Analysis and Applications, 2021, 24: 819-830
- [46] Improving Neural Topic Models Using Knowledge Distillation. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 1752-1771
- [48] Correction to: Embedded Mutual Learning: A Novel Online Distillation Method Integrating Diverse Knowledge Sources. Applied Intelligence, 2023, 53: 17240
- [49] Transpose and Mask: Simple and Effective Logit-Based Knowledge Distillation for Multi-attribute and Multi-label Classification. Pattern Recognition and Computer Vision, PRCV 2023, Part X, 2024, 14434: 273-284