50 records in total
- [3] Neural Compatibility Modeling with Attentive Knowledge Distillation. ACM/SIGIR Proceedings 2018, 2018: 5-14
- [4] Leveraging logit uncertainty for better knowledge distillation. Scientific Reports, 2024, 14(1)
- [5] DistillGrasp: Integrating Features Correlation With Knowledge Distillation for Depth Completion of Transparent Objects. IEEE Robotics and Automation Letters, 2024, 9(10): 8945-8952
- [6] Frustratingly Easy Knowledge Distillation via Attentive Similarity Matching. 2022 26th International Conference on Pattern Recognition (ICPR), 2022: 2357-2363
- [8] Knowledge Distillation with Category-Aware Attention and Discriminant Logit Losses. 2019 IEEE International Conference on Multimedia and Expo (ICME), 2019: 1792-1797
- [9] Convolution Attentive Knowledge Tracing with comprehensive behavioral features. Proceedings of the ACM Turing Award Celebration Conference-China 2024 (ACM-TURC 2024), 2024: 48-52