50 records in total
- [32] Learning Semantic Textual Similarity via Multi-Teacher Knowledge Distillation: A Multiple Data Augmentation Method. 2024 9th International Conference on Computer and Communication Systems (ICCCS 2024), 2024: 1197–1203
- [34] CIMTD: Class Incremental Multi-Teacher Knowledge Distillation for Fractal Object Detection. Pattern Recognition and Computer Vision (PRCV 2024), Pt XII, 2025, 15042: 51–65
- [35] A Multi-Teacher Assisted Knowledge Distillation Approach for Enhanced Face Image Authentication. Proceedings of the 2023 ACM International Conference on Multimedia Retrieval (ICMR 2023), 2023: 135–143
- [37] Enhancing BERT Performance: Multi-teacher Adversarial Distillation with Clean and Robust Guidance. Conceptual Modeling (ER 2024), 2025, 15238: 3–17
- [38] FM-LiteLearn: A Lightweight Brain Tumor Classification Framework Integrating Image Fusion and Multi-teacher Distillation Strategies. Artificial Intelligence in Healthcare, Pt II (AIIH 2024), 2024, 14976: 89–103