50 entries in total
- [44] Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation. International Conference on Machine Learning (ICML), Vol. 202, 2023.
- [46] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System. Proceedings of the 13th International Conference on Web Search and Data Mining (WSDM '20), 2020, pp. 690–698.
- [47] MTUW-GAN: A Multi-Teacher Knowledge Distillation Generative Adversarial Network for Underwater Image Enhancement. Applied Sciences-Basel, 2024, 14(2).
- [48] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation. Pattern Recognition and Computer Vision (PRCV 2023), Part VIII, 2024, Vol. 14432, pp. 28–41.
- [49] Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks. 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023, pp. 6424–6432.
- [50] Learning Semantic Textual Similarity via Multi-Teacher Knowledge Distillation: A Multiple Data Augmentation Method. 2024 9th International Conference on Computer and Communication Systems (ICCCS), 2024, pp. 1197–1203.