- [21] Improving Bi-encoder Document Ranking Models with Two Rankers and Multi-teacher Distillation. Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '21), 2021: 2192-2196.
- [22] MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution. Computer Vision - ECCV 2024, Pt XXXIX, 2025, 15097: 364-382.
- [23] Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation. Physics in Medicine and Biology, 2024, 69(18).
- [24] Let All Be Whitened: Multi-Teacher Distillation for Efficient Visual Retrieval. Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol 38 No 5, 2024: 4126-4135.
- [26] MTKDSR: Multi-Teacher Knowledge Distillation for Super Resolution Image Reconstruction. 2022 26th International Conference on Pattern Recognition (ICPR), 2022: 352-358.
- [28] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation. Pattern Recognition and Computer Vision, PRCV 2023, Pt VIII, 2024, 14432: 28-41.
- [29] Unsupervised Domain Adaptation in Medical Image Segmentation via Fourier Feature Decoupling and Multi-teacher Distillation. Advanced Intelligent Computing Technology and Applications, Pt VI, ICIC 2024, 2024, 14867: 98-110.