共 50 条
- [31] A Task-Efficient Gradient Guide Knowledge Distillation for Pre-train Language Model Compression ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT III, ICIC 2024, 2024, 14877 : 366 - 377
- [33] KD-MRI: A knowledge distillation framework for image reconstruction and image restoration in MRI workflow MEDICAL IMAGING WITH DEEP LEARNING, VOL 121, 2020, 121 : 515 - 526
- [34] Model Compression Based on Knowledge Distillation and Its Application in HRRP PROCEEDINGS OF 2020 IEEE 4TH INFORMATION TECHNOLOGY, NETWORKING, ELECTRONIC AND AUTOMATION CONTROL CONFERENCE (ITNEC 2020), 2020, : 1268 - 1272
- [35] PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation INTERSPEECH 2021, 2021, : 4568 - 4572
- [38] DiffIR: Efficient Diffusion Model for Image Restoration 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 13049 - 13059
- [40] Self Regulated Learning Mechanism for Data Efficient Knowledge Distillation 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,