共 50 条
- [21] Multi-Granularity Structural Knowledge Distillation for Language Model Compression PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 1001 - 1011
- [22] Efficient Neural Data Compression for Machine Type Communications via Knowledge Distillation 2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 1169 - 1174
- [23] Model Conversion via Differentially Private Data-Free Distillation PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 2187 - 2195
- [25] Compressing Visual-linguistic Model via Knowledge Distillation 2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 1408 - 1418
- [27] Differentially Private Knowledge Distillation for Mobile Analytics PROCEEDINGS OF THE 43RD INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '20), 2020, : 1809 - 1812
- [28] Mitigating carbon footprint for knowledge distillation based deep learning model compression PLOS ONE, 2023, 18 (05):
- [30] DISCOVER THE EFFECTIVE STRATEGY FOR FACE RECOGNITION MODEL COMPRESSION BY IMPROVED KNOWLEDGE DISTILLATION 2018 25TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2018, : 2416 - 2420