- [11] Knowledge Distillation Beyond Model Compression. 2020 25th International Conference on Pattern Recognition (ICPR), 2021, pp. 6136-6143.
- [12] Contrastive Visual Semantic Pretraining Magnifies the Semantics of Natural Language Representations. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1 (Long Papers), 2022, pp. 3050-3061.
- [13] SCL-IKD: Intermediate Knowledge Distillation via Supervised Contrastive Representation Learning. Applied Intelligence, 2023, 53: 28520-28541.
- [15] Meta-KD: A Meta Knowledge Distillation Framework for Language Model Compression across Domains. 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Vol. 1, 2021, pp. 3026-3036.
- [16] Contrastive Model Inversion for Data-Free Knowledge Distillation. Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI 2021), 2021, pp. 2374-2380.
- [17] Model Selection - Knowledge Distillation Framework for Model Compression. 2021 IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2021), 2021.
- [18] ContraCLM: Contrastive Learning for Causal Language Model. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Vol. 1, 2023, pp. 6436-6459.
- [19] Effective Compression of Language Models by Combining Pruning and Knowledge Distillation. 2024 IEEE 48th Annual Computers, Software, and Applications Conference (COMPSAC 2024), 2024, pp. 429-438.
- [20] Patient Knowledge Distillation for BERT Model Compression. 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP 2019): Proceedings of the Conference, 2019, pp. 4323-4332.