- [31] Knowledge-Augmented Reasoning Distillation for Small Language Models in Knowledge-Intensive Tasks. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [32] Knowledge Representing: Efficient, Sparse Representation of Prior Knowledge for Knowledge Distillation. 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2019), 2019: 648-656.
- [34] Causal Distillation for Language Models. NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022: 4288-4295.
- [35] Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights. Natural Language Processing and Information Systems, Pt I, NLDB 2024, 2024, 14762: 32-46.
- [36] Knowledge Base Grounded Pre-trained Language Models via Distillation. 39th Annual ACM Symposium on Applied Computing, SAC 2024, 2024: 1617-1625.
- [37] Mixture-of-Linguistic-Experts Adapters for Improving and Interpreting Pre-trained Language Models. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 9456-9469.