50 entries in total
- [1] Why Do Pretrained Language Models Help in Downstream Tasks? An Analysis of Head and Prompt Tuning. Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021.
- [2] Evaluation of Pretrained Large Language Models in Embodied Planning Tasks. Artificial General Intelligence (AGI 2023), 2023, 13921: 222-232.
- [6] Data Augmentation for Spoken Language Understanding via Pretrained Language Models. Interspeech 2021, 2021: 1219-1223.
- [7] Visually-augmented Pretrained Language Models for NLP Tasks without Images. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023): Long Papers, Vol. 1, 2023: 14912-14929.
- [9] From Pretraining Data to Language Models to Downstream Tasks: Tracking the Trails of Political Biases Leading to Unfair NLP Models. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023): Long Papers, Vol. 1, 2023: 11737-11762.