50 entries in total
- [1] Are Pre-trained Convolutions Better than Pre-trained Transformers? 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Vol. 1, 2021: 4349-4359
- [2] Conditional Pre-trained Attention Based Chinese Question Generation. Concurrency and Computation: Practice & Experience, 2021, 33 (20)
- [3] Scalable Educational Question Generation with Pre-trained Language Models. Artificial Intelligence in Education (AIED 2023), 2023, 13916: 327-339
- [4] Can LLMs Facilitate Interpretation of Pre-trained Language Models? 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023), 2023: 3248-3268
- [5] An Extensive Study on Pre-trained Models for Program Understanding and Generation. Proceedings of the 31st ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA 2022), 2022: 39-51
- [6] UniRaG: Unification, Retrieval, and Generation for Multimodal Question Answering With Pre-Trained Language Models. IEEE Access, 2024, 12: 71505-71519
- [7] Pre-trained Language Model for Biomedical Question Answering. Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2019), Pt. II, 2020, 1168: 727-740