共 50 条
- [1] Grounding Dialogue History: Strengths and Weaknesses of Pre-trained Transformers AIXIA 2020 - ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 12414 : 263 - 279
- [2] Are Pre-trained Convolutions Better than Pre-trained Transformers? 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021, : 4349 - 4359
- [3] Calibration of Pre-trained Transformers PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 295 - 302
- [4] Knowledge Grounded Pre-Trained Model For Dialogue Response Generation 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
- [5] CopiFilter: An Auxiliary Module Adapts Pre-trained Transformers for Medical Dialogue Summarization ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT IV, 2023, 14257 : 99 - 114
- [6] Emergent Modularity in Pre-trained Transformers FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 4066 - 4083
- [9] Knowledge-Grounded Dialogue Generation with Pre-trained Language Models PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 3377 - 3390
- [10] Unsupervised Out-of-Domain Detection via Pre-trained Transformers 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1 (ACL-IJCNLP 2021), 2021, : 1052 - 1061