共 50 条
- [42] Controlling the Focus of Pretrained Language Generation Models FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 3291 - 3306
- [43] Language Recognition Based on Unsupervised Pretrained Models INTERSPEECH 2021, 2021, : 3271 - 3275
- [44] Fooling MOSS Detection with Pretrained Language Models PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 2933 - 2943
- [45] Factual Consistency of Multilingual Pretrained Language Models FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 3046 - 3052
- [47] Pretrained Language Models for Text Generation: A Survey PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 4492 - 4499
- [48] Pretrained Language Models for Sequential Sentence Classification 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 3693 - 3699