50 entries in total
- [41] ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning. 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021: 3350-3363.
- [42] Probing Pre-trained Auto-regressive Language Models for Named Entity Typing and Recognition. LREC 2022: THIRTEENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022: 1408-1417.
- [43] TOKEN Is a MASK: Few-shot Named Entity Recognition with Pre-trained Language Models. TEXT, SPEECH, AND DIALOGUE (TSD 2022), 2022, 13502: 138-150.
- [44] A Study of Pre-trained Language Models in Natural Language Processing. 2020 IEEE INTERNATIONAL CONFERENCE ON SMART CLOUD (SMARTCLOUD 2020), 2020: 116-121.
- [45] An Opinion Summarization-Evaluation System Based on Pre-trained Models. ROUGH SETS (IJCRS 2021), 2021, 12872: 225-230.
- [46] Transfer Learning from Pre-trained Language Models Improves End-to-End Speech Summarization. INTERSPEECH 2023, 2023: 2943-2947.
- [47] From Cloze to Comprehension: Retrofitting Pre-trained Masked Language Models to Pre-trained Machine Reader. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023.
- [49] Probing Pre-Trained Language Models for Disease Knowledge. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021: 3023-3033.
- [50] Analyzing Individual Neurons in Pre-trained Language Models. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020: 4865-4880.