50 entries in total
- [22] Chinese Grammatical Correction Using BERT-based Pre-trained Model. Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing (AACL-IJCNLP 2020), 2020, pp. 163-168.
- [24] Pre-trained Convolutional Networks and Generative Statistical Models: A Comparative Study in Large Datasets. Pattern Recognition and Image Analysis (IbPRIA 2017), vol. 10255, 2017, pp. 69-75.
- [25] Learning to Remove: Towards Isotropic Pre-trained BERT Embedding. Artificial Neural Networks and Machine Learning (ICANN 2021), Part V, vol. 12895, 2021, pp. 448-459.
- [28] Transfer Learning from Pre-trained BERT for Pronoun Resolution. Proceedings of the First Workshop on Gender Bias in Natural Language Processing (GeBNLP 2019), 2019, pp. 82-88.
- [29] BERT is to NLP what AlexNet is to CV: Can Pre-Trained Language Models Identify Analogies? Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), vol. 1, 2021, pp. 3609-3624.
- [30] Is BERT a Cross-Disciplinary Knowledge Learner? A Surprising Finding of Pre-trained Models' Transferability. Findings of the Association for Computational Linguistics: EMNLP 2021, 2021, pp. 2195-2208.