共 50 条
- [43] An Extensive Study on Pre-trained Models for Program Understanding and Generation PROCEEDINGS OF THE 31ST ACM SIGSOFT INTERNATIONAL SYMPOSIUM ON SOFTWARE TESTING AND ANALYSIS, ISSTA 2022, 2022, : 39 - 51
- [44] TinyMIM: An Empirical Study of Distilling MIM Pre-trained Models 2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023, : 3687 - 3697
- [45] A Study of Pre-trained Language Models in Natural Language Processing 2020 IEEE INTERNATIONAL CONFERENCE ON SMART CLOUD (SMARTCLOUD 2020), 2020, : 116 - 121
- [48] BERT for Sentiment Analysis: Pre-trained and Fine-Tuned Alternatives COMPUTATIONAL PROCESSING OF THE PORTUGUESE LANGUAGE, PROPOR 2022, 2022, 13208 : 209 - 218
- [49] Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees 16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EACL 2021), 2021, : 3011 - 3020
- [50] Lattice-BERT: Leveraging Multi-Granularity Representations in Chinese Pre-trained Language Models 2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 1716 - 1731