Total: 50 records
- [22] Low-Resource Neural Machine Translation Using XLNet Pre-training Model. Artificial Neural Networks and Machine Learning (ICANN 2021), Part V, 2021, 12895: 503-514.
- [23] SPLAT: Speech-Language Joint Pre-Training for Spoken Language Understanding. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 1897-1907.
- [24] Character-Aware Low-Resource Neural Machine Translation with Weight Sharing and Pre-training. Chinese Computational Linguistics (CCL 2019), 2019, 11856: 321-333.
- [25] Unsupervised Corpus Aware Language Model Pre-training for Dense Passage Retrieval. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1: Long Papers, 2022: 2843-2853.
- [26] Subsampling of Frequent Words in Text for Pre-training a Vision-Language Model. Proceedings of the 1st Workshop on Large Generative Models Meet Multimodal Applications (LGM3A 2023), 2023: 61-67.
- [27] Speech Model Pre-training for End-to-End Spoken Language Understanding. Interspeech 2019, 2019: 814-818.
- [28] Hybrid Approach Text Generation for Low-Resource Language. Advances in Computational Collective Intelligence (ICCCI 2024), Part I, 2024, 2165: 256-268.
- [29] Fast and Efficient Multilingual Self-Supervised Pre-training for Low-Resource Speech Recognition. Interspeech 2023, 2023: 2248-2252.