24 references in total
- [1] CAO Y, LIU H, WAN X J., Jointly Learning to Align and Summarize for Neural Cross-Lingual Summarization, Proc of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 6220-6231, (2020)
- [2] CONNEAU A, WU S J, LI H R, et al., Emerging Cross-Lingual Structure in Pretrained Language Models, Proc of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 6022-6034, (2020)
- [3] BHATTACHARYA P, GOYAL P, SARKAR S., Using Communities of Words Derived from Multilingual Word Vectors for Cross-Language Information Retrieval in Indian Languages, ACM Transactions on Asian and Low-Resource Language Information Processing, 18(1), (2018)
- [4] JIANG Z L, EL-JAROUDI A, HARTMANN W, et al., Cross-Lingual Information Retrieval with BERT, Proc of the Workshop on Cross-Language Search and Summarization of Text and Speech, pp. 26-31, (2020)
- [5] DEVLIN J, CHANG M W, LEE K, et al., BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Proc of the Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Long and Short Papers), pp. 4171-4186, (2019)
- [6] CONNEAU A, KHANDELWAL K, GOYAL N, et al., Unsupervised Cross-Lingual Representation Learning at Scale, Proc of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 8440-8451, (2020)
- [7] LIU Y H, OTT M, GOYAL N, et al., RoBERTa: A Robustly Optimized BERT Pretraining Approach [C/OL]
- [8] YANG Z L, DAI Z H, YANG Y M, et al., XLNet: Generalized Autoregressive Pretraining for Language Understanding, Proc of the 33rd International Conference on Neural Information Processing Systems, pp. 5753-5763, (2019)
- [9] CHANG W C, YU F X, CHANG Y W, et al., Pre-training Tasks for Embedding-Based Large-Scale Retrieval [C/OL]
- [10] NOGUEIRA R, CHO K., Passage Re-ranking with BERT