50 entries in total
- [41] BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 2281-2290
- [42] An empirical study of pre-trained language models in simple knowledge graph question answering. World Wide Web, 2023, 26: 2855-2886
- [44] Grounding Dialogue Systems via Knowledge Graph Aware Decoding with Pre-trained Transformers. The Semantic Web (ESWC 2021), 2021, 12731: 323-339
- [45] Using Noise and External Knowledge to Enhance Chinese Pre-trained Model. 2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI), 2022: 476-480
- [46] Acquiring Knowledge from Pre-Trained Model to Neural Machine Translation. Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI 2020), 2020, 34: 9266-9273
- [50] Multimodal Topic and Sentiment Recognition for Chinese Data Based on Pre-trained Encoders. Pattern Recognition and Computer Vision (PRCV 2023), Part VII, 2024, 14431: 323-334