50 entries total
- [21] BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 2281-2290.
- [22] An Empirical Study of Pre-trained Language Models in Simple Knowledge Graph Question Answering. World Wide Web, 2023, 26: 2855-2886.
- [24] An Empirical Study on Pre-trained Embeddings and Language Models for Bot Detection. Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019), 2019: 148-155.
- [25] Disentangling Semantics and Syntax in Sentence Embeddings with Pre-trained Language Models. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 1372-1379.
- [26] AID: Active Distillation Machine to Leverage Pre-Trained Black-Box Models in Private Data Settings. Proceedings of the World Wide Web Conference 2021 (WWW 2021), 2021: 3569-3581.
- [27] Exploiting Pre-Trained Network Embeddings for Recommendations in Social Networks. Journal of Computer Science and Technology, 2018, 33: 682-696.
- [29] Probing Simile Knowledge from Pre-trained Language Models. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1: Long Papers, 2022: 5875-5887.
- [30] Text-Augmented Open Knowledge Graph Completion via Pre-Trained Language Models. Findings of the Association for Computational Linguistics (ACL 2023), 2023: 11161-11180.