A Pre-trained Universal Knowledge Graph Reasoning Model Based on Rule Prompts

Cited: 0
Authors
Cui, Yuanning [1 ]
Sun, Zequn [1 ]
Hu, Wei [1 ]
Affiliations
[1] State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China
Funding
National Natural Science Foundation of China
DOI: 10.7544/issn1000-1239.202440133
Pages: 2030 - 2044
Related papers (50 total)
  • [41] ABCD rule and pre-trained CNNs for melanoma diagnosis
    Moura, Nayara
    Veras, Rodrigo
    Aires, Kelson
    Machado, Vinicius
    Silva, Romuere
    Araujo, Flavio
    Claro, Maila
    MULTIMEDIA TOOLS AND APPLICATIONS, 2019, 78 (06) : 6869 - 6888
  • [42] BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models
    He, Bin
    Zhou, Di
    Xiao, Jinghui
    Jiang, Xin
    Liu, Qun
    Yuan, Nicholas Jing
    Xu, Tong
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 2281 - 2290
  • [43] An empirical study of pre-trained language models in simple knowledge graph question answering
    Hu, Nan
    Wu, Yike
    Qi, Guilin
    Min, Dehai
    Chen, Jiaoyan
    Pan, Jeff Z.
    Ali, Zafar
    WORLD WIDE WEB, 2023, 26 : 2855 - 2886
  • [44] KG-prompt: Interpretable knowledge graph prompt for pre-trained language models
    Chen, Liyi
    Liu, Jie
    Duan, Yutai
    Wang, Runze
    KNOWLEDGE-BASED SYSTEMS, 2025, 311
  • [45] Grounding Dialogue Systems via Knowledge Graph Aware Decoding with Pre-trained Transformers
    Chaudhuri, Debanjan
    Rony, Md Rashad Al Hasan
    Lehmann, Jens
    SEMANTIC WEB, ESWC 2021, 2021, 12731 : 323 - 339
  • [46] MULTILINGUAL TEXT CLASSIFIER USING PRE-TRAINED UNIVERSAL SENTENCE ENCODER MODEL
    Orlovskiy, O. V.
    Sohrab, Khalili
    Ostapov, S. E.
    Hazdyuk, K. P.
    Shumylyak, L. M.
    RADIO ELECTRONICS COMPUTER SCIENCE CONTROL, 2022, (03) : 102 - 108
  • [47] Using Noise and External Knowledge to Enhance Chinese Pre-trained Model
    Ma, Haoyang
    Li, Zeyu
    Guo, Hongyu
    2022 IEEE 34TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE, ICTAI, 2022, : 476 - 480
  • [48] Acquiring Knowledge from Pre-Trained Model to Neural Machine Translation
    Weng, Rongxiang
    Yu, Heng
    Huang, Shujian
    Cheng, Shanbo
    Luo, Weihua
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 9266 - 9273
  • [50] Universal embedding for pre-trained models and data bench
    Cho, Namkyeong
    Cho, Taewon
    Shin, Jaesun
    Jeon, Eunjoo
    Lee, Taehee
    NEUROCOMPUTING, 2025, 619