A review of graph neural networks and pretrained language models for knowledge graph reasoning

Cited by: 1
|
Authors
Ma, Jiangtao [1 ,2 ]
Liu, Bo [2 ]
Li, Kunlin [2 ]
Li, Chenliang [3 ]
Zhang, Fan [4 ]
Luo, Xiangyang [5 ]
Qiao, Yaqiong [5 ,6 ]
Affiliations
[1] Tianjin Normal Univ, Coll Comp & Informat Engn, Tianjin 300387, Peoples R China
[2] Zhengzhou Univ Light Ind, Coll Comp Sci & Technol, Zhengzhou 450000, Peoples R China
[3] Wuhan Univ, Sch Cyber Sci & Engn, Wuhan 430079, Peoples R China
[4] Natl Digital Switching Syst Engn & Technol R&D Ctr, Zhengzhou 450001, Peoples R China
[5] State Key Lab Math Engn & Adv Comp, Zhengzhou 450001, Peoples R China
[6] Nankai Univ, Coll Cyber Sci, Tianjin 300350, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Knowledge graph reasoning; Graph neural networks; Pretrained language models; Logic rules;
DOI
10.1016/j.neucom.2024.128490
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A Knowledge Graph (KG) stores human knowledge as facts in an intuitive graph structure, but it is typically incomplete when constructed and cannot readily accommodate new knowledge. Knowledge Graph Reasoning (KGR) makes KGs more accurate, complete, and trustworthy, thereby better supporting various artificial intelligence applications. Currently, the most popular KGR methods are based on graph neural networks (GNNs). Recent studies have shown that incorporating logic rules and synergizing with pretrained language models (PLMs) can further enhance GNN-based KGR methods. These methods mainly address data sparsity, insufficient knowledge evolution patterns, multi-modal fusion, and few-shot reasoning. Although many studies have been conducted, few review papers comprehensively summarize and explore KGR methods related to GNNs, logic rules, and PLMs. Therefore, this paper provides a comprehensive review of GNNs and PLMs for KGR based on a large number of high-quality papers. To present a clear overview of KGR, we propose a general framework. Specifically, we first introduce KG preparation. Then we provide an overview of KGR methods, which we categorize into GNN-based, logic rule-enhanced, and PLM-enhanced methods. Furthermore, we compare and analyze GNN-based KGR methods in two scenarios, present applications of KGR in different fields, and finally discuss the current challenges and future research directions for KGR.
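To make the GNN-based KGR setting described in the abstract concrete, the following is a minimal, self-contained sketch of link prediction over triples: one relational message-passing step followed by a DistMult-style scorer. It is purely illustrative and not the method of this review or any surveyed paper; the toy entities, relations, and the untrained random embeddings are assumptions for demonstration only.

```python
# Illustrative sketch of GNN-based knowledge graph reasoning (link prediction).
# Assumption: a tiny toy KG with random, untrained embeddings, so the ranking
# below is arbitrary; the point is the structure of the computation, not results.
import numpy as np

rng = np.random.default_rng(0)

# Toy KG as (head, relation, tail) triples.
entities = ["paris", "france", "berlin", "germany"]
relations = ["capital_of", "located_in"]
triples = [("paris", "capital_of", "france"),
           ("berlin", "capital_of", "germany"),
           ("paris", "located_in", "france")]

e_idx = {e: i for i, e in enumerate(entities)}
r_idx = {r: i for i, r in enumerate(relations)}

dim = 8
E = rng.normal(size=(len(entities), dim))        # entity embeddings
R = rng.normal(size=(len(relations), dim))       # relation embeddings (DistMult diagonal)
W = rng.normal(size=(len(relations), dim, dim))  # per-relation message transform

def message_passing(E, triples):
    """One relational GNN layer: each head aggregates relation-transformed tail embeddings."""
    out = E.copy()
    for h, r, t in triples:
        out[e_idx[h]] += W[r_idx[r]] @ E[e_idx[t]]
    # Normalize so magnitudes stay comparable across entities.
    return out / np.linalg.norm(out, axis=1, keepdims=True)

def score(h, r, t, E):
    """DistMult plausibility score <e_h, diag(w_r), e_t>; higher = more plausible."""
    return float(np.sum(E[e_idx[h]] * R[r_idx[r]] * E[e_idx[t]]))

H = message_passing(E, triples)

# Rank candidate tails for the incomplete query (berlin, located_in, ?).
ranked = sorted(entities, key=lambda t: score("berlin", "located_in", t, H), reverse=True)
print(ranked)
```

In practice the embeddings and transforms would be trained with a ranking or cross-entropy loss over observed triples and negative samples; logic rules or PLM-derived text encodings can then be injected as additional messages or features, which is the design space this review surveys.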
Pages: 20
Related Papers
50 records in total
  • [21] Knowledge Distillation Improves Graph Structure Augmentation for Graph Neural Networks
    Wu, Lirong
    Lin, Haitao
    Huang, Yufei
    Li, Stan Z.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [22] Dual view graph transformer networks for multi-hop knowledge graph reasoning
    Sun, Congcong
    Chen, Jianrui
    Shao, Zhongshi
    Huang, Junjie
    NEURAL NETWORKS, 2025, 186
  • [23] Relgraph: A Multi-Relational Graph Neural Network Framework for Knowledge Graph Reasoning Based on Relation Graph
    Tian, Xin
    Meng, Yuan
    APPLIED SCIENCES-BASEL, 2024, 14 (07):
  • [24] Dynamic Reasoning with Language Model and Knowledge Graph for Question Answering
    Lu, Yujie
    Wu, Dean
    Zhang, Yuhong
    DOCUMENT ANALYSIS AND RECOGNITION-ICDAR 2024, PT IV, 2024, 14807 : 441 - 455
  • [25] Graph Neural Prompting with Large Language Models
    Tian, Yijun
    Song, Huan
    Wang, Zichen
    Wang, Haozhu
    Hu, Ziqing
    Wang, Fang
    Chawla, Nitesh V.
    Xu, Panpan
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 17, 2024, : 19080 - 19088
  • [26] On Representation Knowledge Distillation for Graph Neural Networks
    Joshi, Chaitanya K.
    Liu, Fayao
    Xun, Xu
    Lin, Jie
    Foo, Chuan Sheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 4656 - 4667
  • [27] Distilling Holistic Knowledge with Graph Neural Networks
    Zhou, Sheng
    Wang, Yucheng
    Chen, Defang
    Chen, Jiawei
    Wang, Xin
    Wang, Can
    Bu, Jiajun
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 10367 - 10376
  • [28] Overview of knowledge reasoning for knowledge graph
    Liu, Xinliang
    Mao, Tingyu
    Shi, Yanyan
    Ren, Yanzhao
    NEUROCOMPUTING, 2024, 585
  • [29] MindMap: Knowledge Graph Prompting Sparks Graph of Thoughts in Large Language Models
    Wen, Yilin
    Wang, Zifeng
    Sun, Jimeng
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 10370 - 10388
  • [30] Leveraging Non-Parametric Reasoning With Large Language Models for Enhanced Knowledge Graph Completion
    Zhang, Ying
    Shen, Yangpeng
    Xiao, Gang
    Peng, Jinghui
    IEEE ACCESS, 2024, 12 : 177012 - 177027