Explainable reasoning over temporal knowledge graphs by pre-trained language model

Cited: 0
Authors
Li, Qing [1 ]
Wu, Guanzhong [1 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian 710000, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Knowledge graph reasoning; Pre-trained language model; Logical reasoning; Temporal knowledge graph; Multi-hop paths; Graph representation learning
DOI
10.1016/j.ipm.2024.103903
CLC classification
TP [Automation and computer technology]
Subject classification code
0812
Abstract
Temporal knowledge graph reasoning (TKGR) is a crucial task for modeling evolving knowledge, aiming to infer unknown connections between entities at specific times. Traditional TKGR methods aggregate structural information between entities and evolve entity representations across distinct snapshots, while other methods extract temporal logic rules from historical interactions. However, these methods fail to handle the unseen entities that continually emerge over time, and they ignore the historical dependencies between entities and relations. To overcome these limitations, we propose a novel method, termed TPNet, which introduces a historical information completion strategy (HICS) and a pre-trained language model (PLM) to conduct explainable inductive reasoning over TKGs. Specifically, TPNet extracts reliable temporal logical paths from historical subgraphs using a temporal-correlated search strategy. For unseen entities, HICS samples or generates paths to supplement their historical information. In addition, the PLM and a time-aware encoder jointly encode the temporal paths, comprehensively capturing the dependencies between entities and relations. Moreover, the semantic similarity between the query quadruples and the extracted paths is evaluated to simultaneously optimize the representations of entities and relations. Extensive experiments on entity and relation prediction tasks evaluate the performance of TPNet. Results on four benchmark datasets demonstrate the superiority of TPNet over state-of-the-art TKGR methods, with MRR improvements of 14.35%, 23.08%, 6.75%, and 5.38% on the respective datasets.
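The abstract describes a pipeline of verbalizing temporal paths, jointly encoding them with a PLM and a time-aware encoder, and scoring them against the query quadruple. Below is a minimal illustrative sketch of that idea, not the authors' implementation: it assumes a generic Hugging Face BERT checkpoint, a sinusoidal time encoding, and cosine similarity as the scoring function; the verbalization format and the helper names (`time_encoding`, `encode`) are hypothetical.

```python
# Minimal sketch (not the authors' code) of PLM + time-aware path scoring.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
plm = AutoModel.from_pretrained("bert-base-uncased")

def time_encoding(t: int, dim: int = 768) -> torch.Tensor:
    # Sinusoidal encoding of the integer timestamp: one plausible choice
    # for a "time-aware encoder"; dim matches BERT's hidden size.
    i = torch.arange(dim // 2, dtype=torch.float)
    freq = 1.0 / (10000 ** (2 * i / dim))
    return torch.cat([torch.sin(t * freq), torch.cos(t * freq)])

def encode(text: str, t: int) -> torch.Tensor:
    # [CLS] embedding of the verbalized path/query plus its time encoding.
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        cls = plm(**inputs).last_hidden_state[:, 0, :]  # shape (1, 768)
    return cls + time_encoding(t)

# A verbalized 2-hop temporal path and a query quadruple (made-up facts).
path = "Obama ; visit ; France ; t=7 ; France ; negotiate ; Germany ; t=8"
query = "Obama ; negotiate ; Germany ; t=9"

# Semantic similarity between query and extracted path, mirroring the
# scoring step the abstract describes.
score = F.cosine_similarity(encode(path, 8), encode(query, 9))
print(f"path-query similarity: {score.item():.4f}")
```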
Pages: 15
Related papers
50 records in total
  • [1] ReLMKG: reasoning with pre-trained language models and knowledge graphs for complex question answering
    Cao, Xing
    Liu, Yun
    Applied Intelligence, 2023, 53 (10): 12032-12046
  • [2] Pre-trained Language Model with Prompts for Temporal Knowledge Graph Completion
    Xu, Wenjie
    Liu, Ben
    Peng, Miao
    Jia, Xu
    Peng, Min
    Findings of the Association for Computational Linguistics (ACL 2023), 2023: 7790-7803
  • [3] Commonsense Knowledge Reasoning and Generation with Pre-trained Language Models: A Survey
    Bhargava, Prajjwal
    Ng, Vincent
    Thirty-Sixth AAAI Conference on Artificial Intelligence / Thirty-Fourth Conference on Innovative Applications of Artificial Intelligence / Twelfth Symposium on Educational Advances in Artificial Intelligence, 2022: 12317-12325
  • [4] Knowledge Enhanced Pre-trained Language Model for Product Summarization
    Yin, Wenbo
    Ren, Junxiang
    Wu, Yuejiao
    Song, Ruilin
    Liu, Lang
    Cheng, Zhen
    Wang, Sibo
    Natural Language Processing and Chinese Computing, NLPCC 2022, Pt II, 2022, 13552: 263-273
  • [5] Explainable Reasoning over Knowledge Graphs for Recommendation
    Wang, Xiang
    Wang, Dingxian
    Xu, Canran
    He, Xiangnan
    Cao, Yixin
    Chua, Tat-Seng
    Thirty-Third AAAI Conference on Artificial Intelligence / Thirty-First Innovative Applications of Artificial Intelligence Conference / Ninth AAAI Symposium on Educational Advances in Artificial Intelligence, 2019: 5329-5336
  • [6] Leveraging Pre-trained Language Models for Time Interval Prediction in Text-Enhanced Temporal Knowledge Graphs
    Islakoglu, Duygu Sezen
    Chekol, Melisachew Wudage
    Velegrakis, Yannis
    Semantic Web, Pt I, ESWC 2024, 2024, 14664: 59-78
  • [7] ReasoningLM: Enabling Structural Subgraph Reasoning in Pre-trained Language Models for Question Answering over Knowledge Graph
    Jiang, Jinhao
    Zhou, Kun
    Zhao, Wayne Xin
    Li, Yaliang
    Wen, Ji-Rong
    2023 Conference on Empirical Methods in Natural Language Processing, EMNLP 2023, 2023: 3721-3735
  • [8] Knowledge Rumination for Pre-trained Language Models
    Yao, Yunzhi
    Wang, Peng
    Mao, Shengyu
    Tan, Chuanqi
    Huang, Fei
    Chen, Huajun
    Zhang, Ningyu
    2023 Conference on Empirical Methods in Natural Language Processing, EMNLP 2023, 2023: 3387-3404