Explainable reasoning over temporal knowledge graphs by pre-trained language model

Cited: 0
Authors
Li, Qing [1 ]
Wu, Guanzhong [1 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian 710000, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Knowledge graph reasoning; Pre-trained language model; Logical reasoning; Temporal knowledge graph; Multi-hop paths; Graph representation learning;
DOI
10.1016/j.ipm.2024.103903
CLC Classification
TP [Automation technology, computer technology];
Discipline Code
0812;
Abstract
Temporal knowledge graph reasoning (TKGR) is a crucial task for modeling evolving knowledge, aiming to infer unknown connections between entities at specific times. Traditional TKGR methods aggregate structural information between entities and evolve entity representations across distinct snapshots, while other methods extract temporal logic rules from historical interactions. However, these methods fail to handle the unseen entities that continuously emerge over time, and they ignore the historical dependencies between entities and relations. To overcome these limitations, we propose a novel method, termed TPNet, which introduces a historical information completion strategy (HICS) and a pre-trained language model (PLM) to conduct explainable inductive reasoning over TKGs. Specifically, TPNet extracts reliable temporal logical paths from historical subgraphs using a temporal-correlated search strategy. For unseen entities, we use HICS to sample or generate paths that supplement their historical information. In addition, a PLM and a time-aware encoder are introduced to jointly encode the temporal paths, comprehensively capturing the dependencies between entities and relations. Moreover, the semantic similarity between the query quadruple and the extracted paths is evaluated to simultaneously optimize the representations of entities and relations. Extensive experiments on entity and relation prediction tasks evaluate the performance of TPNet. Results on four benchmark datasets demonstrate the superiority of TPNet over state-of-the-art TKGR methods, with MRR improvements of 14.35%, 23.08%, 6.75%, and 5.38%, respectively.
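To make the encode-and-score step of the abstract concrete, the following is a minimal, hypothetical Python/PyTorch sketch: a temporal logical path and a query quadruple are embedded with a time-aware encoder, and the path is scored by semantic similarity to the query. This is not the authors' code. The paper's actual PLM and time-aware encoder are stood in for by a small Transformer plus a sinusoidal time encoding, and all names here (TimeAwarePathEncoder, path_score, sinusoidal_time_encoding) are illustrative assumptions.

```python
# Hedged sketch of the idea in the abstract, NOT the released TPNet code:
# encode a temporal path and a query quadruple with a shared time-aware
# encoder, then rank the path by cosine similarity to the query.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

def sinusoidal_time_encoding(timestamps: torch.Tensor, dim: int) -> torch.Tensor:
    """Map integer timestamps to sinusoidal features (Transformer-PE style);
    the paper's actual time-aware encoding may differ (assumption)."""
    half = dim // 2
    freqs = torch.exp(-math.log(10000.0)
                      * torch.arange(half, device=timestamps.device) / half)
    args = timestamps.float().unsqueeze(-1) * freqs            # (L, dim/2)
    return torch.cat([torch.sin(args), torch.cos(args)], dim=-1)  # (L, dim)

class TimeAwarePathEncoder(nn.Module):
    """Stand-in for 'PLM + time-aware encoder': a small Transformer over
    entity/relation token embeddings, each shifted by its time encoding."""
    def __init__(self, vocab_size: int, dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, token_ids: torch.Tensor, timestamps: torch.Tensor) -> torch.Tensor:
        x = self.embed(token_ids) + sinusoidal_time_encoding(
            timestamps, self.embed.embedding_dim)
        h = self.encoder(x.unsqueeze(0)).squeeze(0)            # (L, dim)
        return h.mean(dim=0)                                   # pooled embedding

def path_score(encoder: TimeAwarePathEncoder,
               query_ids: torch.Tensor, query_ts: torch.Tensor,
               path_ids: torch.Tensor, path_ts: torch.Tensor) -> torch.Tensor:
    """Semantic similarity between a query quadruple and one candidate path."""
    q = encoder(query_ids, query_ts)
    p = encoder(path_ids, path_ts)
    return F.cosine_similarity(q, p, dim=0)

# Toy usage: tokens 0..9 stand for entity/relation symbols; timestamps are days.
enc = TimeAwarePathEncoder(vocab_size=10)
query = (torch.tensor([1, 4, 2]), torch.tensor([30, 30, 30]))     # (s, r, ?) at t=30
path = (torch.tensor([1, 5, 3, 6, 2]), torch.tensor([10, 10, 20, 20, 25]))
print(path_score(enc, *query, *path).item())
```

In a training loop, a margin or contrastive loss over such similarity scores would jointly optimize entity and relation representations, which is consistent with (though not a definitive implementation of) the optimization described in the abstract.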
Pages: 15