JointLK: Joint Reasoning with Language Models and Knowledge Graphs for Commonsense Question Answering

Times Cited: 0
Authors
Sun, Yueqing [1 ]
Shi, Qi [1 ]
Qi, Le [1 ]
Zhang, Yu [1 ]
Affiliation
[1] Harbin Inst Technol, Res Ctr Social Comp & Informat Retrieval, Harbin, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Existing KG-augmented models for commonsense question answering primarily focus on designing elaborate Graph Neural Networks (GNNs) to model knowledge graphs (KGs). However, they ignore (i) effectively fusing and reasoning over the question context representations and the KG representations, and (ii) automatically selecting relevant nodes from noisy KGs during reasoning. In this paper, we propose a novel model, JointLK, which addresses these limitations through joint reasoning between the LM and the GNN and a dynamic KG pruning mechanism. Specifically, JointLK performs joint reasoning between the LM and the GNN through a novel dense bidirectional attention module, in which each question token attends to KG nodes and each KG node attends to question tokens, and the representations of the two modalities are fused and updated mutually through multi-step interactions. A dynamic pruning module then uses the attention weights produced by joint reasoning to recursively prune irrelevant KG nodes. We evaluate JointLK on the CommonsenseQA and OpenBookQA datasets and demonstrate its improvements over existing LM and LM+KG models, as well as its ability to perform interpretable reasoning.
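As a rough illustration of the mechanism described in the abstract (a minimal sketch, not the authors' released implementation), the PyTorch-style snippet below runs a few joint-reasoning steps: dense bidirectional attention in which question tokens attend to KG nodes and KG nodes attend to question tokens, followed by pruning of low-relevance nodes using the resulting attention weights. All shapes, layer choices, and the keep_ratio value are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BidirectionalFusion(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)   # projects question-token states
        self.n_proj = nn.Linear(dim, dim)   # projects KG-node states
        self.scale = dim ** -0.5

    def forward(self, tokens, nodes):
        # tokens: (T, dim) question-token representations from the LM
        # nodes:  (N, dim) KG-node representations from the GNN
        scores = (self.q_proj(tokens) @ self.n_proj(nodes).T) * self.scale  # (T, N)
        tok2node = F.softmax(scores, dim=-1)    # each token attends to KG nodes
        node2tok = F.softmax(scores.T, dim=-1)  # each node attends to question tokens
        tokens = tokens + tok2node @ nodes      # fuse KG evidence into token states
        nodes = nodes + node2tok @ tokens       # fuse question context into node states
        node_relevance = tok2node.sum(dim=0)    # total attention each node receives
        return tokens, nodes, node_relevance

def prune_nodes(nodes, node_relevance, keep_ratio=0.75):
    # Keep the top keep_ratio fraction of nodes ranked by attention-derived relevance.
    k = max(1, int(nodes.size(0) * keep_ratio))
    keep_idx = node_relevance.topk(k).indices
    return nodes[keep_idx]

# Toy usage: several joint-reasoning steps, pruning the subgraph after each one.
fusion = BidirectionalFusion(dim=64)
tokens, nodes = torch.randn(20, 64), torch.randn(50, 64)
for _ in range(3):                               # multi-step interactions
    tokens, nodes, relevance = fusion(tokens, nodes)
    nodes = prune_nodes(nodes, relevance)        # recursively drop low-relevance nodes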
Pages: 5049 - 5060
Page count: 12
Related Papers
50 records in total
  • [1] QA-GNN: Reasoning with Language Models and Knowledge Graphs for Question Answering
    Yasunaga, Michihiro
    Ren, Hongyu
    Bosselut, Antoine
    Liang, Percy
    Leskovec, Jure
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 535 - 546
  • [2] Dynamic Heterogeneous-Graph Reasoning with Language Models and Knowledge Representation Learning for Commonsense Question Answering
    Wang, Yujie
    Zhang, Hu
    Liang, Jiye
    Li, Ru
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023): LONG PAPERS, VOL 1, 2023, : 14048 - 14063
  • [3] ReLMKG: reasoning with pre-trained language models and knowledge graphs for complex question answering
    Cao, Xing
    Liu, Yun
    APPLIED INTELLIGENCE, 2023, 53 (10) : 12032 - 12046
  • [4] Incorporating Domain Knowledge and Semantic Information into Language Models for Commonsense Question Answering
    Zhou, Ruiying
    Tian, Keke
    Lai, Hanjiang
    Yin, Jian
    PROCEEDINGS OF THE 2021 IEEE 24TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN (CSCWD), 2021, : 1160 - 1165
  • [5] Efficient Question Answering Based on Language Models and Knowledge Graphs
    Li, Fengying
    Huang, Hongfei
    Dong, Rongsheng
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT IV, 2023, 14257 : 340 - 351
  • [6] Meta-path reasoning of knowledge graph for commonsense question answering
    Zhang, Miao
    He, Tingting
    Dong, Ming
    FRONTIERS OF COMPUTER SCIENCE, 2024, 18 (01)
  • [7] Retrieval-Augmented Knowledge Graph Reasoning for Commonsense Question Answering
    Sha, Yuchen
    Feng, Yujian
    He, Miao
    Liu, Shangdong
    Ji, Yimu
    MATHEMATICS, 2023, 11 (15)
  • [8] Language-based reasoning graph neural network for commonsense question answering
    Yang, Meng
    Wang, Yihao
    Gu, Yu
    NEURAL NETWORKS, 2025, 181
  • [9] A medical question answering system using large language models and knowledge graphs
    Guo, Quan
    Cao, Shuai
    Yi, Zhang
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37 (11) : 8548 - 8564