Simple Knowledge Graph Completion Model Based on Differential Negative Sampling and Prompt Learning

Cited by: 0
Authors
Duan, Li [1 ]
Wang, Jing [1 ]
Luo, Bing [1 ]
Sun, Qiao [1 ,2 ]
Affiliations
[1] Naval Univ Engn, Coll Elect Engn, Wuhan 430033, Peoples R China
[2] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
Keywords
natural language processing; knowledge graph completion; prompt learning; positive unlabeled learning
DOI
10.3390/info14080450
CLC Classification Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Knowledge graphs (KGs) serve as a crucial resource for numerous artificial intelligence tasks, contributing significantly to the advancement of the AI field. However, the incompleteness of existing KGs hinders their effectiveness in practical applications, which has motivated the task of KG completion. Currently, embedding-based techniques dominate the field, as they leverage the structural information within KGs to infer missing parts. Nonetheless, these methods have limitations: they are constrained by the quality and quantity of available structural information and cannot handle entities absent from the original KG. To overcome these challenges, researchers have attempted to integrate pretrained language models and textual data into KG completion. This approach exploits the definition statements and descriptive text of entities within KGs, aiming to recover latent connections that traditional methods struggle to capture. However, text-based methods still lag behind embedding-based models in performance. Our analysis reveals that the critical issue lies in the selection of negative samples. To enhance the performance of text-based methods, this study employs several types of negative sampling. We introduce prompt learning to bridge the gap between the pretrained language model and the KG completion task and to improve the model's reasoning ability. In addition, we propose a ranking strategy based on KG structural information, which uses the structured data of the KG to assist reasoning. Experimental results demonstrate that our model is strongly competitive and achieves outstanding inference speed. By fully exploiting the internal structural information of KGs and external descriptive text resources, we improve the performance of KG completion across various metrics.
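The abstract does not give the paper's exact sampling procedure, but the standard "filtered" negative-sampling setting it builds on can be sketched as follows: corrupt the head or tail of a true triple with a random entity and discard any corruption that is itself a known true triple. All names below (`corrupt_negatives`, the toy entities and triples) are illustrative assumptions, not the authors' code.

```python
import random

def corrupt_negatives(triple, entities, true_triples, k=4, seed=0):
    """Generate k negative triples by replacing the head or the tail
    with a random entity, skipping corruptions that are themselves
    known true triples (the standard 'filtered' setting)."""
    rng = random.Random(seed)
    h, r, t = triple
    negatives = []
    while len(negatives) < k:
        e = rng.choice(entities)
        # Corrupt the head or the tail with equal probability.
        cand = (e, r, t) if rng.random() < 0.5 else (h, r, e)
        if cand != triple and cand not in true_triples:
            negatives.append(cand)
    return negatives

# Toy knowledge graph for illustration only.
entities = ["Paris", "France", "Berlin", "Germany", "Lyon"]
true_triples = {("Paris", "capital_of", "France"),
                ("Berlin", "capital_of", "Germany")}
negs = corrupt_negatives(("Paris", "capital_of", "France"),
                         entities, true_triples, k=3)
```

In a text-based completion model such as the one described here, each negative triple would then be verbalized (e.g. via a prompt template) and scored by the language model alongside the positive triple during training.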
Pages: 15
Related Papers
50 records in total
  • [31] Inductive Learning on Commonsense Knowledge Graph Completion
    Wang, Bin
    Wang, Guangtao
    Huang, Jing
    You, Jiaxuan
    Leskovec, Jure
    Kuo, C-C Jay
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [32] KGBoost: A classification-based knowledge base completion method with negative sampling
    Wang, Yun-Cheng
    Ge, Xiou
    Wang, Bin
    Kuo, C-C Jay
    PATTERN RECOGNITION LETTERS, 2022, 157 : 104 - 111
  • [33] GRL: Knowledge graph completion with GAN-based reinforcement learning
    Wang, Qi
    Ji, Yuede
    Hao, Yongsheng
    Cao, Jie
    KNOWLEDGE-BASED SYSTEMS, 2020, 209
  • [34] Reinforced Negative Sampling for Knowledge Graph Embedding
    Xie, Yushun
    Wang, Haiyan
    Wang, Le
    Luo, Lei
    Lie, Jianxin
    Gu, Zhaoquan
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2024, PT IV, 2024, 14853 : 358 - 374
  • [35] Entity Similarity-Based Negative Sampling for Knowledge Graph Embedding
    Yao, Naimeng
    Liu, Qing
    Li, Xiang
    Yang, Yi
    Bai, Quan
    PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT II, 2022, 13630 : 73 - 87
  • [36] Research on Knowledge Graph Completion Based upon Knowledge Graph Embedding
    Feng, Tuoyu
    Wu, Yongsheng
    Li, Libing
    2024 9TH INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATION SYSTEMS, ICCCS 2024, 2024, : 1335 - 1342
  • [37] Graph-based deep learning model for knowledge base completion in constraint management of construction projects
    Wu, Chengke
    Li, Xiao
    Jiang, Rui
    Guo, Yuanjun
    Wang, Jun
    Yang, Zhile
    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, 2023, 38 (06) : 702 - 719
  • [38] A deep embedding model for knowledge graph completion based on attention mechanism
    Huang, Jin
    Zhang, TingHua
    Zhu, Jia
    Yu, Weihao
    Tang, Yong
    He, Yang
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (15): : 9751 - 9760
  • [39] Knowledge graph completion model based on hyperbolic hierarchical attention network
    Luo, Jiaohuang
    Song, Changlong
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (09) : 3893 - 3909
  • [40] A Dynamic Convolutional Network-Based Model for Knowledge Graph Completion
    Peng, Haoliang
    Wu, Yue
    INFORMATION, 2022, 13 (03)