Simple Knowledge Graph Completion Model Based on Differential Negative Sampling and Prompt Learning

Cited by: 0
|
Authors
Duan, Li [1 ]
Wang, Jing [1 ]
Luo, Bing [1 ]
Sun, Qiao [1 ,2 ]
Affiliations
[1] Naval Univ Engn, Coll Elect Engn, Wuhan 430033, Peoples R China
[2] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
Keywords
natural language processing; knowledge graph completion; prompt learning; positive unlabeled learning;
DOI
10.3390/info14080450
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Subject Classification Code
0812;
Abstract
Knowledge graphs (KGs) serve as a crucial resource for numerous artificial intelligence tasks and contribute significantly to the advancement of the AI field. However, the incompleteness of existing KGs hinders their effectiveness in practical applications, which has motivated the task of KG completion. Currently, embedding-based techniques dominate the field, as they leverage the structural information within KGs to infer missing parts. Nonetheless, these methods have limitations: they are constrained by the quality and quantity of the available structural information and cannot handle entities missing from the original KG. To overcome these challenges, researchers have attempted to integrate pretrained language models and textual data into KG completion. This approach utilizes the definition statements and descriptive text of entities within KGs, aiming to recover latent connections that are difficult for traditional methods to obtain. However, text-based methods still lag behind embedding-based models in performance. Our analysis reveals that the critical issue lies in the selection of negative samples. To enhance the performance of text-based methods, this study employs several types of negative sampling. We introduce prompt learning to bridge the gap between the pretrained language model and the KG completion task and to improve the model's reasoning ability. In addition, a ranking strategy based on KG structural information is proposed so that structured KG data can assist reasoning. The experimental results demonstrate that our model is highly competitive and offers outstanding inference speed. By fully exploiting the internal structural information of KGs and external descriptive text resources, we improve the performance of KG completion across various metrics.
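The abstract combines two technical ingredients: prompt-style verbalization of triples so that a pretrained language model can score them, and negative sampling that treats unobserved triples as unlabeled rather than strictly false (positive-unlabeled learning). The minimal Python sketch below illustrates the general pattern only; the toy data, prompt wording, and helper names (verbalize_triple, sample_negatives) are assumptions for illustration and do not reflect the authors' implementation.

```python
import random

# Toy KG: known (head, relation, tail) triples and short entity descriptions.
# All data here is illustrative; the paper's datasets and prompts are not reproduced.
triples = {
    ("Q1", "located_in", "Q2"),
    ("Q3", "located_in", "Q2"),
}
descriptions = {
    "Q1": "a coastal city known for shipbuilding",
    "Q2": "a country in East Asia",
    "Q3": "an inland city with a large river port",
}
entities = list(descriptions)

def verbalize_triple(head, relation, tail):
    """Turn a triple into a cloze-style prompt that a masked language model could score."""
    return (f"{head}, {descriptions[head]}, "
            f"{relation.replace('_', ' ')} "
            f"{tail}, {descriptions[tail]}. Is this statement true? [MASK]")

def sample_negatives(head, relation, tail, k=2):
    """Corrupt the tail entity, skipping corruptions that are themselves known positives
    (a PU-style filter: unobserved triples are unlabeled, not guaranteed to be false)."""
    candidates = [e for e in entities
                  if e != tail and (head, relation, e) not in triples]
    return [(head, relation, e) for e in random.sample(candidates, min(k, len(candidates)))]

if __name__ == "__main__":
    pos = ("Q1", "located_in", "Q2")
    print("positive prompt:", verbalize_triple(*pos))
    for neg in sample_negatives(*pos):
        print("negative prompt:", verbalize_triple(*neg))
```

In a full pipeline, the positive and negative prompts would be fed to a pretrained language model and the resulting scores used both for training and for ranking candidate tails at inference time.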
Pages: 15