Learning Context-based Embeddings for Knowledge Graph Completion

Cited: 0
Authors
Fei Pu
Zhongwei Zhang
Yan Feng
Bailin Yang
Institution
[1] School of Computer and Information Engineering, Zhejiang Gongshang University
Keywords
DOI
None available
CLC Number
TP18 [Artificial Intelligence Theory]; O157.5 [Graph Theory];
Subject Classification
081104; 0812; 0835; 1405;
Abstract
Purpose: Due to the incomplete nature of knowledge graphs (KGs), the task of predicting missing links between entities becomes important. Many previous approaches are static, which poses a notable problem: all meanings of a polysemous entity share one embedding vector. This study aims to propose a polysemous embedding approach, named KG embedding under relational contexts (ContE for short), for missing link prediction.
Design/methodology/approach: ContE models and infers different relationship patterns by considering the context of the relationship, which is implicit in the local neighborhood of the relationship. The forward and backward impacts of the relationship in ContE are mapped to two different embedding vectors, which represent the contextual information of the relationship. Then, according to the position of the entity, the entity's polysemous representation is obtained by adding its static embedding vector to the corresponding context vector of the relationship.
Findings: ContE is fully expressive; that is, given any ground truth over the triples, there are embedding assignments to entities and relations that can precisely separate the true triples from the false ones. ContE is capable of modeling four connectivity patterns: symmetry, antisymmetry, inversion and composition.
Research limitations: In practice, ContE needs a grid search to find the best parameters for the best performance, which is a time-consuming task. It sometimes requires longer entity vectors than some other models to achieve better performance.
Practical implications: ContE is a bilinear model, which is simple enough to be applied to large-scale KGs. By considering the contexts of relations, ContE can distinguish the exact meaning of an entity in different triples, so that when performing compositional reasoning, it is capable of inferring the connectivity patterns of relations and achieves good performance on link prediction tasks.
Originality/value: ContE considers the contexts of entities in terms of their positions in triples and the relationships they link to. It decomposes a relation vector into two vectors, namely a forward impact vector and a backward impact vector, in order to capture the relational contexts. ContE has the same low computational complexity as TransE. Therefore, it provides a new approach for contextualized knowledge graph embedding.
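The position-dependent representation described above can be sketched in a few lines. This is an illustrative reconstruction, not the paper's exact formulation: the variable names, the assignment of the forward-impact vector to the head and the backward-impact vector to the tail, and the DistMult-style trilinear product used as the bilinear score are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # toy embedding dimension

# Static entity embeddings (illustrative random values).
entity = {"Paris": rng.normal(size=dim), "France": rng.normal(size=dim)}

# Each relation carries a core vector plus two context vectors:
# one for its forward impact and one for its backward impact.
relation = {
    "capital_of": {
        "core": rng.normal(size=dim),
        "forward": rng.normal(size=dim),   # context added to the head entity
        "backward": rng.normal(size=dim),  # context added to the tail entity
    }
}

def conte_score(h, r, t):
    """Score a triple (h, r, t) using context-adjusted entity vectors."""
    rel = relation[r]
    # Polysemous representations: static embedding + positional context vector.
    h_ctx = entity[h] + rel["forward"]
    t_ctx = entity[t] + rel["backward"]
    # Assumed bilinear score: element-wise trilinear product (DistMult-style).
    return float(np.sum(h_ctx * rel["core"] * t_ctx))

print(conte_score("Paris", "capital_of", "France"))
```

Because the head and tail receive different context vectors, swapping the two entities generally changes the score, which is what lets a model of this shape represent antisymmetric relations.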
Pages: 84 - 106
Page count: 23
Related Papers
50 records in total
  • [41] Inductive Learning on Commonsense Knowledge Graph Completion
    Wang, Bin
    Wang, Guangtao
    Huang, Jing
    You, Jiaxuan
    Leskovec, Jure
    Kuo, C-C Jay
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [42] GRL: Knowledge graph completion with GAN-based reinforcement learning
    Wang, Qi
    Ji, Yuede
    Hao, Yongsheng
    Cao, Jie
    KNOWLEDGE-BASED SYSTEMS, 2020, 209
  • [43] KICGPT: Large Language Model with Knowledge in Context for Knowledge Graph Completion
    Wei, Yanbin
    Huang, Qiushi
    Zhang, Yu
    Kwok, James T.
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023, : 8667 - 8683
  • [44] Entity-Context and Relation-Context Combined Knowledge Graph Embeddings
    Wu, Yong
    Li, Wei
    Fan, Xiaoming
    Wang, Binjun
    ARABIAN JOURNAL FOR SCIENCE AND ENGINEERING, 2022, 47 (02) : 1471 - 1482
  • [46] Iteratively Learning Embeddings and Rules for Knowledge Graph Reasoning
    Zhang, Wen
    Paudel, Bibek
    Wang, Liang
    Chen, Jiaoyan
    Zhu, Hai
    Zhang, Wei
    Bernstein, Abraham
    Chen, Huajun
    WEB CONFERENCE 2019: PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE (WWW 2019), 2019, : 2366 - 2377
  • [47] Learning in context: enhancing machine learning with context-based reasoning
    Stein, Gary
    Gonzalez, Avelino J.
    APPLIED INTELLIGENCE, 2014, 41 (03) : 709 - 724
  • [49] Fast and Accurate Learning of Knowledge Graph Embeddings at Scale
    Gupta, Udit
    Vadhiyar, Sathish
    2019 IEEE 26TH INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING, DATA, AND ANALYTICS (HIPC), 2019, : 173 - 182
  • [50] Learning Knowledge Graph Embeddings via Generalized Hyperplanes
    Zhu, Qiannan
    Zhou, Xiaofei
    Tan, JianLong
    Liu, Ping
    Guo, Li
    COMPUTATIONAL SCIENCE - ICCS 2018, PT I, 2018, 10860 : 624 - 638