EARR: Using rules to enhance the embedding of knowledge graph

Cited by: 5
Authors
Li, Jin [1]
Xiang, Jinpeng [1]
Cheng, Jianhua [2]
Affiliations
[1] Harbin Engn Univ, Coll Comp Sci & Technol, Harbin, Peoples R China
[2] Harbin Engn Univ, Coll Intelligent Syst Sci & Engn, Harbin, Peoples R China
Keywords
Knowledge graph; Knowledge graph embedding; Rule extraction; Rule enhanced knowledge graph embedding;
DOI
10.1016/j.eswa.2023.120831
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge graphs have been receiving increasing attention from researchers. However, most of these graphs are incomplete, which has made knowledge graph completion a prominent task. The goal of knowledge graph completion is to find missing relations in a knowledge graph. Knowledge graph embedding represents entities and relations in a low-dimensional embedding space, simplifying operations and allowing integration with knowledge graph completion tasks. Several popular embedding models, such as TransE, TransH, TransR, TuckER, and RotatE, have achieved impressive results on knowledge graph completion, but most of them do not incorporate background knowledge that could enhance the quality of the embeddings. Logic rules are adaptable and scalable and can enrich background knowledge, and separating attributes from entities can improve the relevance of relations and the accuracy of logic rule extraction. We therefore propose a novel method, Entity-Attribute-Relation-Rule (EARR), which separates attributes from entities and uses logic rules to extend the dataset, improving the accuracy of knowledge graph completion. We define six rules in total: Rules 1-3, 5, and 6 apply to entities, and Rule 4 applies to entities and attributes. We evaluate our method on the task of link prediction through two kinds of experiments. In the basic experiment, we compare our method with three benchmark models: TransE, TransH, and TransR. In the experiment on datasets of different sizes, FB24K and CoDEx, we evaluate our method with different models, including TransE, TuckER, and RotatE. The experimental results indicate that EARR can improve the quality of knowledge graph embedding.
Pages: 14
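
To make the rule-based augmentation described in the abstract concrete, the following is a minimal, hypothetical Python sketch of the general idea: a simple composition rule of the form (h, r1, t) and (t, r2, e) implies (h, r_new, e) is applied to a triple set before embedding training. The function name apply_composition_rule, the relation names, and the rule form are invented for illustration and do not reproduce the paper's actual Rule 1-6 definitions or its attribute-separation step.

# Minimal sketch (hypothetical): augment a triple set with a composition rule
# before training an embedding model. Rule form and names are illustrative
# and are not the EARR paper's exact Rule 1-6 definitions.

def apply_composition_rule(triples, r1, r2, r_new):
    """If (h, r1, t) and (t, r2, e) both hold, infer the new triple (h, r_new, e)."""
    # Index tails of r2 by their head entity for fast lookup.
    r2_tails = {}
    for h, r, t in triples:
        if r == r2:
            r2_tails.setdefault(h, []).append(t)
    inferred = set()
    for h, r, t in triples:
        if r == r1:
            for e in r2_tails.get(t, []):
                inferred.add((h, r_new, e))
    return inferred

# Toy example: born_in(x, city) and located_in(city, country) imply a
# hypothetical born_in_country(x, country) triple.
triples = {
    ("alan_turing", "born_in", "london"),
    ("london", "located_in", "united_kingdom"),
}
augmented = triples | apply_composition_rule(triples, "born_in", "located_in", "born_in_country")
for triple in sorted(augmented):
    print(triple)

In this sketch the inferred triples are simply unioned with the original ones, so any of the embedding models named in the abstract (TransE, TuckER, RotatE, and so on) could be trained on the extended triple set without modification.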