CIST: Differentiating Concepts and Instances Based on Spatial Transformation for Knowledge Graph Embedding

Cited by: 5
Authors
Zhang, Pengfei [1 ]
Chen, Dong [1 ]
Fang, Yang [1 ]
Zhao, Xiang [2 ]
Xiao, Weidong [1 ]
Institutions
[1] Natl Univ Def Technol, Sci & Technol Informat Syst Engn Lab, Changsha 410073, Peoples R China
[2] Natl Univ Def Technol, Lab Big Data & Decis, Changsha 410073, Peoples R China
Keywords
knowledge graph; knowledge graph embedding; concepts and instances
DOI
10.3390/math10173161
CLC number
O1 [Mathematics]
Subject classification code
0701; 070101
Abstract
Knowledge representation learning represents the entities and relations of a knowledge graph as dense, low-dimensional vectors in a continuous space, capturing the features and properties of the graph. This technique facilitates computation and reasoning over knowledge graphs, which benefits many downstream tasks. To alleviate the insufficient entity representation learning caused by sparse knowledge graphs, some researchers have proposed knowledge graph embedding models based on instances and concepts, which exploit the latent semantic connections between the concepts and instances contained in a knowledge graph to enhance the embedding. However, these models either embed instances and concepts in the same space or ignore the transitivity of isA relations, leading to inaccurate embeddings of concepts and instances. To address these shortcomings, we propose CIST, a knowledge graph embedding model that differentiates concepts and instances based on spatial transformation. The model alleviates the clustering of similar instances or concepts in the semantic space by modeling them in different embedding spaces, and it adds a learnable parameter that adjusts the neighboring range of each concept embedding to distinguish the hierarchical information of different concepts, thereby modeling the transitivity of isA relations. These features of instances and concepts serve as auxiliary information, so modeling them thoroughly alleviates the insufficient entity representation learning. For the experiments, we chose two tasks, link prediction and triple classification, and two real-life datasets, YAGO26K-906 and DB111K-174. Compared with the state of the art, CIST achieves optimal performance in most cases. Specifically, CIST outperforms the SOTA model JOIE by 51.1% on Hits@1 in link prediction and by 15.2% on F1 score in triple classification.
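The abstract's mechanism can be illustrated with a minimal sketch. Here we assume, hypothetically, that a learned linear map projects an instance vector into a separate concept space, where each concept is a region with a learnable "neighboring range" (a radius); an instanceOf triple is plausible when the projected instance falls inside the concept's region, and isA transitivity follows from nesting of regions. The function names, the linear form of the transformation, and the exact scores are illustrative assumptions, not the authors' published formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
d_i, d_c = 8, 6                      # instance-space / concept-space dims

M = rng.normal(size=(d_c, d_i))      # spatial transformation (hypothetical: linear)
instance = rng.normal(size=d_i)      # instance embedding
center = rng.normal(size=d_c)        # concept center in the concept space
radius = 1.5                         # learnable neighboring range of the concept

def instance_of_score(i, c, r, M):
    """Negative when the projected instance lies inside the concept's
    neighboring range (triple plausible), positive when outside."""
    return np.linalg.norm(M @ i - c) - r

def sub_class_of_score(c1, r1, c2, r2):
    """<= 0 when concept 1's region nests inside concept 2's region;
    nesting is transitive, which geometrically encodes isA transitivity."""
    return np.linalg.norm(c1 - c2) + r1 - r2

s = instance_of_score(instance, center, radius, M)
```

Because instances and concepts live in different spaces, similar instances need not crowd around their shared concept vector, and adjusting each concept's range separates broad concepts (large radius) from narrow ones (small radius).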
Pages: 16
Related Papers
50 items in total
  • [1] Differentiating Concepts and Instances for Knowledge Graph Embedding
    Lv, Xin
    Hou, Lei
    Li, Juanzi
    Liu, Zhiyuan
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 1971 - 1979
  • [2] JECI: A Joint Knowledge Graph Embedding Model for Concepts and Instances
    Zhou, Jing
    Wang, Peng
    Pan, Zhe
    Xu, Zhongkai
    SEMANTIC TECHNOLOGY, JIST 2019: PROCEEDINGS, 2020, 12032 : 82 - 98
  • [3] JECI++: A Modified Joint Knowledge Graph Embedding Model for Concepts and Instances
    Wang, Peng
    Zhou, Jing
    BIG DATA RESEARCH, 2021, 24
  • [4] Knowledge graph embedding with concepts
    Guan, Niannian
    Song, Dandan
    Liao, Lejian
    KNOWLEDGE-BASED SYSTEMS, 2019, 164 : 38 - 44
  • [5] Affine Transformation-Based Knowledge Graph Embedding
    Jiang, Jiahao
    Pu, Fei
    Cui, Jie
    Yang, Bailin
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, KSEM 2024, 2024, 14884 : 284 - 297
  • [6] Knowledge graph embedding by reflection transformation
    Zhang, Qianjin
    Wang, Ronggui
    Yang, Juan
    Xue, Lixia
    KNOWLEDGE-BASED SYSTEMS, 2022, 238
  • [7] LorenTzE: Temporal Knowledge Graph Embedding Based on Lorentz Transformation
    Li, Ningyuan
    Haihong, E.
    Shi, Li
    Lin, Xueyuan
    Song, Meina
    Li, Yuhan
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT VI, 2023, 14259 : 472 - 484
  • [8] Embedding Hierarchical Tree Structure of Concepts in Knowledge Graph Embedding
    Yu, Jibin
    Zhang, Chunhong
    Hu, Zheng
    Ji, Yang
    ELECTRONICS, 2024, 13 (22)
  • [9] Knowledge Graph Embedding Based on Semantic Hierarchical Spatial Rotation
    Yin, Liangcheng
    Zhu, Jie
    Hou, Enshuai
    Ni, Ma
    2021 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2021, : 19 - 24
  • [10] Knowledge Graph Embedding Based on Quaternion Transformation and Convolutional Neural Network
    Gao, Yabin
    Tian, Xiaoyun
    Zhou, Jing
    Zheng, Bin
    Li, Hairu
    Zhu, Zizhong
    ADVANCED DATA MINING AND APPLICATIONS, ADMA 2021, PT II, 2022, 13088 : 128 - 136