Integration of Semantic and Topological Structural Similarity Comparison for Entity Alignment without Pre-Training

Times Cited: 0
Authors
Liu, Yao [1 ]
Liu, Ye [2 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci & Technol, Beijing 100191, Peoples R China
[2] Beihang Univ, Sch Instrumentat & Optoelect Engn, Beijing 100191, Peoples R China
Keywords
entity alignment; knowledge graph; description information; topological structure;
DOI
10.3390/electronics13112036
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Entity alignment (EA) is a critical task in integrating diverse knowledge graph (KG) data and plays a central role in data-driven AI applications. Traditional EA approaches rely on entity embeddings, but their effectiveness is limited by scarce KG input data and by the representation learning techniques they depend on. Large language models have shown promise but face challenges such as high hardware requirements, large model sizes, and computational inefficiency, which limit their applicability. To overcome these limitations, we propose an entity-alignment model that compares the similarity between entities by capturing both semantic and topological information, enabling the alignment of entities with high similarity. First, we analyze descriptive information to quantify semantic similarity, including individual features such as types and attributes. Then, for topological analysis, we introduce four conditions based on graph connectivity and structural patterns to determine subgraph similarity within three hops of an entity's neighborhood, thereby improving accuracy. Finally, we integrate semantic and topological similarity using a weighted approach that accounts for dataset features. Our model requires no pre-training and is designed to be compact and generalizable across datasets. Experimental results on four standard EA datasets validate the effectiveness of our proposed model.
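The abstract's overall scheme can be sketched in a few lines: a semantic score from descriptive features (types, attributes), a topological score from k-hop neighborhood overlap, and a weighted combination of the two. This is a minimal illustrative sketch, not the paper's actual method: the use of Jaccard overlap for both scores, the function names, and the weight `alpha` are all assumptions.

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard overlap of two feature sets (assumed similarity measure)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def k_hop_neighbors(graph: dict, start, k: int = 3) -> set:
    """Entities reachable within k hops of `start`, via BFS over an
    adjacency dict; the abstract considers a three-hop neighborhood."""
    frontier, seen = {start}, {start}
    for _ in range(k):
        frontier = {n for node in frontier for n in graph.get(node, ())} - seen
        seen |= frontier
    return seen - {start}

def combined_similarity(feats1: set, feats2: set,
                        nbrs1: set, nbrs2: set,
                        alpha: float = 0.5) -> float:
    """Weighted integration of semantic and topological similarity;
    `alpha` stands in for the dataset-dependent weighting."""
    return alpha * jaccard(feats1, feats2) + (1 - alpha) * jaccard(nbrs1, nbrs2)
```

In this toy form, two entities whose type/attribute sets overlap heavily and whose three-hop neighborhoods share many members receive a score near 1 and would be aligned; the paper's four connectivity conditions refine the topological term beyond plain set overlap.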
Pages: 10
Related Papers
50 records
  • [1] A Semantic Textual Similarity Calculation Model Based on Pre-training Model
    Ding, Zhaoyun
    Liu, Kai
    Wang, Wenhao
    Liu, Bin
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2021, PT II, 2021, 12816 : 3 - 15
  • [2] STRUCTURAL ALIGNMENT IN COMPARISON - NO DIFFERENCE WITHOUT SIMILARITY
    GENTNER, D
    MARKMAN, AB
    PSYCHOLOGICAL SCIENCE, 1994, 5 (03) : 152 - 158
  • [3] EntityLayout: Entity-Level Pre-training Language Model for Semantic Entity Recognition and Relation Extraction
    Xu, Chun-Bo
    Chen, Yi-Ming
    Liu, Cheng-Lin
    DOCUMENT ANALYSIS AND RECOGNITION-ICDAR 2024, PT I, 2024, 14804 : 262 - 279
  • [4] Cross-modal Semantic Alignment Pre-training for Vision-and-Language Navigation
    Wu, Siying
    Fu, Xueyang
    Wu, Feng
    Zha, Zheng-Jun
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 4233 - 4241
  • [5] Contrastive Pre-training with Multi-level Alignment for Grounded Multimodal Named Entity Recognition
    Bao, Xigang
    Tian, Mengyuan
    Wang, Luyao
    Zha, Zhiyuan
    Qin, Biao
    PROCEEDINGS OF THE 4TH ANNUAL ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2024, 2024, : 795 - 803
  • [6] Pre-Training Without Natural Images
    Kataoka, Hirokatsu
    Okayasu, Kazushige
    Matsumoto, Asato
    Yamagata, Eisuke
    Yamada, Ryosuke
    Inoue, Nakamasa
    Nakamura, Akio
    Satoh, Yutaka
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2022, 130 (04) : 990 - 1007
  • [7] Entity Enhanced BERT Pre-training for Chinese NER
    Jia, Chen
    Shi, Yuefeng
    Yang, Qinrong
    Zhang, Yue
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 6384 - 6396
  • [8] Towards Generalizable Semantic Product Search by Text Similarity Pre-training on Search Click Logs
    Liu, Zheng
    Zhang, Wei
    Chen, Yan
    Sun, Weiyi
    Du, Michael
    Schroeder, Benjamin
    PROCEEDINGS OF THE 5TH WORKSHOP ON E-COMMERCE AND NLP (ECNLP 5), 2022, : 224 - 233
  • [9] Structural Pre-training for Dialogue Comprehension
    Zhang, Zhuosheng
    Zhao, Hai
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021, : 5134 - 5145
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021, : 5134 - 5145