Interactive Contrastive Learning for Self-Supervised Entity Alignment

Cited by: 16
Authors
Zeng, Kaisheng [1 ]
Dong, Zhenhao [2 ]
Hou, Lei [3 ]
Cao, Yixin [4 ]
Hu, Minghao [5 ]
Yu, Jifan [1 ]
Lv, Xin [1 ]
Cao, Lei [1 ]
Wang, Xin [1 ]
Liu, Haozhuang [1 ]
Huang, Yi [6 ]
Feng, Junlan [6 ]
Wan, Jing [2 ]
Li, Juanzi [7 ]
Feng, Ling [7 ]
Affiliations
[1] Tsinghua Univ, Beijing, Peoples R China
[2] Beijing Univ Chem Technol, Beijing, Peoples R China
[3] Tsinghua Univ, BNRist, Dept Comp Sci & Technol, Beijing, Peoples R China
[4] Singapore Management Univ, Singapore, Singapore
[5] Informat Res Ctr Mil Sci, Beijing, Peoples R China
[6] China Mobile Res Inst, Beijing, Peoples R China
[7] Tsinghua Univ, BNRist, CST, Beijing, Peoples R China
Keywords
Knowledge Graph; Entity Alignment; Self-Supervised Learning; Contrastive Learning;
DOI
10.1145/3511808.3557364
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Self-supervised entity alignment (EA) aims to link equivalent entities across different knowledge graphs (KGs) without using pre-aligned entity pairs. The current state-of-the-art (SOTA) self-supervised EA approach draws inspiration from contrastive learning, originally designed in computer vision around instance discrimination and contrastive loss, and suffers from two shortcomings. First, it places unidirectional emphasis on pushing sampled negative entities far apart rather than pulling positively aligned pairs close, as is done in well-established supervised EA methods. Second, it advocates a minimum-information requirement for self-supervised EA, whereas we argue that a KG's self-describing side information (e.g., entity names, relation names, entity descriptions) should preferably be exploited to the maximum extent for the self-supervised EA task. In this work, we propose an interactive contrastive learning model for self-supervised EA. It conducts bidirectional contrastive learning by building pseudo-aligned entity pairs as pivots to achieve direct cross-KG information interaction. It further exploits the integration of entity textual and structural information and carefully designs encoders to better utilize this information in the self-supervised setting. Experimental results show that our approach outperforms the previous best self-supervised method by a large margin (over 9% absolute improvement in Hits@1 on average) and performs on par with previous SOTA supervised counterparts, demonstrating the effectiveness of interactive contrastive learning for self-supervised EA. The code and data are available at https://github.com/THU-KEG/ICLEA.
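As a rough illustration of the bidirectional contrastive objective sketched in the abstract, the Python (PyTorch) snippet below computes a symmetric InfoNCE-style loss over a batch of pseudo-aligned entity pairs: each pair is pulled together from both KG1-to-KG2 and KG2-to-KG1 directions, while the other cross-KG entities in the batch act as negatives. The function name, temperature value, and in-batch negative sampling are illustrative assumptions, not necessarily ICLEA's exact formulation or encoders.

import torch
import torch.nn.functional as F

def bidirectional_contrastive_loss(z1, z2, temperature=0.08):
    # z1, z2: (N, d) embeddings of N pseudo-aligned entities from KG1 and KG2;
    # row i of z1 is treated as the positive of row i of z2, and all other
    # rows of the opposite KG in the batch serve as negatives.
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                  # (N, N) cross-KG similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    loss_12 = F.cross_entropy(logits, labels)           # KG1 -> KG2 direction
    loss_21 = F.cross_entropy(logits.t(), labels)       # KG2 -> KG1 direction
    return 0.5 * (loss_12 + loss_21)                    # pull pairs close, push others apart

# Toy usage: 4 pseudo-aligned entity pairs with 16-dim embeddings.
e1 = torch.randn(4, 16)
e2 = torch.randn(4, 16)
print(bidirectional_contrastive_loss(e1, e2).item())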
Pages: 2465-2475
Number of pages: 11
Related Papers
50 records in total
  • [31] CONTRASTIVE SELF-SUPERVISED LEARNING FOR WIRELESS POWER CONTROL
    Naderializadeh, Navid
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 4965 - 4969
  • [32] Contrastive Self-Supervised Learning for Skeleton Action Recognition
    Gao, Xuehao
    Yang, Yang
    Du, Shaoyi
    NEURIPS 2020 WORKSHOP ON PRE-REGISTRATION IN MACHINE LEARNING, VOL 148, 2020, 148 : 51 - 61
  • [33] Malicious Traffic Identification with Self-Supervised Contrastive Learning
    Yang, Jin
    Jiang, Xinyun
    Liang, Gang
    Li, Siyu
    Ma, Zicheng
    SENSORS, 2023, 23 (16)
  • [34] Self-Supervised Learning on Graphs: Contrastive, Generative, or Predictive
    Wu, Lirong
    Lin, Haitao
    Tan, Cheng
    Gao, Zhangyang
    Li, Stan Z.
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (04) : 4216 - 4235
  • [35] Contrastive Self-Supervised Learning: A Survey on Different Architectures
    Khan, Adnan
    AlBarri, Sarah
    Manzoor, Muhammad Arslan
    PROCEEDINGS OF 2ND IEEE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE (ICAI 2022), 2022, : 1 - 6
  • [36] Self-Supervised Contrastive Learning for Unsupervised Phoneme Segmentation
    Kreuk, Felix
    Keshet, Joseph
    Adi, Yossi
    INTERSPEECH 2020, 2020, : 3700 - 3704
  • [37] Self-supervised contrastive learning for implicit collaborative filtering
    Song, Shipeng
    Liu, Bin
    Teng, Fei
    Li, Tianrui
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 139
  • [38] Memory Bank Clustering for Self-supervised Contrastive Learning
    Hao, Yiqing
    An, Gaoyun
    Ruan, Qiuqi
    IMAGE AND GRAPHICS TECHNOLOGIES AND APPLICATIONS, IGTA 2021, 2021, 1480 : 132 - 144
  • [39] Contrastive Self-supervised Learning in Recommender Systems: A Survey
    Jing, Mengyuan
    Zhu, Yanmin
    Zang, Tianzi
    Wang, Ke
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2024, 42 (02)
  • [40] Self-supervised contrastive representation learning for semantic segmentation
    Liu B.
    Cai H.
    Wang Y.
    Chen X.
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2024, 51 (01): : 125 - 134