Deep Entity Linking via Eliminating Semantic Ambiguity With BERT

Cited by: 18
Authors
Yin, Xiaoyao [1 ]
Huang, Yangchen [1 ]
Zhou, Bin [1 ]
Li, Aiping [1 ]
Lan, Long [1 ,2 ]
Jia, Yan [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Comp, Changsha 410073, Peoples R China
[2] Natl Univ Def Technol, State Key Lab High Performance Comp, Changsha 410073, Peoples R China
Source
IEEE ACCESS | 2019, Vol. 7
Funding
National Natural Science Foundation of China
Keywords
Task analysis; Knowledge based systems; Semantics; Joining processes; Bit error rate; Natural languages; Computational modeling; Entity linking; natural language processing (NLP); bidirectional encoder representations from transformers (BERT); deep neural network (DNN);
DOI
10.1109/ACCESS.2019.2955498
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Entity linking is the task of aligning mentions of entities in text to their corresponding entries in a specific knowledge base, which is of great significance for many natural language processing applications such as semantic text understanding and knowledge fusion. The crux of this problem is how to make effective use of contextual information to disambiguate mentions. Moreover, it has been observed that, in most cases, a mention has a similar or even identical surface string to the entity it refers to. To prevent the model from linking mentions to entities with similar strings rather than to the semantically similar ones, in this paper we introduce the advanced language representation model BERT (Bidirectional Encoder Representations from Transformers) and design a hard negative samples mining strategy to fine-tune it accordingly. Based on the learned features, we obtain the valid entity by computing the similarity between the textual clues of a mention and the candidate entities in the knowledge base. The proposed hard negative samples mining strategy lets entity linking benefit from the larger, more expressive pre-trained representations of BERT with limited training time and computing resources. To the best of our knowledge, we are the first to equip the entity linking task with a powerful pre-trained general language model by deliberately tackling its potential shortcoming of learning literal string similarity, and experiments on standard benchmark datasets show that the proposed model yields state-of-the-art results.
Pages: 169434-169445
Page count: 12
Related Papers
(50 records total)
  • [41] Entity Linking via Symmetrical Attention-Based Neural Network and Entity Structural Features
    Hu, Shengze
    Tan, Zhen
    Zeng, Weixin
    Ge, Bin
    Xiao, Weidong
    SYMMETRY-BASEL, 2019, 11 (04):
  • [42] Semantic Search via Entity-Types: The SEMANNOREX Framework
    Kumar, Amit
    Govind
    Spaniol, Marc
    WEB CONFERENCE 2021: COMPANION OF THE WORLD WIDE WEB CONFERENCE (WWW 2021), 2021, : 690 - 694
  • [43] An Entity Linking Algorithm Derived from Graph Convolutional Network and Contextualized Semantic Relevance
    Jia, Bingjing
    Wang, Chenglong
    Zhao, Haiyan
    Shi, Lei
    SYMMETRY-BASEL, 2022, 14 (10):
  • [44] Reducing semantic ambiguity in domain adaptive semantic segmentation via probabilistic prototypical pixel contrast
    Hao, Xiaoke
    Liu, Shiyu
    Feng, Chuanbo
    Zhu, Ye
    NEURAL NETWORKS, 2025, 181
  • [45] NeuPL: Attention-based Semantic Matching and Pair-Linking for Entity Disambiguation
    Phan, Minh C.
    Sun, Aixin
    Tay, Yi
    Han, Jialong
    Li, Chenliang
    CIKM'17: PROCEEDINGS OF THE 2017 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2017, : 1667 - 1676
  • [46] Adaptive Genetic Programming Based Linkage Rule Miner For Entity Linking In Semantic Web
    Singh, Amit
    Sharan, Aditi
    2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTING, COMMUNICATION AND AUTOMATION (ICCCA), 2017, : 373 - 378
  • [47] Classical Arabic Named Entity Recognition Using Variant Deep Neural Network Architectures and BERT
    Alsaaran, Norah
    Alrabiah, Maha
    IEEE ACCESS, 2021, 9 : 91537 - 91547
  • [48] Collective entity linking via greedy search and Monte Carlo calculation
    Chen, Lei
    Wu, Chong
    INTERNATIONAL JOURNAL OF COMPUTATIONAL SCIENCE AND ENGINEERING, 2019, 20 (01) : 59 - 68
  • [49] Entity Linking via Explicit Mention-Mention Coreference Modeling
    Agarwal, Dhruv
    Angell, Rico
    Monath, Nicholas
    McCallum, Andrew
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 4644 - 4658
  • [50] Semantic ambiguity and the failure of inhibition hypothesis as an explanation for reading errors in deep dyslexia
    Colangelo, A
    Buchanan, L
    BRAIN AND COGNITION, 2005, 57 (01) : 39 - 42