Sequential lexicon enhanced bidirectional encoder representations from transformers: Chinese named entity recognition using sequential lexicon enhanced BERT

Cited by: 0
|
Authors
Liu, Xin [1 ]
Zhao, Jiashan [2 ]
Yao, Junping [1 ]
Zheng, Hao [1 ]
Wang, Zhong [1 ]
Affiliations
[1] Xian Res Inst High Tech, Dept Basic, Xian, Shaanxi, Peoples R China
[2] Changan Univ, Dept Informat & Network Management, Xian, Shaanxi, Peoples R China
Keywords
Chinese NER; Lexical enhancement; BERT; Adaptive attention
DOI
10.7717/peerj-cs.2344
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Lexicon Enhanced Bidirectional Encoder Representations from Transformers (LEBERT) has achieved great success in Chinese Named Entity Recognition (NER). LEBERT performs lexical enhancement with a Lexicon Adapter layer, which fuses deep lexicon knowledge into the lower layers of BERT. However, this method is likely to introduce noise words, and it does not account for possible conflicts between words when fusing lexicon information. To address these issues, we propose a novel lexical enhancement method, Sequential Lexicon Enhanced BERT (SLEBERT), for Chinese NER, which builds a sequential lexicon to reduce noise words and resolve lexical conflicts. Compared with LEBERT, it leverages the position encoding of the sequential lexicon and an adaptive attention mechanism over the sequential lexicon to enhance lexicon features. Experiments on four public datasets show that SLEBERT outperforms other lexical enhancement models in both performance and efficiency.
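To make the abstract's mechanism concrete, below is a minimal sketch (in PyTorch) of how matched-word embeddings, tagged with sequential-lexicon position encodings, could be fused into BERT's character hidden states through an adaptive attention over word candidates. All names here (AdaptiveLexiconFusion, word_proj, pos_embed) and the exact scoring function are illustrative assumptions, not the authors' released implementation.

# Illustrative sketch only: adaptive-attention fusion of matched-word
# embeddings (with lexicon position encodings) into character hidden states.
import torch
import torch.nn as nn


class AdaptiveLexiconFusion(nn.Module):
    """Fuse BERT character hidden states with matched-word embeddings.

    Shapes (B = batch, L = sequence length, N = word candidates per character):
      char_hidden : (B, L, d_char)     character-level hidden states
      word_embeds : (B, L, N, d_word)  embeddings of words matched at each
                                       character position (zero-padded)
      word_pos    : (B, L, N)          integer positions of each word in the
                                       sequential lexicon (0 for padding)
      word_mask   : (B, L, N)          1 for real candidates, 0 for padding
    """

    def __init__(self, d_char: int, d_word: int, max_lex_pos: int = 512):
        super().__init__()
        self.word_proj = nn.Linear(d_word, d_char)          # align dimensions
        self.pos_embed = nn.Embedding(max_lex_pos, d_char)  # lexicon positions
        self.layer_norm = nn.LayerNorm(d_char)

    def forward(self, char_hidden, word_embeds, word_pos, word_mask):
        # Project candidate words and add their sequential-lexicon positions.
        words = torch.tanh(self.word_proj(word_embeds)) + self.pos_embed(word_pos)
        # Adaptive attention: score every candidate against its character state.
        scores = torch.einsum('bld,blnd->bln', char_hidden, words)
        scores = scores.masked_fill(word_mask == 0, float('-inf'))
        weights = torch.softmax(scores, dim=-1)
        # Characters with no matched words yield all-NaN rows; zero them out.
        weights = torch.nan_to_num(weights, nan=0.0)
        word_ctx = torch.einsum('bln,blnd->bld', weights, words)
        # Residual fusion back into the character representation.
        return self.layer_norm(char_hidden + word_ctx)


# Smoke test with random tensors: 2 sentences, 10 characters, 4 candidates.
fusion = AdaptiveLexiconFusion(d_char=768, d_word=200)
out = fusion(torch.randn(2, 10, 768),
             torch.randn(2, 10, 4, 200),
             torch.randint(0, 512, (2, 10, 4)),
             torch.ones(2, 10, 4))
print(out.shape)  # torch.Size([2, 10, 768])

In this sketch, the softmax weights let each character attend more to compatible candidate words and suppress noisy ones, which is one plausible reading of how an adaptive attention mechanism could reduce noise words and resolve lexical conflicts.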
Pages: 18
Related Papers
50 records in total
  • [41] A Chinese named entity recognition method for small-scale dataset based on lexicon and unlabeled data
    Huang, Shaobin
    Sha, Yongpeng
    Li, Rongsheng
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (02) : 2185 - 2206
  • [42] Research on Named Entity Recognition in Ancient Chinese Based on Incremental Pre-training and Domain Lexicon
    Kang, Wenjun
    Zuo, Jiali
    Dai, Qili
    Hu, Yiyu
    Wang, Mingwen
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, PT I, NLPCC 2024, 2025, 15359 : 483 - 503
  • [43] NABP-BERT: NANOBODY®-antigen binding prediction based on bidirectional encoder representations from transformers (BERT) architecture
    Ahmed, Fatma S.
    Aly, Saleh
    Liu, Xiangrong
    BRIEFINGS IN BIOINFORMATICS, 2024, 26 (01)
  • [44] Enhanced Cascading Recognition with Positional Labels for Chinese Medicine Named Entity
    Wang, Xuyang
    Zhao, Lijie
    Zhang, Jiyuan
    COMPUTER ENGINEERING AND APPLICATIONS, 2024, 60 (02) : 121 - 128
  • [45] Improving case duration accuracy of orthopedic surgery using bidirectional encoder representations from Transformers (BERT) on Radiology Reports
    Zhong, William
    Yao, Phil Y.
    Boppana, Sri Harsha
    Pacheco, Fernanda V.
    Alexander, Brenton S.
    Simpson, Sierra
    Gabriel, Rodney A.
    JOURNAL OF CLINICAL MONITORING AND COMPUTING, 2024, 38 (01) : 221 - 228
  • [46] A Comparative Sentiment Analysis of Airline Customer Reviews Using Bidirectional Encoder Representations from Transformers (BERT) and Its Variants
    Li, Zehong
    Yang, Chuyang
    Huang, Chenyu
    MATHEMATICS, 2024, 12 (01)
  • [48] MalBERT: Malware Detection using Bidirectional Encoder Representations from Transformers
    Rahali, Abir
    Akhloufi, Moulay A.
    2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2021, : 3226 - 3231
  • [49] Cross-domain sentiment classification using decoding-enhanced bidirectional encoder representations from transformers with disentangled attention
    Singh, Rahul Kumar
    Sachan, Manoj Kumar
    Patel, Ram Bahadur
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (06)
  • [50] exBAKE: Automatic Fake News Detection Model Based on Bidirectional Encoder Representations from Transformers (BERT)
    Jwa, Heejung
    Oh, Dongsuk
    Park, Kinam
    Kang, Jang Mook
    Lim, Heuiseok
    APPLIED SCIENCES-BASEL, 2019, 9 (19)