A Weighted Flat Lattice Transformer-based Knowledge Extraction Architecture for Chinese Named Entity Recognition

Cited: 0
Authors
Zhang, Hengwei [1 ]
Wu, Yuejia [1 ]
Zhou, Jian-tao [1 ]
Institutions
[1] Inner Mongolia Univ, Inner Mongolia Engn Lab Cloud Comp & Serv Softwar, Natl & Local Joint Engn Res Ctr Intelligent Infor, Minist Educ, Mongolian Engn Res Ctr Ecol Big Data, Hohhot, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Knowledge Graph; Knowledge Extraction; Chinese Named Entity Recognition; Flat Lattice Framework; Transformer Architecture;
DOI
10.1109/CSCWD61410.2024.10580614
Chinese Library Classification
TP39 [Computer Applications];
Discipline Codes
081203 ; 0835 ;
Abstract
Named Entity Recognition (NER) is a core task of Knowledge Extraction (KE), which transforms data into knowledge representations. However, Chinese NER lacks explicit word boundaries, which limits the effectiveness of KE. Although the flat lattice Transformer (FLAT) framework, which converts the lattice structure into a flat structure consisting of a set of spans, largely mitigates this problem and obtains strong results, it remains insensitive to entity importance weights and suffers from insufficient feature learning. This paper proposes a weighted flat lattice Transformer architecture for Chinese NER, named WFLAT. WFLAT first adds a weight matrix to the self-attention calculation, which yields a finer-grained partitioning of entities and improves performance, and then adopts a multi-layer Transformer encoder in which each layer uses a multi-head self-attention mechanism. Extensive experiments on benchmarks demonstrate that the proposed KE model achieves state-of-the-art performance on the Chinese NER task.
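The abstract's core modification is a weight matrix inserted into the self-attention calculation. The paper's exact formulation is not given in this record, so the following is only a minimal single-head sketch under the assumption that the importance weights are applied element-wise to the scaled dot-product scores before the softmax; all names (`weighted_self_attention`, `weight`) are illustrative, not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def weighted_self_attention(X, Wq, Wk, Wv, weight):
    """Single-head self-attention with an (n, n) importance-weight
    matrix multiplied into the attention scores before softmax.

    X:      (n, d) input token/span representations
    weight: (n, n) pairwise importance weights (assumption: element-wise)
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = (Q @ K.T) / np.sqrt(d)   # standard scaled dot-product scores
    scores = scores * weight          # re-weight pairs by entity importance
    return softmax(scores) @ V
```

With `weight` set to all ones this reduces to plain scaled dot-product attention; in a multi-head, multi-layer encoder the same weighting would be applied per head.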
Pages: 193-198 (6 pages)
Related Papers (50 total)
  • [31] T-NER: An All-Round Python Library for Transformer-based Named Entity Recognition
    Ushio, Asahi
    Camacho-Collados, Jose
    EACL 2021: THE 16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: PROCEEDINGS OF THE SYSTEM DEMONSTRATIONS, 2021, : 53 - 62
  • [32] Named Entity Recognition of Power Substation Knowledge Based on Transformer-BiLSTM-CRF Network
    Yang, Q. Y.
    Jiang, J.
    Feng, X. Y.
    He, J. M.
    Chen, B. R.
    Zhang, Z. Y.
    2020 INTERNATIONAL CONFERENCE ON SMART GRIDS AND ENERGY SYSTEMS (SGES 2020), 2020, : 952 - 956
  • [33] Chinese Named Entity Recognition and Disambiguation Based on Wikipedia
    Yu Miao
    Lv Yajuan
    Liu Qun
    Su Jinsong
    Xiong Hao
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, 2012, 333 : 272 - 283
  • [34] Knowledge-based Named Entity Recognition in Polish
    Pohl, Aleksander
    2013 FEDERATED CONFERENCE ON COMPUTER SCIENCE AND INFORMATION SYSTEMS (FEDCSIS), 2013, : 145 - 151
  • [35] Research on Chinese Named Entity Recognition Based on Ontology
    Chang, Weili
    Luo, Fang
    Qian, Jilai
    MECHANICAL ENGINEERING AND INTELLIGENT SYSTEMS, PTS 1 AND 2, 2012, 195-196 : 1180 - 1185
  • [36] Chinese Chemical Named Entity Recognition Based on Morpheme
    Wang, Guirong
    Xia, Bo
    Xiao, Ye
    Rao, Gaoqi
    Xun, Endong
    2020 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP 2020), 2020, : 247 - 252
  • [37] Chinese named entity recognition model based on BERT
    Liu, Hongshuai
    Jun, Ge
    Zheng, Yuanyuan
    2020 2ND INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE COMMUNICATION AND NETWORK SECURITY (CSCNS2020), 2021, 336
  • [38] Chinese Named Entity Recognition Using the Improved Transformer Encoder and the Lexicon Adapter
    Sun, Mingjie
    Wang, Lisong
    Sheng, Tianye
    He, Zongfeng
    Huang, Yuhua
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2022, PT II, 2022, 13530 : 197 - 208
  • [39] Chinese Fine-Grained Geological Named Entity Recognition With Rules and FLAT
    Chen, Siying
    Hua, Weihua
    Liu, Xiuguo
    Deng, Xiaotong
    Zeng, Xinling
    Duan, Jianchao
    EARTH AND SPACE SCIENCE, 2022, 9 (12)
  • [40] Named Entity Recognition and Event Extraction in Chinese Electronic Medical Records
    Ma, Cheng
    Huang, Wenkang
    CCKS 2021 - EVALUATION TRACK, 2022, 1553 : 133 - 138