Fine-tuning BERT for Joint Entity and Relation Extraction in Chinese Medical Text

Cited: 0
Authors
Xue, Kui [1 ]
Zhou, Yangming [1 ]
Ma, Zhiyuan [1 ]
Ruan, Tong [1 ]
Zhang, Huanhuan [1 ]
He, Ping [2 ]
Affiliations
[1] East China Univ Sci & Technol, Sch Informat Sci & Engn, Shanghai 200237, Peoples R China
[2] Shanghai Hosp Dev Ctr, Shanghai 200041, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China
Keywords
Named entity recognition; Relation classification; Joint model; BERT language model; Electronic health records;
DOI
Not available
Chinese Library Classification
Q5 [Biochemistry]
Subject Classification Codes
071010; 081704
Abstract
Entity and relation extraction is a necessary step in structuring medical text. However, the feature extraction ability of the bidirectional long short-term memory (BiLSTM) networks used in existing models falls short of the best achievable performance. Meanwhile, pre-trained language models have achieved excellent results on a growing range of natural language processing tasks. In this paper, we present a focused attention model for the joint entity and relation extraction task. Our model integrates the well-known BERT language model into joint learning through a dynamic range attention mechanism, thus improving the feature representation ability of the shared parameter layer. Experimental results on coronary angiography texts collected from Shuguang Hospital show that the F1-scores of the named entity recognition and relation classification tasks reach 96.89% and 88.51%, outperforming state-of-the-art methods by 1.65% and 1.22%, respectively.
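The abstract describes a joint model in which a single BERT encoder (the shared parameter layer) feeds both a named entity recognition head and a relation classification head. The sketch below is a minimal PyTorch/Transformers illustration of that shared-encoder idea, not the authors' implementation: the class name, label counts, and pooling choices are hypothetical, and the paper's dynamic range attention mechanism is omitted.

```python
# Minimal sketch of a shared-BERT joint extraction model (hypothetical names;
# the paper's dynamic range attention mechanism is NOT reproduced here).
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

NUM_ENTITY_TAGS = 9      # assumed BIO tag-set size, for illustration only
NUM_RELATION_TYPES = 5   # assumed relation label count, for illustration only

class JointBertExtractor(nn.Module):
    def __init__(self, model_name="bert-base-chinese"):
        super().__init__()
        # Shared parameter layer: one BERT encoder serves both tasks.
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.ner_head = nn.Linear(hidden, NUM_ENTITY_TAGS)     # token-level tags
        self.rel_head = nn.Linear(hidden, NUM_RELATION_TYPES)  # sentence-level relation

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        ner_logits = self.ner_head(out.last_hidden_state)  # (batch, seq_len, tags)
        rel_logits = self.rel_head(out.pooler_output)      # (batch, relations)
        return ner_logits, rel_logits

# Usage: tokenize a Chinese clinical sentence and run both heads jointly.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = JointBertExtractor()
batch = tokenizer(["左冠状动脉前降支近段狭窄"], return_tensors="pt")
ner_logits, rel_logits = model(batch["input_ids"], batch["attention_mask"])
print(ner_logits.shape, rel_logits.shape)  # (1, seq_len, 9) and (1, 5)
```

Fine-tuning such a model backpropagates the sum of both task losses through the shared encoder, which is what allows the two tasks to improve each other's feature representations.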
Pages: 892-897
Page count: 6
Related Papers
50 records in total
  • [21] Compressing BERT for Binary Text Classification via Adaptive Truncation before Fine-Tuning
    Zhang, Xin
    Fan, Jing
    Hei, Mengzhe
    APPLIED SCIENCES-BASEL, 2022, 12 (23):
  • [22] Fine-Tuning of Distil-BERT for Continual Learning in Text Classification: An Experimental Analysis
    Shah, Sahar
    Manzoni, Sara Lucia
    Zaman, Farooq
    Es Sabery, Fatima
    Epifania, Francesco
    Zoppis, Italo Francesco
    IEEE ACCESS, 2024, 12 : 104964 - 104982
  • [23] Extreme Fine-tuning: A Novel and Fast Fine-tuning Approach for Text Classification
    Jiaramaneepinit, Boonnithi
    Chay-intr, Thodsaporn
    Funakoshi, Kotaro
    Okumura, Manabu
    PROCEEDINGS OF THE 18TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 2: SHORT PAPERS, 2024: 368 - 379
  • [24] A BERT Fine-tuning Model for Targeted Sentiment Analysis of Chinese Online Course Reviews
    Zhang, Huibing
    Dong, Junchao
    Min, Liang
    Bi, Peng
    INTERNATIONAL JOURNAL ON ARTIFICIAL INTELLIGENCE TOOLS, 2020, 29 (7-8)
  • [25] Dataset Distillation with Attention Labels for Fine-tuning BERT
    Maekawa, Aru
    Kobayashi, Naoki
    Funakoshi, Kotaro
    Okumura, Manabu
    61ST CONFERENCE OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 2, 2023: 119 - 127
  • [26] Fine-Tuning BERT for Generative Dialogue Domain Adaptation
    Labruna, Tiziano
    Magnini, Bernardo
    TEXT, SPEECH, AND DIALOGUE (TSD 2022), 2022, 13502 : 513 - 524
  • [27] Patent classification by fine-tuning BERT language model
    Lee, Jieh-Sheng
    Hsiang, Jieh
    WORLD PATENT INFORMATION, 2020, 61
  • [28] Noise Stability Regularization for Improving BERT Fine-tuning
    Hua, Hang
    Li, Xingjian
    Dou, Dejing
    Xu, Chengzhong
    Luo, Jiebo
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021: 3229 - 3241
  • [29] A Closer Look at How Fine-tuning Changes BERT
    Zhou, Yichu
    Srikumar, Vivek
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1 (LONG PAPERS), 2022: 1046 - 1061
  • [30] IsoBN: Fine-Tuning BERT with Isotropic Batch Normalization
    Zhou, Wenxuan
    Lin, Bill Yuchen
    Ren, Xiang
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14621 - 14629