KBioXLM: A Knowledge-anchored Biomedical Multilingual Pretrained Language Model

Cited by: 0
Authors
Geng, Lei [1 ]
Yan, Xu [1 ]
Cao, Ziqiang [1 ]
Li, Juntao [1 ]
Li, Wenjie [3 ]
Li, Sujian [2 ]
Zhou, Xinjie [4 ]
Yang, Yang [4 ]
Zhang, Jun [5 ]
Affiliations
[1] Soochow Univ, Inst Artificial Intelligence, Suzhou, Peoples R China
[2] Peking Univ, Beijing, Peoples R China
[3] Hong Kong Polytech Univ, Hong Kong, Peoples R China
[4] Pharmcube, Beijing, Peoples R China
[5] Changping Lab, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
CORPUS;
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Most biomedical pretrained language models are monolingual and cannot handle the growing cross-lingual requirements. The scarcity of non-English domain corpora, not to mention parallel data, poses a significant hurdle in training multilingual biomedical models. Since knowledge forms the core of domain-specific corpora and can be translated into various languages accurately, we propose KBioXLM, a model that transforms the multilingual pretrained model XLM-R into the biomedical domain using a knowledge-anchored approach. We construct a biomedical multilingual corpus by incorporating knowledge alignments at three granularities (entity, fact, and passage levels) into monolingual corpora. We then design three corresponding training tasks (entity masking, relation masking, and passage relation prediction) and continue training on top of XLM-R to enhance its cross-lingual ability in the biomedical domain. To validate the effectiveness of our model, we translate the English benchmarks of multiple tasks into Chinese. Experimental results demonstrate that our model significantly outperforms monolingual and multilingual pretrained models in cross-lingual zero-shot and few-shot scenarios, achieving improvements of more than 10 points. Our code is publicly available at https://github.com/ngwlh-gl/KBioXLM.
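The entity-level task described in the abstract can be pictured with a short sketch. The code below is a minimal illustration, assuming the HuggingFace transformers library and the public xlm-roberta-base checkpoint; the code-switched sentence, the entity anchors, and the subword-matching heuristic are hypothetical stand-ins, not the authors' released pipeline (which additionally covers relation masking and passage relation prediction).

```python
# Minimal sketch of knowledge-anchored entity masking for continued
# pretraining of XLM-R, assuming the HuggingFace `transformers` library.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

# A code-switched sentence: the English entity is anchored to its Chinese
# translation, which supplies a cross-lingual alignment signal.
text = "Aspirin, known as 阿司匹林 in Chinese, is used to reduce fever."
entities = ["Aspirin", "阿司匹林"]  # hypothetical aligned entity anchors

enc = tokenizer(text, return_tensors="pt")
input_ids = enc["input_ids"].clone()
labels = torch.full_like(input_ids, -100)  # -100 positions are ignored by the MLM loss

# Whole-entity masking: mask every subword inside a knowledge-anchored span.
# (Simple id-subsequence matching; a real pipeline would use character offsets.)
seq = input_ids[0].tolist()
for ent in entities:
    ent_ids = tokenizer(ent, add_special_tokens=False)["input_ids"]
    for start in range(len(seq) - len(ent_ids) + 1):
        if seq[start:start + len(ent_ids)] == ent_ids:
            span = slice(start, start + len(ent_ids))
            labels[0, span] = input_ids[0, span]
            input_ids[0, span] = tokenizer.mask_token_id

# One continued-pretraining step: the model must recover the masked entities.
out = model(input_ids=input_ids,
            attention_mask=enc["attention_mask"],
            labels=labels)
out.loss.backward()
print(f"entity-masking MLM loss: {out.loss.item():.3f}")
```

Masking whole entity spans rather than random subwords forces the model to rely on the surrounding (possibly other-language) context to recover domain terms, which is the cross-lingual signal the knowledge-anchored corpus is designed to provide.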
Pages: 11239-11250
Page count: 12