Knowledge Graph Enhanced Transformers for Diagnosis Generation of Chinese Medicine

Cited: 0
Authors
WANG Xin-yu [1]
YANG Tao [1,2,3]
GAO Xiao-yuan [1]
HU Kong-fa [1,3,4]
Affiliations
[1] School of Artificial Intelligence and Information Technology, Nanjing University of Chinese Medicine
[2] School of Information Management, Nanjing University
[3] Jiangsu Collaborative Innovation Center of Traditional Chinese Medicine in Prevention and Treatment of Tumor
[4] Jiangsu Province Engineering Research Center of TCM Intelligence Health Service
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC Number
TP391.1 [Text Information Processing]; R2-03 [Modernization Research of Traditional Chinese Medicine];
Subject Classification Number
081203; 0835; 100602;
Abstract
Chinese medicine (CM) diagnosis intellectualization is one of the hotspots in the research of CM modernization. Traditional CM intelligent diagnosis models transform the CM diagnosis problem into a classification problem; however, this approach struggles with issues such as excessive or highly similar categories. With the development of natural language processing, text generation techniques have become increasingly mature. In this study, we aimed to establish a CM diagnosis generation model by reformulating the CM diagnosis problem as a text generation problem. With Transformer as the backbone network, the capacity for learning semantic context characteristics was enhanced by incorporating Bidirectional Long Short-Term Memory (BiLSTM). Meanwhile, the CM diagnosis generation model Knowledge Graph Enhanced Transformer (KGET) was established by introducing medical domain knowledge to enhance inferential capability. The KGET model was built on 566 CM case texts and compared with classic text generation models, including Long Short-Term Memory sequence-to-sequence (LSTM-seq2seq), Bidirectional and Auto-Regressive Transformers (BART), and Chinese Pre-trained Unbalanced Transformer (CPT), to analyze model performance. Finally, ablation experiments were performed to explore the influence of each optimized component on the KGET model. The Bilingual Evaluation Understudy (BLEU), Recall-Oriented Understudy for Gisting Evaluation 1 (ROUGE1), ROUGE2 and edit distance scores of the KGET model were 45.85, 73.93, 54.59 and 7.12, respectively. Compared with the LSTM-seq2seq, BART and CPT models, the KGET model was higher in BLEU, ROUGE1 and ROUGE2 by 6.00–17.09, 1.65–9.39 and 0.51–17.62, respectively, and lower in edit distance by 0.47–3.21. The ablation results revealed that introducing the BiLSTM module and prior knowledge significantly improved model performance. Additionally, manual assessment indicated that the CM diagnoses generated by the KGET model were highly consistent with the practical diagnosis results. In conclusion, text generation technology can be effectively applied to CM diagnostic modeling and can avoid the poor diagnostic performance caused by excessive and similar categories in traditional CM diagnostic classification models. CM diagnostic text generation has broad application prospects.
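The abstract describes the KGET architecture only at a high level: a Transformer backbone, a BiLSTM layer for semantic context learning, and injected knowledge-graph information; no code accompanies this record. The following is a minimal PyTorch sketch of such a hybrid encoder, assuming additive fusion of token and knowledge-graph entity embeddings; the class name KGEnhancedEncoder, the fusion strategy and all hyperparameters are hypothetical illustrations, not the authors' implementation.

    # Minimal sketch (not the authors' code): a Transformer encoder whose token
    # representations are enriched with pre-computed knowledge-graph entity
    # embeddings and passed through a BiLSTM for additional sequential context.
    import torch
    import torch.nn as nn

    class KGEnhancedEncoder(nn.Module):          # hypothetical name
        def __init__(self, vocab_size, kg_vocab_size, d_model=256, nhead=8, num_layers=4):
            super().__init__()
            self.tok_emb = nn.Embedding(vocab_size, d_model)
            # Embeddings for knowledge-graph entities linked to the input symptoms.
            self.kg_emb = nn.Embedding(kg_vocab_size, d_model, padding_idx=0)
            layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.transformer = nn.TransformerEncoder(layer, num_layers)
            # BiLSTM adds left/right sequential context on top of self-attention.
            self.bilstm = nn.LSTM(d_model, d_model // 2, batch_first=True,
                                  bidirectional=True)

        def forward(self, token_ids, kg_ids):
            # Fuse token and knowledge embeddings by simple addition (an assumption).
            x = self.tok_emb(token_ids) + self.kg_emb(kg_ids)
            x = self.transformer(x)
            x, _ = self.bilstm(x)
            return x                             # (batch, seq_len, d_model)

    # Usage example with toy shapes.
    enc = KGEnhancedEncoder(vocab_size=5000, kg_vocab_size=300)
    tokens = torch.randint(1, 5000, (2, 32))     # batch of 2 case texts, 32 tokens each
    kg_links = torch.randint(0, 300, (2, 32))    # matching KG-entity ids (0 = none)
    print(enc(tokens, kg_links).shape)           # torch.Size([2, 32, 256])

In a full generation model, the encoder output would feed a Transformer decoder that produces the diagnosis text, which is then scored against reference diagnoses with BLEU, ROUGE and edit distance as reported in the abstract.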
Pages: 267 - 276
Number of pages: 10
Related Papers
50 records in total
  • [21] A Novel Chinese Traditional Medicine Prescription Recommendation System based on Knowledge Graph
    Wang, Yinghui
    2020 4TH INTERNATIONAL CONFERENCE ON CONTROL ENGINEERING AND ARTIFICIAL INTELLIGENCE (CCEAI 2020), 2020, 1487
  • [22] TCMKG: A Deep Learning Based Traditional Chinese Medicine Knowledge Graph Platform
    Zheng, Ziqiang
    Liu, Yongguo
    Zhang, Yun
    Wen, Chuanbiao
    11TH IEEE INTERNATIONAL CONFERENCE ON KNOWLEDGE GRAPH (ICKG 2020), 2020, : 560 - 564
  • [23] Question-Answering system based on the Knowledge Graph of Traditional Chinese Medicine
    Miao, Fang
    Wang, XueTing
    Zhang, Pu
    Jin, Libiao
    2019 11TH INTERNATIONAL CONFERENCE ON INTELLIGENT HUMAN-MACHINE SYSTEMS AND CYBERNETICS (IHMSC 2019), VOL 2, 2019, : 264 - 267
  • [24] TCKGE: Transformers with contrastive learning for knowledge graph embedding
    Xiaowei Zhang
    Quan Fang
    Jun Hu
    Shengsheng Qian
    Changsheng Xu
    International Journal of Multimedia Information Retrieval, 2022, 11 : 589 - 597
  • [25] TCKGE: Transformers with contrastive learning for knowledge graph embedding
    Zhang, Xiaowei
    Fang, Quan
    Hu, Jun
    Qian, Shengsheng
    Xu, Changsheng
    INTERNATIONAL JOURNAL OF MULTIMEDIA INFORMATION RETRIEVAL, 2022, 11 (04) : 589 - 597
  • [26] Graph Reasoning Transformers for Knowledge-Aware Question Answering
    Zhao, Ruilin
    Zhao, Feng
    Hu, Liang
    Xu, Guandong
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 17, 2024, : 19652 - 19660
  • [27] LET: Linguistic Knowledge Enhanced Graph Transformer for Chinese Short Text Matching
    Lyu, Boer
    Chen, Lu
    Zhu, Su
    Yu, Kai
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 13498 - 13506
  • [28] ScholarGraph: a Chinese Knowledge Graph of Chinese Scholars
    Wang, Shuo
    Hao, Zehui
    Meng, Xiaofeng
    Wang, Qiuyue
    PROCEEDINGS OF THE ELEVENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION (LREC 2018), 2018, : 583 - 586
  • [29] Diagnosis and Treatment Knowledge Graph Modeling Application Based on Chinese Medical Records
    Wang, Jianghan
    Qu, Zhu
    Hu, Yihan
    Ling, Qiyun
    Yu, Jingyi
    Jiang, Yushan
    ELECTRONICS, 2023, 12 (16)
  • [30] Relation Detection with Transformers for Panoptic Scene Graph Generation
    Liu, Chang
    Yan, Wenchao
    Chen, Shilin
    Huang, Liqun
    Huang, Xiaotao
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2024, PT IV, 2025, 15034 : 223 - 238