ENHANCING TONGUE REGION SEGMENTATION THROUGH SELF-ATTENTION AND TRANSFORMER BASED

Citations: 0
Authors
Song, Yihua [1 ,2 ]
Li, Can [1 ,2 ]
Zhang, Xia [1 ,2 ]
Liu, Zhen [3 ]
Song, Ningning [4 ]
Zhou, Zuojian [1 ,2 ]
Affiliations
[1] Nanjing Univ Chinese Med, Sch Artificial Intelligence & Informat Technol, Nanjing 210003, Peoples R China
[2] Nanjing Univ Chinese Med, Jiangsu Prov Engn Res Ctr TCM Intelligence Hlth Se, Nanjing, Peoples R China
[3] Nanjing Univ Chinese Med, Sch Med Humanities, Nanjing 210003, Peoples R China
[4] Nanjing First Hosp, Nanjing 210003, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Deep learning; transformer; harnessing self-attention; tongue segmentation;
DOI
10.1142/S0219519424400098
CLC Classification Number
Q6 [Biophysics];
Subject Classification Code
071011;
Abstract
As an essential component of traditional Chinese medicine (TCM) diagnosis, tongue diagnosis has faced limitations in clinical practice because of its subjectivity and its reliance on physician experience. Recent advances in deep learning have opened new possibilities for the automated analysis and diagnosis of tongue images. In this paper, we collected 500 tongue images from various patients, then preprocessed and annotated them to build the dataset used in this experiment. The project builds on the previously proposed segmentation method using Harnessing Self-Attention and Transformer, which is organized into three key stages: feature extraction, feature fusion, and segmentation prediction. By combining these three stages organically, the tongue region segmentation model is better equipped to handle complex tongue images and produces accurate segmentation results, reaching a Dice coefficient of 0.953. This is of significant importance for the automation and objectivity of tongue diagnosis in TCM.
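The record names the three stages but gives no architectural details. Below is a minimal, hypothetical PyTorch sketch of such a three-stage layout (convolutional feature extraction, transformer self-attention fusion over spatial tokens, upsampling prediction head). The class name TongueSegNet, the channel width, the head count, and every module choice are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class TongueSegNet(nn.Module):
    """Illustrative three-stage layout: feature extraction -> feature fusion
    -> segmentation prediction. Every module choice here is an assumption;
    the record does not specify the actual backbone or transformer config."""

    def __init__(self, channels: int = 64, num_heads: int = 4):
        super().__init__()
        # Stage 1: convolutional feature extraction (hypothetical backbone).
        self.extract = nn.Sequential(
            nn.Conv2d(3, channels, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Stage 2: feature fusion via transformer self-attention over
        # flattened spatial tokens.
        self.fuse = nn.TransformerEncoderLayer(
            d_model=channels, nhead=num_heads, batch_first=True)
        # Stage 3: segmentation prediction head upsampling back to input size.
        self.predict = nn.Sequential(
            nn.ConvTranspose2d(channels, channels, kernel_size=2, stride=2),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(channels, 1, kernel_size=2, stride=2),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.extract(x)                    # (B, C, H/4, W/4)
        b, c, h, w = f.shape
        tokens = f.flatten(2).transpose(1, 2)  # (B, H*W/16, C) spatial tokens
        tokens = self.fuse(tokens)             # self-attention feature fusion
        f = tokens.transpose(1, 2).reshape(b, c, h, w)
        return torch.sigmoid(self.predict(f))  # (B, 1, H, W) mask probability


# Example: a 256x256 RGB tongue image in, a per-pixel probability mask out.
mask = TongueSegNet()(torch.randn(1, 3, 256, 256))
```

Flattening the feature map into tokens lets standard self-attention relate distant pixels of the tongue region, which is the usual motivation for transformer-based fusion in segmentation.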
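The reported Dice coefficient of 0.953 measures overlap between the predicted tongue mask and the ground-truth annotation. A minimal sketch of the standard metric on binary masks follows; the paper's exact evaluation protocol (thresholding, per-image averaging) is not stated in this record.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, target: np.ndarray,
                     eps: float = 1e-7) -> float:
    """Dice overlap between a predicted binary mask and a ground-truth mask."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    # Dice = 2|P & T| / (|P| + |T|); eps guards against two empty masks.
    return float((2.0 * intersection + eps) / (pred.sum() + target.sum() + eps))


# Example: perfect overlap yields 1.0; the paper reports 0.953 on its dataset.
print(dice_coefficient(np.ones((4, 4)), np.ones((4, 4))))  # 1.0
```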
Pages: 11