Pleural effusion diagnosis using local interpretable model-agnostic explanations and convolutional neural network

Cited by: 0
Authors
Nguyen H.T. [1 ]
Nguyen C.N.T. [1 ]
Phan T.M.N. [1 ]
Dao T.C. [1 ]
Affiliations
[1] College of Information and Communication Technology (CICT), Can Tho University
Keywords
Artificial intelligence; Chest X-ray (CXR) images; Computer-aided diagnosis; Disease prediction; Model explanation; Pleural effusion;
DOI
10.5573/IEIESPC.2021.10.2.101
Abstract
The application of Artificial Intelligence (AI) in medicine has become a leading concern worldwide. AI-based systems not only support storing large amounts of data but also assist doctors in making diagnoses. In addition, deep learning has achieved numerous results that greatly support the development of image-based diagnostic methods. On the other hand, deep learning models still work as black boxes, which makes interpreting their output a challenge. Image-based diagnosis is currently a trend that plays a key role in clinical treatment by discovering abnormal regions for disease diagnosis. This paper proposes a computer-aided diagnosis system to support pleural effusion diagnosis based on Chest X-ray (CXR) images. This study investigated several shallow convolutional neural network architectures for classifying CXR images, as well as a technique for handling imbalanced data using oversampling. The best model from the experiments was chosen to generate explanations with Local Interpretable Model-agnostic Explanations (LIME), providing signals that support pleural effusion diagnosis. The proposed method is expected to make CXR images more informative for the pleural effusion diagnosis process. © 2021 The Institute of Electronics and Information Engineers
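The core idea behind LIME on images, as used in the abstract above, can be sketched as follows: switch superpixels of the image on and off, query the black-box classifier on each perturbed image, and fit a locally weighted linear model whose coefficients indicate which regions drive the prediction. A minimal illustrative sketch, not the authors' implementation: all names here are hypothetical, a plain grid stands in for real superpixel segmentation, and `predict_fn` stands in for the paper's CNN.

```python
import numpy as np

def lime_image_sketch(image, predict_fn, grid=4, n_samples=200,
                      kernel_width=0.25, seed=0):
    """Approximate per-superpixel importance with a weighted linear model."""
    rng = np.random.default_rng(seed)
    h, w = image.shape[:2]
    n_segments = grid * grid
    # Crude superpixels: a grid x grid partition (real LIME uses e.g. quickshift).
    seg = (np.arange(h)[:, None] * grid // h) * grid + (np.arange(w)[None, :] * grid // w)
    # Random binary masks: which superpixels are kept "on" in each sample.
    masks = rng.integers(0, 2, size=(n_samples, n_segments))
    preds = np.empty(n_samples)
    for i, z in enumerate(masks):
        on = z[seg]  # per-pixel on/off map
        perturbed = image * (on[..., None] if image.ndim == 3 else on)
        preds[i] = predict_fn(perturbed)
    # Proximity kernel: perturbations closer to the original image weigh more.
    dist = 1.0 - masks.mean(axis=1)
    weights = np.exp(-(dist ** 2) / kernel_width ** 2)
    # Weighted ridge regression: coefficients = superpixel importances.
    X = masks.astype(float)
    A = X.T @ (weights[:, None] * X) + 1e-3 * np.eye(n_segments)
    b = X.T @ (weights * preds)
    coef = np.linalg.solve(A, b)
    return seg, coef  # highlight superpixels with the largest coefficients
```

In practice one would use the `lime` Python package (`lime.lime_image.LimeImageExplainer`) with a proper segmentation algorithm such as quickshift or SLIC; this sketch only shows why the resulting coefficients localize the image regions that push the classifier toward a pleural effusion prediction.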
Pages: 101-108
Related papers
50 items in total
  • [41] Improving syngas yield and quality from biomass/coal co-gasification using cooperative game theory and local interpretable model-agnostic explanations
    Efremov, Cristina
    Le, Thanh Tuan
    Paramasivam, Prabhu
    Rudzki, Krzysztof
    Osman, Sameh Muhammad
    Chau, Thanh Hieu
    International Journal of Hydrogen Energy, 2024, 96 : 892 - 907
  • [42] Local interpretable model-agnostic explanations guided brain magnetic resonance imaging classification for identifying attention deficit hyperactivity disorder subtypes
    K. Usha Rupni
    P. Aruna Priya
    Journal of Ambient Intelligence and Humanized Computing, 2025, 16 (2) : 361 - 374
  • [43] ILIME: Local and Global Interpretable Model-Agnostic Explainer of Black-Box Decision
    ElShawi, Radwa
    Sherif, Youssef
    Al-Mallah, Mouaz
    Sakr, Sherif
    Advances in Databases and Information Systems, ADBIS 2019, 2019, 11695 : 53 - 68
  • [44] Responsible Music Genre Classification Using Interpretable Model-Agnostic Visual Explainers
    Sudi Murindanyi
    Kyamanywa Hamza
    Sulaiman Kagumire
    Ggaliwango Marvin
    SN Computer Science, 6 (1)
  • [45] Individualized help for at-risk students using model-agnostic and counterfactual explanations
    Smith, Bevan I.
    Chimedza, Charles
    Buhrmann, Jacoba H.
    Education and Information Technologies, 2022, 27 (02) : 1539 - 1558
  • [46] Constructing Interpretable Belief Rule Bases Using a Model-Agnostic Statistical Approach
    Sun, Chao
    Wang, Yinghui
    Yan, Tao
    Yang, Jinlong
    Huang, Liangyi
    IEEE Transactions on Fuzzy Systems, 2024, 32 (09) : 5163 - 5175
  • [48] Explainable machine learning techniques based on attention gate recurrent unit and local interpretable model-agnostic explanations for multivariate wind speed forecasting
    Peng, Lu
    Lv, Sheng-Xiang
    Wang, Lin
    Journal of Forecasting, 2024, 43 (06) : 2064 - 2087
  • [49] Interpretative analyses for milling surface roughness prediction in thermally modified timber: Shapley value (SHAP) and local interpretable model-agnostic explanations (LIME)
    Huang, Wenlan
    Jin, Qingyang
    Guo, Xiaolei
    Na, Bin
    Wood Material Science & Engineering, 2025
  • [50] Detection of COVID-19 findings by the local interpretable model-agnostic explanations method of types-based activations extracted from CNNs
    Togacar, Mesut
    Muzoglu, Nedim
    Ergen, Burhan
    Yarman, Bekir Siddik Binboga
    Halefoglu, Ahmet Mesrur
    Biomedical Signal Processing and Control, 2022, 71