Evaluation of Explainable Deep Learning Methods for Ophthalmic Diagnosis

Cited by: 23
Authors
Singh, Amitojdeep [1 ,2 ]
Balaji, Janarthanam Jothi [3 ]
Rasheed, Mohammed Abdul [1 ]
Jayakumar, Varadharajan [1 ]
Raman, Rajiv [4 ]
Lakshminarayanan, Vasudevan [1 ,2 ]
Institutions
[1] Univ Waterloo, Sch Optometry & Vis Sci, Theoret & Expt Epistemol Lab TEEL, Waterloo, ON, Canada
[2] Univ Waterloo, Dept Syst Design Engn, Waterloo, ON, Canada
[3] Med Res Fdn, Dept Optometry, Chennai, Tamil Nadu, India
[4] Sankara Nethralaya, Shri Bhagwan Mahavir Vitreoretinal Serv, Chennai, Tamil Nadu, India
Source
CLINICAL OPHTHALMOLOGY | 2021, Vol. 15
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
explainable AI; deep learning; machine learning; image processing; optical coherence tomography; retina; diabetic macular edema; choroidal neovascularization; drusen;
DOI
10.2147/OPTH.S312236
Chinese Library Classification
R77 [Ophthalmology];
Discipline code
100212;
Abstract
Background: The lack of explanations for decisions made by deep learning algorithms has hampered their acceptance by the clinical community, despite highly accurate results on multiple problems. Attribution methods that explain deep learning models have been tested on medical imaging problems. The performance of various attribution methods has been compared for models trained on standard machine learning datasets, but not on medical images. In this study, we performed a comparative analysis to determine the method providing the best explanations for retinal OCT diagnosis.
Methods: A well-known deep learning model, Inception-v3, was trained to diagnose three retinal conditions: choroidal neovascularization (CNV), diabetic macular edema (DME), and drusen. The explanations from 13 different attribution methods were rated by a panel of 14 clinicians for clinical significance, and feedback was obtained from the clinicians regarding the current and future scope of such methods.
Results: An attribution method based on Taylor series expansion, called Deep Taylor, was rated highest by the clinicians, with a median rating of 3.85/5. It was followed by guided backpropagation (GBP) and SHapley Additive exPlanations (SHAP).
Conclusion: Explanations from the top methods highlighted the relevant structures for each disease: fluid accumulation for CNV, the boundaries of edema for DME, and bumpy areas of the retinal pigment epithelium (RPE) for drusen. The most suitable method for a specific medical diagnosis task may differ from the one considered best for conventional tasks. Overall, there was a high degree of acceptance from the clinicians surveyed in the study.
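The attribution methods compared in the abstract (Deep Taylor, guided backpropagation, SHAP) all assign per-pixel relevance to a model's prediction. As a minimal illustrative sketch only, and not the authors' implementation (which used Inception-v3 on OCT scans), guided backpropagation can be shown on a hypothetical one-hidden-layer ReLU network in plain NumPy; the `guided_backprop` helper and all weights below are invented for illustration:

```python
import numpy as np

def guided_backprop(x, W1, W2):
    """Attribution for a toy network y = W2 @ relu(W1 @ x).

    Guided backpropagation (Springenberg et al., 2014): when backpropagating
    through a ReLU, keep only gradient components that are positive AND that
    flow through units which were active in the forward pass.
    """
    z = W1 @ x                  # pre-activations of the hidden layer
    a = np.maximum(z, 0.0)      # ReLU forward pass
    y = W2 @ a                  # network outputs; attribute output unit 0
    g = W2[0]                   # dy[0]/da, the gradient reaching the ReLU
    g = g * (z > 0) * (g > 0)   # guided ReLU: mask inactive units and negative grads
    attribution = W1.T @ g      # per-input relevance under the guided rule
    return y[0], attribution
```

With `x = [1, 1]`, `W1 = [[1, 0], [0, -1]]`, and `W2 = [[1, 1]]`, the second hidden unit is inactive (`z = [1, -1]`), so its gradient is masked and only the first input receives relevance. On a real CNN the same masking is applied at every ReLU layer during the backward pass, which is why the resulting saliency maps tend to be sharper than raw gradients.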
Pages: 2573-2581
Page count: 9
Related papers
50 in total
  • [41] Explainable brain age prediction: a comparative evaluation of morphometric and deep learning pipelines
    De Bonis, Maria Luigia Natalia
    Fasano, Giuseppe
    Lombardi, Angela
    Ardito, Carmelo
    Ferrara, Antonio
    Di Sciascio, Eugenio
    Di Noia, Tommaso
    BRAIN INFORMATICS, 2024, 11 (01)
  • [42] Transparency in Diagnosis: Unveiling the Power of Deep Learning and Explainable AI for Medical Image Interpretation
    Garg, Priya
    Sharma, M. K.
    Kumar, Parteek
    ARABIAN JOURNAL FOR SCIENCE AND ENGINEERING, 2025
  • [43] Improving trust and confidence in medical skin lesion diagnosis through explainable deep learning
    Metta, Carlo
    Beretta, Andrea
    Guidotti, Riccardo
    Yin, Yuan
    Gallinari, Patrick
    Rinzivillo, Salvatore
    Giannotti, Fosca
    INTERNATIONAL JOURNAL OF DATA SCIENCE AND ANALYTICS, 2023
  • [44] Explainable Deep Learning Reproduces a 'Professional Eye' on the Diagnosis of Internal Disorders in Persimmon Fruit
    Akagi, Takashi
    Onishi, Masanori
    Masuda, Kanae
    Kuroki, Ryohei
    Baba, Kohei
    Takeshita, Kouki
    Suzuki, Tetsuya
    Niikawa, Takeshi
    Uchida, Seiichi
    Ise, Takeshi
    PLANT AND CELL PHYSIOLOGY, 2020, 61 (11) : 1967 - 1973
  • [45] A Hybrid System Based on Bayesian Networks and Deep Learning for Explainable Mental Health Diagnosis
    Pavez, Juan
    Allende, Hector
    APPLIED SCIENCES-BASEL, 2024, 14 (18)
  • [46] An Explainable Deep-Learning Model to Aid in the Diagnosis of Age Related Macular Degeneration
    Herrero-Tudela, Maria
    Romero-Oraa, Roberto
    Hornero, Roberto
    Gutierrez-Tobal, Gonzalo C.
    Lopez, Maria, I
    Garcia, Maria
    9TH EUROPEAN MEDICAL AND BIOLOGICAL ENGINEERING CONFERENCE, VOL 1, EMBEC 2024, 2024, 112 : 85 - 94
  • [47] Advancing Ovarian Cancer Diagnosis Through Deep Learning and eXplainable AI: A Multiclassification Approach
    Radhakrishnan, Meera
    Sampathila, Niranjana
    Muralikrishna, H.
    Swathi, K. S.
    IEEE ACCESS, 2024, 12 : 116968 - 116986
  • [48] Editorial for special issue on explainable and generalizable deep learning methods for medical image computing
    Wang, Guotai
    Zhang, Shaoting
    Huang, Xiaolei
    Vercauteren, Tom
    Metaxas, Dimitris
    MEDICAL IMAGE ANALYSIS, 2023, 84
  • [49] Evaluating Explainable AI Methods in Deep Learning Models for Early Detection of Cerebral Palsy
    Pellano, Kimji N.
    Strumke, Inga
    Groos, Daniel
    Adde, Lars
    Ihlen, Espen F. Alexander
    IEEE ACCESS, 2025, 13 : 10126 - 10138
  • [50] Quantitative evaluation of Saliency-Based Explainable artificial intelligence (XAI) methods in Deep Learning-Based mammogram analysis
    Cerekci, Esma
    Alis, Deniz
    Denizoglu, Nurper
    Camurdan, Ozden
    Seker, Mustafa Ege
    Ozer, Caner
    Hansu, Muhammed Yusuf
    Tanyel, Toygar
    Oksuz, Ilkay
    Karaarslan, Ercan
    EUROPEAN JOURNAL OF RADIOLOGY, 2024, 173