Evaluation of Explainable Deep Learning Methods for Ophthalmic Diagnosis

Cited by: 23
Authors
Singh, Amitojdeep [1 ,2 ]
Balaji, Janarthanam Jothi [3 ]
Rasheed, Mohammed Abdul [1 ]
Jayakumar, Varadharajan [1 ]
Raman, Rajiv [4 ]
Lakshminarayanan, Vasudevan [1 ,2 ]
Affiliations
[1] Univ Waterloo, Sch Optometry & Vis Sci, Theoret & Expt Epistemol Lab TEEL, Waterloo, ON, Canada
[2] Univ Waterloo, Dept Syst Design Engn, Waterloo, ON, Canada
[3] Med Res Fdn, Dept Optometry, Chennai, Tamil Nadu, India
[4] Sankara Nethralaya, Shri Bhagwan Mahavir Vitreoretinal Serv, Chennai, Tamil Nadu, India
Source
CLINICAL OPHTHALMOLOGY | 2021, Vol. 15
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
explainable AI; deep learning; machine learning; image processing; optical coherence tomography; retina; diabetic macular edema; choroidal neovascularization; drusen;
DOI
10.2147/OPTH.S312236
Chinese Library Classification (CLC)
R77 [Ophthalmology];
Discipline Code
100212;
Abstract
Background: The lack of explanations for the decisions made by deep learning algorithms has hampered their acceptance by the clinical community despite highly accurate results on multiple problems. Attribution methods that explain deep learning models have been tested on medical imaging problems. The performance of various attribution methods has been compared for models trained on standard machine learning datasets, but not on medical images. In this study, we performed a comparative analysis to determine the method with the best explanations for retinal OCT diagnosis.
Methods: A well-known deep learning model, Inception-v3, was trained to diagnose three retinal diseases: choroidal neovascularization (CNV), diabetic macular edema (DME), and drusen. The explanations from 13 different attribution methods were rated by a panel of 14 clinicians for clinical significance. Feedback was obtained from the clinicians regarding the current and future scope of such methods.
Results: An attribution method based on Taylor series expansion, called Deep Taylor, was rated highest by the clinicians, with a median rating of 3.85/5. It was followed by Guided Backpropagation (GBP) and SHapley Additive exPlanations (SHAP).
Conclusion: Explanations from the top methods were able to highlight the relevant structures for each disease: fluid accumulation for CNV, the boundaries of edema for DME, and bumpy areas of the retinal pigment epithelium (RPE) for drusen. The most suitable method for a specific medical diagnosis task may differ from the one considered best for conventional tasks. Overall, there was a high degree of acceptance from the clinicians surveyed in the study.
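To make the described workflow concrete, the sketch below shows how an attribution map could be generated for a trained OCT classifier. It is only an illustration under stated assumptions: it assumes a torchvision Inception-v3 fine-tuned on OCT B-scans (the checkpoint path, image file name, and class count are hypothetical) and uses the captum library. Because Deep Taylor, the top-rated method in the study, is not implemented in captum, Guided Backpropagation, the second-ranked method, is shown instead.

```python
# Minimal sketch: attribution map for an OCT classifier (PyTorch + captum).
# Assumptions: a fine-tuned Inception-v3 checkpoint "oct_inception_v3.pt",
# an input B-scan "example_oct_bscan.png", and 3 disease classes (CNV, DME, drusen).
import torch
from torchvision import models, transforms
from captum.attr import GuidedBackprop
from PIL import Image

NUM_CLASSES = 3  # hypothetical: CNV, DME, drusen

# Build the backbone and replace the classification heads to match the checkpoint.
model = models.inception_v3(weights=None, aux_logits=True)
model.fc = torch.nn.Linear(model.fc.in_features, NUM_CLASSES)
model.AuxLogits.fc = torch.nn.Linear(model.AuxLogits.fc.in_features, NUM_CLASSES)
model.load_state_dict(torch.load("oct_inception_v3.pt", map_location="cpu"))
model.eval()  # in eval mode the forward pass returns plain logits

# Inception-v3 expects 299x299 RGB inputs; OCT B-scans are grayscale,
# so the single channel is replicated to three channels.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((299, 299)),
    transforms.ToTensor(),
])
img = preprocess(Image.open("example_oct_bscan.png")).unsqueeze(0)  # (1, 3, 299, 299)

# Predict the class, then attribute that prediction back to the input pixels.
pred_class = model(img).argmax(dim=1).item()
gbp = GuidedBackprop(model)
attribution = gbp.attribute(img, target=pred_class)  # same shape as the input

# Collapse the channel dimension into a single saliency map for overlay on the B-scan.
saliency = attribution.squeeze(0).abs().sum(dim=0)
print(saliency.shape)  # torch.Size([299, 299])
```

In practice the saliency map would be normalized and overlaid on the original B-scan so clinicians can judge whether the highlighted regions correspond to disease markers such as fluid pockets, edema boundaries, or RPE elevations, which is the kind of assessment the panel in this study performed.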
Pages: 2573-2581 (9 pages)