Beyond black-box models: explainable AI for embryo ploidy prediction and patient-centric consultation

Cited by: 2
Authors
Luong, Thi-My-Trang [1 ,2 ,3 ]
Ho, Nguyen-Tuong [3 ,4 ]
Hwu, Yuh-Ming [3 ]
Lin, Shyr-Yeu [3 ]
Ho, Jason Yen-Ping [3 ]
Wang, Ruey-Sheng [3 ]
Lee, Yi-Xuan [3 ]
Tan, Shun-Jen [3 ]
Lee, Yi-Rong [3 ]
Huang, Yung-Ling [3 ]
Hsu, Yi-Ching [3 ]
Le, Nguyen-Quoc-Khanh [2 ,5 ,6 ,7 ]
Tzeng, Chii-Ruey [3 ]
Affiliations
[1] Taipei Med Univ, Coll Med, Int Master Program Med, Taipei, Taiwan
[2] Taipei Med Univ, AIBioMed Res Grp, Taipei, Taiwan
[3] Taipei Fertil Ctr, Taipei, Taiwan
[4] My Duc Hosp, IVFMD, Ho Chi Minh, Vietnam
[5] Taipei Med Univ, Coll Med, Profess Master Program Artificial Intelligence Med, Taipei, Taiwan
[6] Taipei Med Univ, Res Ctr Artificial Intelligence Med, Taipei, Taiwan
[7] Taipei Med Univ Hosp, Translat Imaging Res Ctr, Taipei, Taiwan
Keywords
Embryo selection; Ploidy prediction; Explainable artificial intelligence; Machine learning; Preimplantation genetic testing; AGE;
DOI
10.1007/s10815-024-03178-7
Chinese Library Classification
Q3 [Genetics];
Discipline codes
071007; 090102;
Abstract
Purpose: To determine whether an explainable artificial intelligence (XAI) model enhances the accuracy and transparency of predicting embryo ploidy status from embryonic characteristics and clinical data.
Methods: This retrospective study used a dataset of 1908 blastocyst embryos, including ploidy status, morphokinetic features, morphology grades, and 11 clinical variables. Six machine learning (ML) models, namely Random Forest (RF), Linear Discriminant Analysis (LDA), Logistic Regression (LR), Support Vector Machine (SVM), AdaBoost (ADA), and Light Gradient-Boosting Machine (LGBM), were trained to predict ploidy status probabilities across three distinct datasets: high-grade embryos (HGE, n = 1107), low-grade embryos (LGE, n = 364), and all-grade embryos (AGE, n = 1471). Model performance was interpreted with XAI techniques, specifically SHapley Additive exPlanations (SHAP) and Local Interpretable Model-agnostic Explanations (LIME).
Results: The mean maternal age was 38.5 ± 3.85 years. The RF model outperformed the other five ML models, achieving an accuracy of 0.749 and an AUC of 0.808 for AGE. On the external test set, the RF model achieved an accuracy of 0.714 and an AUC of 0.750 (95% CI, 0.702-0.796). SHAP feature-impact analysis highlighted maternal age, paternal age, time to blastocyst (tB), and day 5 morphology grade as the variables with the greatest influence on the predictive model. In addition, LIME provided case-specific ploidy prediction probabilities, revealing the value the model assigned to each variable within a finite range.
Conclusion: The findings highlight the potential of XAI algorithms to enhance ploidy prediction, optimize embryo selection for patient-centric consultation, and provide reliable, transparent insight into the decision-making process.
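A minimal sketch of the kind of pipeline the abstract describes: a Random Forest ploidy classifier whose feature impact is then ranked. The data here is synthetic, the feature names are illustrative stand-ins for the study's clinical and morphokinetic variables, and scikit-learn's permutation importance is used as a simple substitute for the SHAP/LIME attribution the study actually applied.

```python
# Sketch: train a Random Forest "ploidy" classifier on synthetic data
# and rank feature impact (a stand-in for SHAP feature attribution).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
maternal_age = rng.normal(38.5, 3.85, n)        # mean age reported in the study
paternal_age = maternal_age + rng.normal(2, 3, n)
time_to_blastocyst = rng.normal(105, 8, n)      # hypothetical tB in hours
noise = rng.normal(0, 1, n)                     # uninformative control feature
X = np.column_stack([maternal_age, paternal_age, time_to_blastocyst, noise])

# Illustrative relation only: euploidy probability falls with maternal age.
p_euploid = 1 / (1 + np.exp(0.25 * (maternal_age - 38)))
y = (rng.random(n) < p_euploid).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Rank features by how much shuffling each one degrades test accuracy.
imp = permutation_importance(clf, X_te, y_te, n_repeats=10, random_state=0)
names = ["maternal_age", "paternal_age", "tB", "noise"]
for name, score in sorted(zip(names, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```

On this synthetic setup, maternal age dominates the ranking by construction; in the study, SHAP produced the analogous global ranking while LIME explained individual embryo predictions.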
Pages: 2349-2358
Page count: 10