Evaluating ChatGPT's Ability to Answer Common Patient Questions Regarding Hip Fracture

Cited by: 3
Authors
Wrenn, Sean P. [1]
Mika, Aleksander P. [1]
Ponce, Robert B. [1]
Mitchell, Phillip M. [1]
Affiliations
[1] Vanderbilt Univ Sch Med, Dept Orthopaed Surg, Nashville, TN 37232 USA
Keywords
HEALTH INFORMATION;
DOI
10.5435/JAAOS-D-23-00877
Chinese Library Classification
R826.8 [Plastic Surgery]; R782.2 [Oral and Maxillofacial Plastic Surgery]; R726.2 [Pediatric Plastic Surgery]; R62 [Plastic Surgery (Reconstructive Surgery)]
Subject Classification Code
Abstract
Introduction: ChatGPT is an artificial intelligence chatbot programmed for conversational applications using reinforcement learning techniques. Given its growing popularity and versatility, ChatGPT's applications are likely to expand into health care, especially as patients increasingly use it to research their injuries. The purpose of this study was to investigate ChatGPT's ability to accurately answer frequently asked questions regarding hip fractures.
Methods: Eleven frequently asked questions regarding hip fractures were posed to ChatGPT, and the responses were recorded in full. Five of these questions were deemed high-yield based on the likelihood that a patient would ask them of a chatbot. Five fellowship-trained orthopaedic trauma surgeons analyzed the chatbot's responses for quality and accuracy using an evidence-based approach. Each answer was rated as "Excellent response requiring no clarification," "Satisfactory response requiring minimal clarification," "Satisfactory response requiring moderate clarification," or "Unsatisfactory response requiring significant clarification."
Results: Of the five high-yield questions posed to the chatbot, none was rated unsatisfactory requiring significant clarification. The responses were either satisfactory requiring minimal clarification (n = 3) or satisfactory requiring moderate clarification (n = 2).
Discussion: The chatbot generally provided unbiased, evidence-based answers that would be clearly understood by most orthopaedic patients. These findings suggest that ChatGPT has the potential to be an effective patient education tool, especially as the application continues to grow and improve.
Level of Evidence: Level IV study.
Pages: 656-659
Page count: 4
Related Articles
50 records in total
  • [1] Evaluating if ChatGPT Can Answer Common Patient Questions Compared With OrthoInfo Regarding Rotator Cuff Tears
    Jurayj, Alexander
    Nerys-Figueroa, Julio
    Espinal, Emil
    Gaudiani, Michael A.
    Baes, Travis
    Mahylis, Jared
    Muh, Stephanie
    JOURNAL OF THE AMERICAN ACADEMY OF ORTHOPAEDIC SURGEONS GLOBAL RESEARCH AND REVIEWS, 2025, 9 (03)
  • [2] Assessing ChatGPT Responses to Common Patient Questions Regarding Total Hip Arthroplasty
    Mika, Aleksander P.
    Martin, J. Ryan
    Engstrom, Stephen M.
    Polkowski, Gregory G.
    Wilson, Jacob M.
    JOURNAL OF BONE AND JOINT SURGERY-AMERICAN VOLUME, 2023, 105 (19): 1519-1526
  • [3] Evaluating the Accuracy of ChatGPT in Common Patient Questions Regarding HPV+ Oropharyngeal Carcinoma
    Bellamkonda, Nikhil
    Farlow, Janice L.
    Haring, Catherine T.
    Sim, Michael W.
    Seim, Nolan B.
    Cannon, Richard B.
    Monroe, Marcus M.
    Agrawal, Amit
    Rocco, James W.
    McCrary, Hilary C.
    ANNALS OF OTOLOGY RHINOLOGY AND LARYNGOLOGY, 2024, 133 (09): 814-819
  • [4] Can ChatGPT answer patient questions regarding reverse shoulder arthroplasty?
    Lack, Benjamin T.
    Mouhawasse, Edwin
    Childers, Justin T.
    Jackson, Garrett R.
    Daji, Shay V.
    Yerke-Hansen, Payton
    Familiari, Filippo
    Knapik, Derrick M.
    Sabesan, Vani J.
    JOURNAL OF ISAKOS JOINT DISORDERS & ORTHOPAEDIC SPORTS MEDICINE, 2024, 9 (06)
  • [5] Can ChatGPT Answer Patient Questions Regarding Total Knee Arthroplasty?
    Mika, Aleksander P.
    Mulvey, Hillary E.
    Engstrom, Stephen M.
    Polkowski, Gregory G.
    Martin, J. Ryan
    Wilson, Jacob M.
    JOURNAL OF KNEE SURGERY, 2024, 37 (09): 664-673
  • [6] Evaluating ChatGPT ability to answer urinary tract infection-related questions
    Cakir, Hakan
    Caglar, Ufuk
    Sekkeli, Sami
    Zerdali, Esra
    Sarilar, Omer
    Yildiz, Oguzhan
    Ozgor, Faruk
    INFECTIOUS DISEASES NOW, 2024, 54 (04)
  • [7] Evaluating ChatGPT's performance in answering common patient questions on cervical cancer
    Do, Anthony
    Li, Andrew
    Smith, Haller
    Chambers, Laura
    Esselen, Kate
    Liang, Margaret
    GYNECOLOGIC ONCOLOGY, 2024, 190: S376
  • [8] Appraisal of ChatGPT's responses to common patient questions regarding Tommy John surgery
    Shaari, Ariana L.
    Fano, Adam N.
    Anakwenze, Oke
    Klifto, Christopher
    SHOULDER & ELBOW, 2024, 16 (04): 429-435
  • [9] ChatGPT's ability to comprehend and answer cirrhosis related questions in Arabic
    Samaan, Jamil S.
    Yeo, Yee Hui
    Ng, Wee Han
    Ting, Peng-Sheng
    Trivedi, Hirsh
    Vipani, Aarshi
    Yang, Ju Dong
    Liran, Omer
    Spiegel, Brennan
    Kuo, Alexander
    Ayoub, Walid S.
    ARAB JOURNAL OF GASTROENTEROLOGY, 2023, 24 (03): 145-148
  • [10] ChatGPT's ability to comprehend and answer cirrhosis related questions: Comment
    Daungsupawong, Hinpetch
    Wiwanitkit, Viroj
    ARAB JOURNAL OF GASTROENTEROLOGY, 2024, 25 (01): 74