Can artificial intelligence (AI) educate your patient? A study to assess overall readability and pharmacists' perception of AI-generated patient education materials

Cited by: 1
Authors
Armstrong, Drew [1 ]
Paul, Caroline [2 ]
McGlaughlin, Brent [1 ]
Hill, David [1 ]
Affiliations
[1] Reg One Hlth, 877 Jefferson Ave, Memphis, TN 38103 USA
[2] UTHSC Coll Pharm, Memphis, TN USA
Keywords
artificial intelligence; disease; education; patient education; pharmaceutical preparations; reporting qualitative research
DOI
10.1002/jac5.2006
Chinese Library Classification
R9 [Pharmacy]
Discipline code
1007
Abstract
Introduction: Pharmacists are critical in providing safe and accurate education to patients on disease states and medications. Artificial intelligence (AI) can generate patient education materials rapidly, potentially saving healthcare resources. However, the accuracy of these materials, and pharmacists' comfort with them, need to be assessed.
Objective: The purpose of this study was to assess the accuracy, readability, and likelihood of use of AI-generated patient education materials for ten common medications and disease states.
Methods: AI (Chat Generative Pre-Trained Transformer [ChatGPT] v3.5; OpenAI, San Francisco, CA) was used to create patient education materials for the following medications or disease states: apixaban, continuous glucose monitoring (CGM), the Dietary Approaches to Stop Hypertension (DASH) diet, enoxaparin, hypertension, hypoglycemia, myocardial infarction, naloxone, semaglutide, and warfarin. The prompt "Write a patient education material for …", with each medication or disease state appended, was entered into ChatGPT. A similar prompt, "Write a patient education material for … at a 6th-grade reading level or lower", was then completed for the same medications and disease states. Ten clinical pharmacists were asked to review each educational material, record the time the review took, make clinical and grammatical edits, and rate both their confidence in the clinical accuracy of the materials and the likelihood that they would use them with their patients. The materials were also assessed for readability using the Flesch-Kincaid readability score.
Results: A total of 8 pharmacists completed both sets of reviews, for a total of 16 patient education materials assessed. There was no statistically significant difference in any pharmacist assessment between the two prompts. Overall confidence in accuracy was fair, and the overall readability score of the AI-generated materials decreased from 11.65 to 5.87 with the 6th-grade prompt (p < .001).
Conclusion: AI-generated patient education materials show promise in clinical practice; however, further validation of their clinical accuracy continues to be a burden. It is important to ensure that the readability of patient education materials is at an appropriate level to increase the likelihood of patient understanding.
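The Flesch-Kincaid grade level used in the study is computed from average sentence length and average syllables per word. A minimal Python sketch of the standard formula, using a naive vowel-group heuristic for syllable counting (published readability tools use more careful syllabification, so scores will differ slightly):

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count runs of consecutive vowels as syllables.
    # Real readability tools use dictionary-based syllable counts.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text: str) -> float:
    # Split on terminal punctuation for sentences, letter runs for words.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula:
    # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)
```

Short, plain sentences score at or below early grade levels, while dense clinical prose scores far higher, which is the shift (11.65 to 5.87) the 6th-grade prompt produced in this study.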
Pages: 803-808
Page count: 6