Biases in Artificial Intelligence Application in Pain Medicine

Cited: 0
|
Authors
Jumreornvong, Oranicha [1 ]
Perez, Aliza M. [1 ]
Malave, Brian [1 ]
Mozawalla, Fatimah [1 ]
Kia, Arash [2 ]
Nwaneshiudu, Chinwe A. [2 ,3 ]
Affiliations
[1] Icahn Sch Med Mt Sinai, Dept Human Performance & Rehabil, New York, NY 10029 USA
[2] Icahn Sch Med Mt Sinai, Dept Anesthesiol Perioperat & Pain Med, New York, NY USA
[3] Icahn Sch Med Mt Sinai, Ctr Dis Neurogenom, New York, NY USA
Source
JOURNAL OF PAIN RESEARCH | 2025 / Vol. 18
Keywords
pain; artificial intelligence; biases; race; gender; socioeconomic status; statistical biases
DOI
10.2147/JPR.S495934
CLC classification
R74 [Neurology and Psychiatry]
Subject classification
Abstract
Artificial Intelligence (AI) has the potential to optimize personalized treatment tools and enhance clinical decision-making. However, biases in AI, arising from sex, race, socioeconomic status (SES), and statistical methods, can exacerbate disparities in pain management. This narrative review examines these biases and proposes strategies to mitigate them. A comprehensive literature search across databases such as PubMed, Google Scholar, and PsycINFO focused on AI applications in pain management and sources of bias. Sex and racial biases often stem from societal stereotypes, underrepresentation of females, overrepresentation of patients of European ancestry in clinical trials, and unequal access to treatment caused by systemic racism, leading to inaccurate pain assessments and misrepresentation in clinical data. SES biases reflect differential access to healthcare resources and incomplete data for lower-SES individuals, resulting in larger prediction errors. Statistical biases, including sampling and measurement biases, further undermine the reliability of AI algorithms. To ensure equitable healthcare delivery, this review recommends fairness-aware techniques such as reweighting algorithms, adversarial debiasing, and other methods that adjust training data to minimize bias. Additionally, leveraging diverse perspectives, including insights from patients, clinicians, policymakers, and interdisciplinary collaborators, can enhance the development of fair and interpretable AI systems. Continuous monitoring and inclusive collaboration are essential for addressing biases and harnessing AI's potential to improve pain management outcomes across diverse populations.
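The reweighting technique named in the abstract can be illustrated with a minimal sketch. This assumes the standard Kamiran-Calders reweighing scheme (one common fairness-aware preprocessing method, not necessarily the one used in the reviewed studies); the group labels and outcomes below are a hypothetical toy dataset:

```python
from collections import Counter

def reweighing_weights(groups, labels):
    """Kamiran-Calders reweighing: weight each sample by
    P(group) * P(label) / P(group, label), so that in the
    weighted training data the protected group and the outcome
    label are statistically independent."""
    n = len(groups)
    group_counts = Counter(groups)            # counts per protected group
    label_counts = Counter(labels)            # counts per outcome label
    joint_counts = Counter(zip(groups, labels))  # joint (group, label) counts
    return [
        (group_counts[g] / n) * (label_counts[y] / n) / (joint_counts[(g, y)] / n)
        for g, y in zip(groups, labels)
    ]

# Hypothetical example: group "A" is overrepresented among
# positive outcomes in the training data, so its positive
# samples are down-weighted and its negatives up-weighted.
groups = ["A", "A", "A", "A", "B", "B"]
labels = [1, 1, 1, 0, 1, 0]
weights = reweighing_weights(groups, labels)
```

The resulting weights can then be passed to any learner that accepts per-sample weights (e.g. a `sample_weight` argument), biasing training toward a population in which group membership carries no information about the outcome.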
Pages: 1021-1033 (13 pages)