Biases in Artificial Intelligence Application in Pain Medicine

Cited: 0
Authors
Jumreornvong, Oranicha [1]
Perez, Aliza M. [1]
Malave, Brian [1]
Mozawalla, Fatimah [1]
Kia, Arash [2]
Nwaneshiudu, Chinwe A. [2,3]
Affiliations
[1] Icahn Sch Med Mt Sinai, Dept Human Performance & Rehabil, New York, NY 10029 USA
[2] Icahn Sch Med Mt Sinai, Dept Anesthesiol Perioperat & Pain Med, New York, NY USA
[3] Icahn Sch Med Mt Sinai, Ctr Dis Neurogenom, New York, NY USA
Source
JOURNAL OF PAIN RESEARCH | 2025, Vol. 18
Keywords
pain; artificial intelligence; biases; race; gender; socioeconomic status; statistical biases
DOI
10.2147/JPR.S495934
CLC Classification
R74 [Neurology and Psychiatry]
Abstract
Artificial Intelligence (AI) has the potential to optimize personalized treatment tools and enhance clinical decision-making. However, biases in AI, arising from sex, race, socioeconomic status (SES), and statistical methods, can exacerbate disparities in pain management. This narrative review examines these biases and proposes strategies to mitigate them. A comprehensive literature search across databases such as PubMed, Google Scholar, and PsycINFO focused on AI applications in pain management and sources of biases. Sex and racial biases often stem from societal stereotypes, underrepresentation of females, overrepresentation of patients of European ancestry in clinical trials, and unequal access to treatment caused by systemic racism, leading to inaccurate pain assessments and misrepresentation in clinical data. SES biases reflect differential access to healthcare resources and incomplete data for lower-SES individuals, resulting in larger prediction errors. Statistical biases, including sampling and measurement biases, further affect the reliability of AI algorithms. To ensure equitable healthcare delivery, this review recommends employing specific fairness-aware techniques such as reweighting algorithms, adversarial debiasing, and other methods that adjust training data to minimize bias. Additionally, leveraging diverse perspectives, including insights from patients, clinicians, policymakers, and interdisciplinary collaborators, can enhance the development of fair and interpretable AI systems. Continuous monitoring and inclusive collaboration are essential for addressing biases and harnessing AI's potential to improve pain management outcomes across diverse populations.
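The reweighting approach mentioned in the abstract is commonly implemented by giving each training sample a weight inversely proportional to the frequency of its demographic group, so that underrepresented groups contribute equally to the training loss. The sketch below is a minimal illustration of that idea; the function name and the toy cohort are hypothetical and not taken from the reviewed paper.

```python
from collections import Counter

def inverse_frequency_weights(groups):
    """Assign each sample a weight inversely proportional to the
    frequency of its demographic group. With weight = n / (k * count[g]),
    every group's total weight is equal (n / k), so a majority group no
    longer dominates a weighted training loss."""
    counts = Counter(groups)
    n, k = len(groups), len(counts)
    return [n / (k * counts[g]) for g in groups]

# Hypothetical imbalanced cohort: 8 samples from group "A", 2 from "B".
groups = ["A"] * 8 + ["B"] * 2
weights = inverse_frequency_weights(groups)
# Each "A" sample gets 10/(2*8) = 0.625; each "B" sample gets 10/(2*2) = 2.5,
# so both groups contribute a total weight of 5.0.
```

Such per-sample weights can be passed to most learners (for example, via a `sample_weight` argument in scikit-learn estimators); adversarial debiasing, by contrast, requires a second model trained to predict the protected attribute.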
Pages: 1021-1033
Page count: 13