Biases in Artificial Intelligence Application in Pain Medicine
Times Cited: 0
Authors:
Jumreornvong, Oranicha [1]; Perez, Aliza M. [1]; Malave, Brian [1]; Mozawalla, Fatimah [1]; Kia, Arash [2]; Nwaneshiudu, Chinwe A. [2,3]
Affiliations:
[1] Icahn Sch Med Mt Sinai, Dept Human Performance & Rehabil, New York, NY 10029 USA
[2] Icahn Sch Med Mt Sinai, Dept Anesthesiol Perioperat & Pain Med, New York, NY USA
[3] Icahn Sch Med Mt Sinai, Ctr Dis Neurogenom, New York, NY USA
Source:
Keywords:
pain; artificial intelligence; biases; race; gender; socioeconomic status; statistical biases
DOI:
10.2147/JPR.S495934
CLC number:
R74 [Neurology and Psychiatry]
Discipline classification number:
Abstract:
Artificial Intelligence (AI) has the potential to optimize personalized treatment tools and enhance clinical decision-making. However, biases in AI, arising from sex, race, socioeconomic status (SES), and statistical methods, can exacerbate disparities in pain management. This narrative review examines these biases and proposes strategies to mitigate them. A comprehensive literature search across databases such as PubMed, Google Scholar, and PsycINFO focused on AI applications in pain management and sources of biases. Sex and racial biases often stem from societal stereotypes, underrepresentation of females, overrepresentation of European ancestry patients in clinical trials, and unequal access to treatment caused by systemic racism, leading to inaccurate pain assessments and misrepresentation in clinical data. SES biases reflect differential access to healthcare resources and incomplete data for lower SES individuals, resulting in larger prediction errors. Statistical biases, including sampling and measurement biases, further affect the reliability of AI algorithms. To ensure equitable healthcare delivery, this review recommends employing specific fairness-aware techniques such as reweighting algorithms, adversarial debiasing, and other methods that adjust training data to minimize bias. Additionally, leveraging diverse perspectives, including insights from patients, clinicians, policymakers, and interdisciplinary collaborators, can enhance the development of fair and interpretable AI systems. Continuous monitoring and inclusive collaboration are essential for addressing biases and harnessing AI's potential to improve pain management outcomes across diverse populations.
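The reweighting the abstract refers to can be illustrated with a minimal sketch. This is not code from the paper; it assumes a Kamiran-Calders-style reweighing scheme, in which each training example receives the weight P(group) x P(label) / P(group, label), so that demographic group and outcome label become statistically independent in the weighted data. The `reweigh` function and the toy data below are hypothetical.

```python
from collections import Counter


def reweigh(groups, labels):
    """Kamiran-Calders-style reweighing: weight each example by
    P(group) * P(label) / P(group, label), estimated from the data.
    Under-represented (group, label) combinations get weights > 1."""
    n = len(labels)
    group_counts = Counter(groups)
    label_counts = Counter(labels)
    joint_counts = Counter(zip(groups, labels))
    return [
        (group_counts[g] / n) * (label_counts[y] / n) / (joint_counts[(g, y)] / n)
        for g, y in zip(groups, labels)
    ]


# Toy data: group "a" is under-represented among positive labels,
# so its positive examples are up-weighted.
groups = ["a", "a", "a", "b", "b", "b", "b", "b"]
labels = [1, 0, 0, 1, 1, 1, 0, 0]
weights = reweigh(groups, labels)  # weights[0] = 1.5 for the rare ("a", 1) pair
```

The resulting weights can be passed to any learner that accepts per-sample weights (e.g. a `sample_weight` argument); a convenient property of this scheme is that the weights sum to the number of examples.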
Pages: 1021-1033
Page count: 13