Suppr 超能文献


The Algorithmic Divide: A Systematic Review on AI-Driven Racial Disparities in Healthcare

Authors

Haider Syed Ali, Borna Sahar, Gomez-Cabello Cesar A, Pressman Sophia M, Haider Clifton R, Forte Antonio Jorge

Affiliations

Division of Plastic Surgery, Mayo Clinic, 4500 San Pablo Rd, Jacksonville, FL, 32224, USA.

Department of Physiology and Biomedical Engineering, Mayo Clinic, Rochester, MN, USA.

Publication

J Racial Ethn Health Disparities. 2024 Dec 18. doi: 10.1007/s40615-024-02237-0.

DOI: 10.1007/s40615-024-02237-0
PMID: 39695057
Abstract

INTRODUCTION

As artificial intelligence (AI) continues to permeate various sectors, concerns about disparities arising from its deployment have surfaced. AI's effectiveness correlates not only with the algorithm's quality but also with its training data's integrity. This systematic review investigates the racial disparities perpetuated by AI systems across diverse medical domains and the implications of deploying them, particularly in healthcare.

METHODS

Six electronic databases (PubMed, Scopus, IEEE, Google Scholar, EMBASE, and Cochrane) were systematically searched on October 3, 2023. Inclusion criteria were peer-reviewed English-language articles from 2013 to 2023 that examined instances of racial bias perpetuated by AI in healthcare. Studies conducted outside healthcare settings or addressing biases other than racial bias, as well as letters and opinion pieces, were excluded. Risk of bias was assessed using the CASP criteria for reviews and the Modified Newcastle Scale for observational studies.

RESULTS

Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, 1272 articles were initially identified, of which 26 met the eligibility criteria; four more were identified via snowballing, for a total of 30 articles in the analysis. The included studies indicate a significant association between AI utilization and the exacerbation of racial disparities, especially in minority populations, including Black and Hispanic patients. Biased data, algorithm design, unfair deployment of algorithms, and historic/systemic inequities were identified as the causes. Study limitations stem from heterogeneity among the included studies, which impeded broad comparisons and precluded meta-analysis.

CONCLUSION

To address racial disparities in healthcare outcomes, enhanced ethical considerations and regulatory frameworks are needed in AI healthcare applications. Comprehensive bias detection tools and mitigation strategies, coupled with active supervision by physicians, are essential to ensure AI becomes a tool for reducing racial disparities in healthcare outcomes.


Similar Articles

1
The Algorithmic Divide: A Systematic Review on AI-Driven Racial Disparities in Healthcare.
J Racial Ethn Health Disparities. 2024 Dec 18. doi: 10.1007/s40615-024-02237-0.
2
Folic acid supplementation and malaria susceptibility and severity among people taking antifolate antimalarial drugs in endemic areas.
Cochrane Database Syst Rev. 2022 Feb 1;2(2022):CD014217. doi: 10.1002/14651858.CD014217.
3
Bridging Health Disparities in the Data-Driven World of Artificial Intelligence: A Narrative Review.
J Racial Ethn Health Disparities. 2024 Jul 2. doi: 10.1007/s40615-024-02057-2.
4
Unmasking bias in artificial intelligence: a systematic review of bias detection and mitigation strategies in electronic health record-based models.
J Am Med Inform Assoc. 2024 Apr 19;31(5):1172-1183. doi: 10.1093/jamia/ocae060.
5
Artificial Intelligence in Thoracic Surgery: A Review Bridging Innovation and Clinical Practice for the Next Generation of Surgical Care.
J Clin Med. 2025 Apr 16;14(8):2729. doi: 10.3390/jcm14082729.
6
Unmasking bias in artificial intelligence: a systematic review of bias detection and mitigation strategies in electronic health record-based models.
ArXiv. 2024 Jul 1:arXiv:2310.19917v3.
7
Inherent Bias in Electronic Health Records: A Scoping Review of Sources of Bias.
medRxiv. 2024 Apr 12:2024.04.09.24305594. doi: 10.1101/2024.04.09.24305594.
8
The Impact of Artificial Intelligence on Health Equity in Oncology: Scoping Review.
J Med Internet Res. 2022 Nov 1;24(11):e39748. doi: 10.2196/39748.
9
Beyond the black stump: rapid reviews of health research issues affecting regional, rural and remote Australia.
Med J Aust. 2020 Dec;213 Suppl 11:S3-S32.e1. doi: 10.5694/mja2.50881.
10
Artificial intelligence for breast cancer detection and its health technology assessment: A scoping review.
Comput Biol Med. 2025 Jan;184:109391. doi: 10.1016/j.compbiomed.2024.109391. Epub 2024 Nov 22.

Cited By

1
A human rights approach to preventing racial discrimination in health care.
Bull World Health Organ. 2025 Sep 1;103(9):574-576. doi: 10.2471/BLT.25.293305. Epub 2025 Aug 13.
2
Targeting everyday decision makers in research: early career researcher and patient and public involvement and engagement collaboration in an AI-in-healthcare project.
Res Involv Engagem. 2025 Aug 19;11(1):100. doi: 10.1186/s40900-025-00753-9.
3
Advancements in artificial intelligence transforming medical education: a comprehensive overview.
Med Educ Online. 2025 Dec;30(1):2542807. doi: 10.1080/10872981.2025.2542807. Epub 2025 Aug 12.
4
AI-driven approaches in the management of early childhood caries: A path toward global oral health.
J Oral Biol Craniofac Res. 2025 Sep-Oct;15(5):1134-1140. doi: 10.1016/j.jobcr.2025.07.022. Epub 2025 Jul 29.
5
Synthetic Patient-Physician Conversations Simulated by Large Language Models: A Multi-Dimensional Evaluation.
Sensors (Basel). 2025 Jul 10;25(14):4305. doi: 10.3390/s25144305.
6
The illusion of safety: A report to the FDA on AI healthcare product approvals.
PLOS Digit Health. 2025 Jun 5;4(6):e0000866. doi: 10.1371/journal.pdig.0000866. eCollection 2025 Jun.
7
Facial Analysis for Plastic Surgery in the Era of Artificial Intelligence: A Comparative Evaluation of Multimodal Large Language Models.
J Clin Med. 2025 May 16;14(10):3484. doi: 10.3390/jcm14103484.
8
A Validity Analysis of Text-to-Image Generative Artificial Intelligence Models for Craniofacial Anatomy Illustration.
J Clin Med. 2025 Mar 21;14(7):2136. doi: 10.3390/jcm14072136.
9
Breaking Bones, Breaking Barriers: ChatGPT, DeepSeek, and Gemini in Hand Fracture Management.
J Clin Med. 2025 Mar 14;14(6):1983. doi: 10.3390/jcm14061983.
10
A critical look into artificial intelligence and healthcare disparities.
Front Artif Intell. 2025 Mar 6;8:1545869. doi: 10.3389/frai.2025.1545869. eCollection 2025.

References

1
AI and Ethics: A Systematic Review of the Ethical Considerations of Large Language Model Use in Surgery Research.
Healthcare (Basel). 2024 Apr 13;12(8):825. doi: 10.3390/healthcare12080825.
2
Using artificial intelligence on dermatology conditions in Uganda: a case for diversity in training data sets for machine learning.
Afr Health Sci. 2023 Jun;23(2):753-763. doi: 10.4314/ahs.v23i2.86.
3
Disparities in Travel-Related Barriers to Accessing Health Care From the 2017 National Household Travel Survey.
JAMA Netw Open. 2023 Jul 3;6(7):e2325291. doi: 10.1001/jamanetworkopen.2023.25291.
4
Addressing ethnic and global health inequalities in the era of artificial intelligence healthcare models: a call for responsible implementation.
J R Soc Med. 2023 Aug;116(8):260-262. doi: 10.1177/01410768231187734. Epub 2023 Jul 19.
5
Association of Biomarker-Based Artificial Intelligence With Risk of Racial Bias in Retinal Images.
JAMA Ophthalmol. 2023 Jun 1;141(6):543-552. doi: 10.1001/jamaophthalmol.2023.1310.
6
Addressing the Challenge of Biomedical Data Inequality: An Artificial Intelligence Perspective.
Annu Rev Biomed Data Sci. 2023 Aug 10;6:153-171. doi: 10.1146/annurev-biodatasci-020722-020704. Epub 2023 Apr 27.
7
Artificial intelligence for drug discovery: Resources, methods, and applications.
Mol Ther Nucleic Acids. 2023 Feb 18;31:691-702. doi: 10.1016/j.omtn.2023.02.019. eCollection 2023 Mar 14.
8
Sources of bias in artificial intelligence that perpetuate healthcare disparities-A global review.
PLOS Digit Health. 2022 Mar 31;1(3):e0000022. doi: 10.1371/journal.pdig.0000022. eCollection 2022 Mar.
9
A Call to Action on Assessing and Mitigating Bias in Artificial Intelligence Applications for Mental Health.
Perspect Psychol Sci. 2023 Sep;18(5):1062-1096. doi: 10.1177/17456916221134490. Epub 2022 Dec 9.
10
Computer science has a racism problem: these researchers want to fix it.
Nature. 2022 Oct;610(7932):440-443. doi: 10.1038/d41586-022-03251-0.