

Misguided Artificial Intelligence: How Racial Bias is Built Into Clinical Models

Author

Jindal Atin

Affiliation

Division of Hospital Medicine The Miriam Hospital, Lifespan Health System, Warren Alpert Brown School of Medicine, Providence, RI.

Publication

Brown J Hosp Med. 2022 Sep 5;2(1):38021. doi: 10.56305/001c.38021. eCollection 2023.

DOI: 10.56305/001c.38021
PMID: 40046549
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11878858/
Abstract

Artificial Intelligence is being used today to solve a myriad of problems. While there is significant promise that AI can help us address many healthcare issues, there is also concern that health inequities can be exacerbated. This article looks specifically at predictive models in regards to racial bias. Each phase of the model building process including raw data collection and processing, data labelling, and implementation of the model can be subject to racial bias. This article aims to explore some of the ways in which this occurs.


Figure: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1050/11878858/f1260f7c0e97/bhm_2023_2_1_38021_98436.jpg

Similar Articles

1. Misguided Artificial Intelligence: How Racial Bias is Built Into Clinical Models.
   Brown J Hosp Med. 2022 Sep 5;2(1):38021. doi: 10.56305/001c.38021. eCollection 2023.
2. Call for algorithmic fairness to mitigate amplification of racial biases in artificial intelligence models used in orthodontics and craniofacial health.
   Orthod Craniofac Res. 2023 Dec;26 Suppl 1:124-130. doi: 10.1111/ocr.12721. Epub 2023 Oct 17.
3. Artificial Intelligence to Promote Racial and Ethnic Cardiovascular Health Equity.
   Curr Cardiovasc Risk Rep. 2024 Nov;18(11):153-162. doi: 10.1007/s12170-024-00745-6. Epub 2024 Aug 20.
4. Artificial intelligence in gastroenterology and hepatology: how to advance clinical practice while ensuring health equity.
   Gut. 2022 Sep;71(9):1909-1915. doi: 10.1136/gutjnl-2021-326271. Epub 2022 Jun 10.
5. The bias algorithm: how AI in healthcare exacerbates ethnic and racial disparities - a scoping review.
   Ethn Health. 2025 Feb;30(2):197-214. doi: 10.1080/13557858.2024.2422848. Epub 2024 Nov 3.
6. Building competency in artificial intelligence and bias mitigation for nurse scientists and aligned health researchers.
   Nurs Outlook. 2025 May-Jun;73(3):102395. doi: 10.1016/j.outlook.2025.102395. Epub 2025 May 2.
7. Utilizing Artificial Intelligence to Enhance Health Equity Among Patients with Heart Failure.
   Heart Fail Clin. 2022 Apr;18(2):259-273. doi: 10.1016/j.hfc.2021.11.001. Epub 2022 Mar 4.
8. Evaluating machine learning model bias and racial disparities in non-small cell lung cancer using SEER registry data.
   Health Care Manag Sci. 2024 Dec;27(4):631-649. doi: 10.1007/s10729-024-09691-6. Epub 2024 Nov 4.
9. Bridging Health Disparities in the Data-Driven World of Artificial Intelligence: A Narrative Review.
   J Racial Ethn Health Disparities. 2024 Jul 2. doi: 10.1007/s40615-024-02057-2.
10. Human-Centered Design to Address Biases in Artificial Intelligence.
    J Med Internet Res. 2023 Mar 24;25:e43251. doi: 10.2196/43251.

Cited By

1. Role and Use of Race in Artificial Intelligence and Machine Learning Models Related to Health.
   J Med Internet Res. 2025 Jul 31;27:e73996. doi: 10.2196/73996.

References

1. The Risk of Coding Racism into Pediatric Sepsis Care: The Necessity of Antiracism in Machine Learning.
   J Pediatr. 2022 Aug;247:129-132. doi: 10.1016/j.jpeds.2022.04.024. Epub 2022 Apr 22.
2. Utilizing Artificial Intelligence to Enhance Health Equity Among Patients with Heart Failure.
   Heart Fail Clin. 2022 Apr;18(2):259-273. doi: 10.1016/j.hfc.2021.11.001. Epub 2022 Mar 4.
3. Health inequities and the inappropriate use of race in nephrology.
   Nat Rev Nephrol. 2022 Feb;18(2):84-94. doi: 10.1038/s41581-021-00501-8. Epub 2021 Nov 8.
4. Measuring Structural Racism: A Guide for Epidemiologists and Other Health Researchers.
   Am J Epidemiol. 2022 Mar 24;191(4):539-547. doi: 10.1093/aje/kwab239.
5. Ethical Machine Learning in Healthcare.
   Annu Rev Biomed Data Sci. 2021 Jul;4:123-144. doi: 10.1146/annurev-biodatasci-092820-114757. Epub 2021 May 6.
6. Bias and fairness assessment of a natural language processing opioid misuse classifier: detection and mitigation of electronic health record data disadvantages across racial subgroups.
   J Am Med Inform Assoc. 2021 Oct 12;28(11):2393-2403. doi: 10.1093/jamia/ocab148.
7. Artificial intelligence, bias, and patients' perspectives.
   Lancet. 2021 May 29;397(10289):2038. doi: 10.1016/S0140-6736(21)01152-1.
8. Unintentional consequences of artificial intelligence in dermatology for patients with skin of colour.
   Clin Exp Dermatol. 2021 Oct;46(7):1333-1334. doi: 10.1111/ced.14726. Epub 2021 May 31.
9. The risk of racial bias while tracking influenza-related content on social media using machine learning.
   J Am Med Inform Assoc. 2021 Mar 18;28(4):839-849. doi: 10.1093/jamia/ocaa326.
10. Racialized algorithms for kidney function: Erasing social experience.
    Soc Sci Med. 2021 Jan;268:113548. doi: 10.1016/j.socscimed.2020.113548. Epub 2020 Nov 23.