Michael P. Cary Jr.
Anna Zink, University of Chicago, Chicago, Illinois.
Health Aff (Millwood). 2023 Oct;42(10):1359-1368. doi: 10.1377/hlthaff.2023.00553.
In August 2022 the Department of Health and Human Services (HHS) issued a notice of proposed rulemaking prohibiting covered entities, which include health care providers and health plans, from discriminating against individuals when using clinical algorithms in decision making. However, HHS did not provide specific guidelines on how covered entities should prevent discrimination. We conducted a scoping review of literature published during the period 2011-22 to identify health care applications, frameworks, reviews and perspectives, and assessment tools that identify and mitigate bias in clinical algorithms, with a specific focus on racial and ethnic bias. Our scoping review encompassed 109 articles comprising 45 empirical health care applications that included tools tested in health care settings, 16 frameworks, and 48 reviews and perspectives. We identified a wide range of technical, operational, and systemwide bias mitigation strategies for clinical algorithms, but there was no consensus in the literature on a single best practice that covered entities could employ to meet the HHS requirements. Future research should identify optimal bias mitigation methods for various scenarios, depending on factors such as patient population, clinical setting, algorithm design, and types of bias to be addressed.