Division of Pulmonary and Critical Care Medicine, Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, Michigan, USA
Weil Institute for Critical Care Research and Innovation, Ann Arbor, Michigan, USA
Annu Rev Med. 2023 Jan 27;74:401-412. doi: 10.1146/annurev-med-043021-024004. Epub 2022 Jul 28.
Understanding how biases originate in medical technologies, and developing safeguards to identify, mitigate, and remove their harms, is essential to ensuring equitable performance across all individuals. Drawing on examples from pulmonary medicine, this article describes how bias can be introduced through the physical design of a technology, through unrepresentative training or calibration data, or through conflation of biological and social determinants of health, and how it can then be perpetuated by inadequate evaluation and regulatory standards. Research demonstrates that pulse oximeters perform differently depending on patient race and ethnicity. Pulmonary function testing and algorithms used to predict healthcare needs are two further examples of medical technologies whose racial and ethnic biases may perpetuate health disparities.