


Gender and ethnicity bias in generative artificial intelligence text-to-image depiction of pharmacists.

Affiliations

School of Dentistry and Medical Sciences, Charles Sturt University, Wagga Wagga, Australia.

Department of Radiology, Baylor College of Medicine, Houston, United States.

Publication

Int J Pharm Pract. 2024 Nov 14;32(6):524-531. doi: 10.1093/ijpp/riae049.

DOI: 10.1093/ijpp/riae049
PMID: 39228085
Abstract

INTRODUCTION

In Australia, 64% of pharmacists are women but continue to be under-represented. Generative artificial intelligence (AI) is potentially transformative but also has the potential for errors, misrepresentations, and bias. Generative AI text-to-image production using DALL-E 3 (OpenAI) is readily accessible and user-friendly but may reinforce gender and ethnicity biases.

METHODS

In March 2024, DALL-E 3 was utilized to generate individual and group images of Australian pharmacists. Collectively, 40 images were produced with DALL-E 3 for evaluation, of which 30 depicted individual characters and the remaining 10 comprised multiple characters (N = 155). All images were independently analysed by two reviewers for apparent gender, age, ethnicity, skin tone, and body habitus. Discrepancies in responses were resolved by third-observer consensus.

RESULTS

Collectively for DALL-E 3, 69.7% of pharmacists were depicted as men, 29.7% as women, 93.5% as a light skin tone, 6.5% as mid skin tone, and 0% as dark skin tone. The gender distribution was a statistically significant variation from that of actual Australian pharmacists (P < .001). Among the images of individual pharmacists, DALL-E 3 generated 100% as men and 100% were light skin tone.
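The abstract reports only that the depicted gender distribution differed significantly from the real workforce (P < .001) without naming the test, so the following is a hypothetical reconstruction assuming a chi-square goodness-of-fit test, with counts back-calculated from the reported percentages (155 depicted pharmacists; 69.7% men, 29.7% women, the remainder excluded as indeterminate) against the stated 64%-women workforce baseline:

```python
# Assumed counts inferred from the reported percentages: 155 characters,
# 69.7% men (~108) and 29.7% women (~46); ~1 indeterminate character excluded.
observed = {"men": 108, "women": 46}
total = sum(observed.values())

# Expected proportions from the real Australian workforce (64% women).
expected = {"men": 0.36 * total, "women": 0.64 * total}

# Chi-square goodness-of-fit statistic (1 degree of freedom for two categories).
chi2 = sum((observed[g] - expected[g]) ** 2 / expected[g] for g in observed)

# The critical value for P = .001 at df = 1 is 10.83; a statistic far above
# it is consistent with the reported P < .001.
print(round(chi2, 1))  # → 77.9
```

Under these assumed counts the statistic (~77.9) vastly exceeds the df = 1 critical value of 10.83, matching the direction and strength of the reported result; the authors' exact counts and test may differ.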

CONCLUSIONS

This evaluation reveals the gender and ethnicity bias associated with generative AI text-to-image generation using DALL-E 3 among Australian pharmacists. Generated images have a disproportionately high representation of white men as pharmacists which is not representative of the diversity of pharmacists in Australia today.


Similar Articles

1. Gender and ethnicity bias in generative artificial intelligence text-to-image depiction of pharmacists. Int J Pharm Pract. 2024 Nov 14;32(6):524-531. doi: 10.1093/ijpp/riae049.
2. Gender bias in text-to-image generative artificial intelligence depiction of Australian paramedics and first responders. Australas Emerg Care. 2025 Jun;28(2):103-109. doi: 10.1016/j.auec.2024.11.003. Epub 2024 Dec 2.
3. Gender and Ethnicity Bias of Text-to-Image Generative Artificial Intelligence in Medical Imaging, Part 1: Preliminary Evaluation. J Nucl Med Technol. 2024 Dec 4;52(4):356-359. doi: 10.2967/jnmt.124.268332.
4. Gender and Ethnicity Bias of Text-to-Image Generative Artificial Intelligence in Medical Imaging, Part 2: Analysis of DALL-E 3. J Nucl Med Technol. 2024 Oct 22. doi: 10.2967/jnmt.124.268359.
5. Ensuring Appropriate Representation in Artificial Intelligence-Generated Medical Imagery: Protocol for a Methodological Approach to Address Skin Tone Bias. JMIR AI. 2024 Nov 27;3:e58275. doi: 10.2196/58275.
6. Representation of intensivists' race/ethnicity, sex, and age by artificial intelligence: a cross-sectional study of two text-to-image models. Crit Care. 2024 Nov 11;28(1):363. doi: 10.1186/s13054-024-05134-4.
7. Artificial Intelligence Portrayals in Orthopaedic Surgery: An Analysis of Gender and Racial Diversity Using Text-to-Image Generators. J Bone Joint Surg Am. 2024 Dec 4;106(23):2278-2285. doi: 10.2106/JBJS.24.00150. Epub 2024 Jul 18.
8. Generative Artificial Intelligence Biases, Limitations and Risks in Nuclear Medicine: An Argument for Appropriate Use Framework and Recommendations. Semin Nucl Med. 2025 May;55(3):423-436. doi: 10.1053/j.semnuclmed.2024.05.005. Epub 2024 Jun 8.
9. Beyond the stereotypes: Artificial Intelligence image generation and diversity in anesthesiology. Front Artif Intell. 2024 Oct 9;7:1462819. doi: 10.3389/frai.2024.1462819. eCollection 2024.
10. Assessment of Generative Artificial Intelligence (AI) Models in Creating Medical Illustrations for Various Corneal Transplant Procedures. Cureus. 2024 Aug 26;16(8):e67833. doi: 10.7759/cureus.67833. eCollection 2024 Aug.

Cited By

1. Generative AI in healthcare: challenges to patient agency and ethical implications. Front Digit Health. 2025 Jun 18;7:1524553. doi: 10.3389/fdgth.2025.1524553. eCollection 2025.
2. Artificial intelligence (AI) in psychotherapy: A challenging frontier. Australas Psychiatry. 2025 Aug;33(4):629-632. doi: 10.1177/10398562251346075. Epub 2025 May 27.
3. Evaluating diversity and stereotypes amongst AI generated representations of healthcare providers. Front Digit Health. 2025 Apr 25;7:1537907. doi: 10.3389/fdgth.2025.1537907. eCollection 2025.