Department of Dermatology, University of Michigan, Ann Arbor, MI, USA.
Clin Exp Dermatol. 2024 Jun 25;49(7):715-718. doi: 10.1093/ced/llad456.
ChatGPT is a free artificial intelligence (AI)-based natural language processing tool that generates complex responses to inputs from users.
To determine whether ChatGPT is able to generate high-quality responses to patient-submitted questions in the patient portal.
Patient-submitted questions and the corresponding responses from their dermatology physician were extracted from the electronic medical record for analysis. The questions were input into ChatGPT (version 3.5) and the outputs extracted for analysis, with manual removal of verbiage pertaining to ChatGPT's inability to provide medical advice. Ten blinded reviewers (seven physicians and three nonphysicians) rated and selected their preference in terms of 'overall quality', 'readability', 'accuracy', 'thoroughness' and 'level of empathy' of the physician- and ChatGPT-generated responses.
Thirty-one messages and responses were analysed. Physician-generated responses were vastly preferred over the ChatGPT-generated responses by both the physician and nonphysician reviewers, and received significantly higher ratings for 'readability' and 'level of empathy'.
The results of this study suggest that physician-generated responses to patients' portal messages are still preferred over those generated by ChatGPT, but generative AI tools may be helpful in drafting initial responses and providing information on educational resources for patients.