


Comparing the quality of ChatGPT- and physician-generated responses to patients' dermatology questions in the electronic medical record.

Affiliations

Department of Dermatology, University of Michigan, Ann Arbor, MI, USA.

Publication

Clin Exp Dermatol. 2024 Jun 25;49(7):715-718. doi: 10.1093/ced/llad456.

DOI:10.1093/ced/llad456
PMID:38180108
Abstract

BACKGROUND

ChatGPT is a free artificial intelligence (AI)-based natural language processing tool that generates complex responses to inputs from users.

OBJECTIVES

To determine whether ChatGPT is able to generate high-quality responses to patient-submitted questions in the patient portal.

METHODS

Patient-submitted questions and the corresponding responses from their dermatology physician were extracted from the electronic medical record for analysis. The questions were input into ChatGPT (version 3.5) and the outputs extracted for analysis, with manual removal of verbiage pertaining to ChatGPT's inability to provide medical advice. Ten blinded reviewers (seven physicians and three nonphysicians) rated and selected their preference in terms of 'overall quality', 'readability', 'accuracy', 'thoroughness' and 'level of empathy' of the physician- and ChatGPT-generated responses.
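The blinded comparison described above can be sketched as a simple tally: each reviewer picks a preferred response per message and scores both responses on each criterion. A minimal illustration, assuming ratings are kept as per-source score lists; all names and values below are invented for illustration and are not data from the study:

```python
# Hypothetical sketch of tallying blinded reviewer preferences.
# Toy data: 5 illustrative preference picks and 1-5 scores per source
# (not the study's actual ratings).
from collections import Counter

preferences = ["physician", "physician", "chatgpt", "physician", "physician"]
scores = {
    "physician": [5, 4, 4, 5, 4],
    "chatgpt": [3, 3, 4, 2, 3],
}

def summarize(preferences, scores):
    """Return preference counts and the mean score per response source."""
    counts = Counter(preferences)
    means = {src: sum(vals) / len(vals) for src, vals in scores.items()}
    return counts, means

counts, means = summarize(preferences, scores)
print(counts["physician"])           # 4
print(round(means["chatgpt"], 1))    # 3.0
```

In the study itself, ten reviewers rated five criteria per message pair; the same tally would simply be repeated per criterion and per reviewer group (physician vs. nonphysician).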

RESULTS

Thirty-one messages and responses were analysed. Physician-generated responses were vastly preferred over the ChatGPT-generated responses by the physician and nonphysician reviewers and received significantly higher ratings for 'readability' and 'level of empathy'.

CONCLUSIONS

The results of this study suggest that physician-generated responses to patients' portal messages are still preferred over ChatGPT, but generative AI tools may be helpful in generating the first drafts of responses and providing information on education resources for patients.


Similar Articles

1
Leveraging large language models for generating responses to patient messages-a subjective analysis.
J Am Med Inform Assoc. 2024 May 20;31(6):1367-1379. doi: 10.1093/jamia/ocae052.
2
"Doctor ChatGPT, Can You Help Me?" The Patient's Perspective: Cross-Sectional Study.
J Med Internet Res. 2024 Oct 1;26:e58831. doi: 10.2196/58831.
3
Generative artificial intelligence chatbots may provide appropriate informational responses to common vascular surgery questions by patients.
Vascular. 2025 Feb;33(1):229-237. doi: 10.1177/17085381241240550. Epub 2024 Mar 18.
4
ChatGPT vs. neurologists: a cross-sectional study investigating preference, satisfaction ratings and perceived empathy in responses among people living with multiple sclerosis.
J Neurol. 2024 Jul;271(7):4057-4066. doi: 10.1007/s00415-024-12328-x. Epub 2024 Apr 3.
5
Quality of Answers of Generative Large Language Models Versus Peer Users for Interpreting Laboratory Test Results for Lay Patients: Evaluation Study.
J Med Internet Res. 2024 Apr 17;26:e56655. doi: 10.2196/56655.
6
Assessing Artificial Intelligence-Generated Responses to Urology Patient In-Basket Messages.
Urol Pract. 2024 Sep;11(5):793-798. doi: 10.1097/UPJ.0000000000000637. Epub 2024 Jun 24.
7
A Multidisciplinary Assessment of ChatGPT's Knowledge of Amyloidosis: Observational Study.
JMIR Cardio. 2024 Apr 19;8:e53421. doi: 10.2196/53421.
8
Performance of ChatGPT on the Chinese Postgraduate Examination for Clinical Medicine: Survey Study.
JMIR Med Educ. 2024 Feb 9;10:e48514. doi: 10.2196/48514.
9
ChatGPT's performance in German OB/GYN exams - paving the way for AI-enhanced medical education and clinical practice.
Front Med (Lausanne). 2023 Dec 13;10:1296615. doi: 10.3389/fmed.2023.1296615. eCollection 2023.

Cited By

1
The Role of ChatGPT in Dermatology Diagnostics.
Diagnostics (Basel). 2025 Jun 16;15(12):1529. doi: 10.3390/diagnostics15121529.
2
Comparison of Multiple State-of-the-Art Large Language Models for Patient Education Prior to CT and MRI Examinations.
J Pers Med. 2025 Jun 5;15(6):235. doi: 10.3390/jpm15060235.
3
Assessing the Impact of ChatGPT in Dermatology: A Comprehensive Rapid Review.
J Clin Med. 2024 Oct 3;13(19):5909. doi: 10.3390/jcm13195909.
4
Advancing Psoriasis Care through Artificial Intelligence: A Comprehensive Review.
Curr Dermatol Rep. 2024 Sep;13(3):141-147. doi: 10.1007/s13671-024-00434-y. Epub 2024 Jun 13.
5
Generative Large Language Models in Electronic Health Records for Patient Care Since 2023: A Systematic Review.
medRxiv. 2024 Aug 19:2024.08.11.24311828. doi: 10.1101/2024.08.11.24311828.
6
Natural language processing in dermatology: A systematic literature review and state of the art.
J Eur Acad Dermatol Venereol. 2024 Dec;38(12):2225-2234. doi: 10.1111/jdv.20286. Epub 2024 Aug 16.