Should Doctor Robot possess moral empathy?

Author information

Sirgiovanni Elisabetta

Affiliation

Section of History of Medicine and Bioethics, Department of Molecular Medicine, Sapienza University of Rome, Rome, Italy.

Publication information

Bioethics. 2025 Jan;39(1):98-107. doi: 10.1111/bioe.13345. Epub 2024 Aug 24.

Abstract

Critics of clinical artificial intelligence (AI) suggest that the technology is ethically harmful because it may lead to the dehumanization of the doctor-patient relationship (DPR) by eliminating moral empathy, which is viewed as a distinctively human trait. The benefits of clinical empathy (that is, moral empathy applied in the clinical context) are widely praised, but this praise is often unquestioning and lacks context. In this article, I will argue that criticisms of clinical AI based on appeals to empathy are misplaced. As psychological and philosophical research has shown, empathy leads to certain types of biased reasoning and choices. These biases of empathy consistently impact the DPR. Empathy may lead to partial judgments and asymmetric DPRs, as well as disparities in the treatment of patients, undermining respect for patient autonomy and equality. Engineers should consider the flaws of empathy when designing affective artificial systems in the future. The nature of sympathy and compassion (i.e., displaying emotional concern while maintaining some balanced distance) has been defended by some ethicists as more beneficial than perspective-taking in the clinical context. However, these claims do not seem to have impacted the AI debate. Thus, this article will also argue that if machines are programmed for affective behavior, they should also be given some ethical scaffolding.

Similar articles

2
Critiquing the Reasons for Making Artificial Moral Agents.
Sci Eng Ethics. 2019 Jun;25(3):719-735. doi: 10.1007/s11948-018-0030-8. Epub 2018 Feb 19.
3
Building Moral Robots: Ethical Pitfalls and Challenges.
Sci Eng Ethics. 2020 Feb;26(1):141-157. doi: 10.1007/s11948-019-00084-5. Epub 2019 Jan 30.
5
The desired moral attitude of the physician: (I) empathy.
Med Health Care Philos. 2012 May;15(2):103-13. doi: 10.1007/s11019-011-9366-4.
6
Compassion, reason, and moral judgment.
Camb Q Healthc Ethics. 1995 Fall;4(4):466-75. doi: 10.1017/s0963180100006290.
7
The care perspective and autonomy.
Med Health Care Philos. 2001;4(3):289-94. doi: 10.1023/a:1012048907443.
9
Your pain is not mine: A critique of clinical empathy.
Bioethics. 2022 Jun;36(5):486-493. doi: 10.1111/bioe.12980. Epub 2021 Dec 12.