

Machines and humans in sacrificial moral dilemmas: Required similarly but judged differently?

Affiliations

Center for Psychological Sciences, Zhejiang University, 310063 Hangzhou, Zhejiang, China; Department of Psychology and Behavioral Sciences, Zhejiang University, 310030 Hangzhou, Zhejiang, China.

Center for Psychological Sciences, Zhejiang University, 310063 Hangzhou, Zhejiang, China.

Publication information

Cognition. 2023 Oct;239:105575. doi: 10.1016/j.cognition.2023.105575. Epub 2023 Jul 28.

Abstract

There is increasing interest in understanding human-machine differences in morality. Prior research relying on Trolley-like, moral-impersonal dilemmas suggests that people may apply similar norms to humans and machines yet judge their identical decisions differently. We examined the moral norms people impose on humans and robots (Study 1) and the moral judgments of their decisions (Study 2) in the Trolley and Footbridge dilemmas. Participants imposed similar, utilitarian norms on both agents in the Trolley dilemma but different norms in the Footbridge dilemma, where fewer participants thought humans (versus robots) should take action in this moral-personal scenario. Unlike previous research, we observed a norm-judgment symmetry: prospective norms aligned with retrospective judgments. The decision that was more strongly required was also judged more moral, across agents and dilemmas. We discuss the theoretical implications for machine morality.

