Center for Psychological Sciences, Zhejiang University, 310063 Hangzhou, Zhejiang, China; Department of Psychology and Behavioral Sciences, Zhejiang University, 310030 Hangzhou, Zhejiang, China.
Center for Psychological Sciences, Zhejiang University, 310063 Hangzhou, Zhejiang, China.
Cognition. 2023 Oct;239:105575. doi: 10.1016/j.cognition.2023.105575. Epub 2023 Jul 28.
There is increasing interest in understanding human-machine differences in morality. Prior research relying on Trolley-like, moral-impersonal dilemmas suggests that people might apply similar norms to humans and machines yet judge their identical decisions differently. We examined the moral norms people impose on humans and robots (Study 1) and their moral judgments of the agents' decisions (Study 2) in the Trolley and Footbridge dilemmas. Participants imposed similar, utilitarian norms on both agents in the Trolley dilemma but different norms in the Footbridge dilemma, where fewer participants thought humans, compared with robots, should take action in this moral-personal dilemma. Unlike previous research, we observed a norm-judgment symmetry: prospective norms aligned with retrospective judgments. Decisions that were more strongly required were judged as more moral across agents and dilemmas. We discuss the theoretical implications for machine morality.