

People are averse to machines making moral decisions.

Affiliations

Department of Psychology and Neuroscience, University of North Carolina at Chapel Hill, 235 E Cameron Ave, Chapel Hill, NC 27514, USA.


Publication information

Cognition. 2018 Dec;181:21-34. doi: 10.1016/j.cognition.2018.08.003. Epub 2018 Aug 11.

Abstract

Do people want autonomous machines making moral decisions? Nine studies suggest that the answer is 'no'-in part because machines lack a complete mind. Studies 1-6 find that people are averse to machines making morally-relevant driving, legal, medical, and military decisions, and that this aversion is mediated by the perception that machines can neither fully think nor feel. Studies 5-6 find that this aversion exists even when moral decisions have positive outcomes. Studies 7-9 briefly investigate three potential routes to increasing the acceptability of machine moral decision-making: limiting the machine to an advisory role (Study 7), increasing machines' perceived experience (Study 8), and increasing machines' perceived expertise (Study 9). Although some of these routes show promise, the aversion to machine moral decision-making is difficult to eliminate. This aversion may prove challenging for the integration of autonomous technology in moral domains including medicine, the law, the military, and self-driving vehicles.

