Bennahum, David A.
Camb Q Healthc Ethics. 2020 Apr;29(2):327-329. doi: 10.1017/S0963180119001117.
How can an individual's moral compass address the question of whether to help a patient shorten and end his or her life? A moral compass has been defined as the set of values and experiences that guides each individual's decisions and conduct in relation to others and to society. Can a robot be programmed to have a moral compass? If we were considering only rules of conduct, then perhaps yes, that would be possible. We could establish a series of rules and sanctions that a computer-assisted robot could rigorously apply to any violation. The state and many religions already do that, and many individuals are quite comfortable with rigorous, unbendable rules. Most rules, however, have exceptions, so perhaps the robots of the future can be designed to be flexible, that is, human.