Cameron David, Collins Emily C, de Saille Stevienna, Eimontaite Iveta, Greenwood Alice, Law James
Information School, University of Sheffield, Sheffield, S10 2TN UK.
Institute of Experiential Robotics, Northeastern University, Boston, MA 02115 USA.
Int J Soc Robot. 2024;16(6):1405-1418. doi: 10.1007/s12369-023-01048-3. Epub 2023 Sep 13.
There is an increasing interest in considering, measuring, and implementing trust in human-robot interaction (HRI). New avenues in this field include identifying social means for robots to influence trust, and identifying social aspects of trust such as perceptions of robots' integrity, sincerity, or even benevolence. However, questions remain regarding robots' authenticity in obtaining trust through social means and their capacity to increase such experiences through social interaction with users. We propose that the dyadic model of HRI misses a key complexity: a robot's trustworthiness may be contingent on the user's relationship with, and opinion of, the individual or organisation deploying the robot (termed here, Deployer). We present a three-part case study on researching HRI and a LEGO Serious Play focus group on care robotics to indicate how Users' trust towards the Deployer can affect trust towards robots and robotic research. Our Social Triad model (User, Robot, Deployer) offers novel avenues for exploring trust in a social context.
The online version contains supplementary material available at 10.1007/s12369-023-01048-3.