Dietrich Manuel, Krüger Matti, Weisswange Thomas H
Honda Research Institute Europe GmbH, Offenbach, Germany.
Honda Research Institute Japan Co Ltd., Saitama, Japan.
Front Robot AI. 2023 Dec 15;10:1236733. doi: 10.3389/frobt.2023.1236733. eCollection 2023.
For robots to become integrated into our daily environment, they must be designed to earn sufficient trust from both users and bystanders. This is particularly important for social robots, including those that assume the role of a mediator, working toward positively shaping relationships and interactions between individuals. One crucial factor influencing trust is the appropriate handling of personal information. Previous research on privacy has focused on data collection, secure storage, and abstract third-party disclosure risks. However, robot mediators may face situations where disclosing private information about one person to another specific person appears necessary. It is not clear if, how, and to what extent robots should share private information between people. This study presents an online investigation into appropriate robotic disclosure strategies. Using a vignette design, we presented participants with written descriptions of situations in which a social robot reveals personal information about its owner to support pro-social human-human interaction. Participants were asked to choose the most appropriate robot behaviors, which differed in the level of information disclosure. We aimed to explore the effects of disclosure context, such as the relationship to the other person and the content of the disclosed information. The findings indicate that both the information content and relationship configurations significantly influence the perception of appropriate behavior but are not the sole determinants of disclosure-adequacy perception. The results also suggest that the expected benefits of disclosure and individual general privacy attitudes serve as additional influential factors. These insights can inform the design of future mediating robots, enabling them to make more privacy-appropriate decisions that could foster trust and acceptance.