
Eyeblink Synchrony in Multimodal Human-Android Interaction.

Affiliations

Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka, 560-8531, Japan.

JST ERATO Ishiguro Symbiotic Human Robot Interaction Project, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka, 560-8531, Japan.

Publication Information

Sci Rep. 2016 Dec 23;6:39718. doi: 10.1038/srep39718.

Abstract

As a result of recent progress in communication robot technology, robots are becoming important social partners for humans. Behavioral synchrony is understood to be an important factor in establishing good human-robot relationships. In this study, we hypothesized that biasing a human's attitude toward a robot changes the degree of synchrony between the human and the robot. We first examined whether eyeblinks were synchronized between a human and an android in face-to-face interaction and found that human listeners' eyeblinks were entrained to the android speaker's eyeblinks. This eyeblink synchrony disappeared when the android speaker spoke while looking away from the human listeners, but was enhanced when the human participants listened to the speaking android while touching the android's hand. These results suggest that eyeblink synchrony reflects a qualitative state of human-robot interactions.
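For readers who want a concrete sense of how eyeblink entrainment can be quantified, the sketch below bins blink onset times and computes a lagged cross-correlation between the android speaker's and the human listener's blink series; a peak at a small positive lag indicates that the listener's blinks tend to follow the speaker's. The function names, the 100 ms bin width, and the cross-correlation approach are illustrative assumptions, not necessarily the analysis used in the paper.

```python
# Illustrative sketch (not the paper's published analysis): quantifying
# eyeblink synchrony as a lagged cross-correlation between the android
# speaker's and the human listener's binned blink time series.
import numpy as np

def blink_series(onsets, duration, bin_width=0.1):
    """Turn blink onset times (seconds) into a binary time series of bins."""
    n_bins = int(np.ceil(duration / bin_width))
    series = np.zeros(n_bins)
    idx = np.clip((np.asarray(onsets) / bin_width).astype(int), 0, n_bins - 1)
    series[idx] = 1.0
    return series

def blink_cross_correlation(speaker_onsets, listener_onsets, duration,
                            bin_width=0.1, max_lag_s=1.0):
    """Approximate normalized cross-correlation over lags up to max_lag_s.

    A peak at a small positive lag means listener blinks tend to follow
    speaker blinks, i.e. the listener is entrained to the speaker.
    """
    spk = blink_series(speaker_onsets, duration, bin_width)
    lis = blink_series(listener_onsets, duration, bin_width)
    spk = (spk - spk.mean()) / (spk.std() + 1e-12)
    lis = (lis - lis.mean()) / (lis.std() + 1e-12)
    max_lag = int(max_lag_s / bin_width)
    lags = np.arange(-max_lag, max_lag + 1)
    corr = np.array([
        np.mean(spk[:len(spk) - k] * lis[k:]) if k >= 0
        else np.mean(spk[-k:] * lis[:len(lis) + k])
        for k in lags
    ])
    return lags * bin_width, corr

# Hypothetical example: the listener blinks roughly 0.3 s after the speaker.
speaker_blinks = [2.0, 5.5, 9.1, 14.0]
listener_blinks = [2.3, 5.8, 9.4, 14.3]
lags, corr = blink_cross_correlation(speaker_blinks, listener_blinks, duration=20.0)
print("peak lag (s):", round(float(lags[np.argmax(corr)]), 2))
```

The bin width and ±1 s lag window here are arbitrary choices; in practice an observed peak would also be compared against surrogate (shuffled) blink data to rule out chance coincidences.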

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6073/5180175/1154d94c4a6c/srep39718-f1.jpg
