Suppr 超能文献

I Reach Faster When I See You Look: Gaze Effects in Human-Human and Human-Robot Face-to-Face Cooperation.

Affiliations

Robot Cognition Laboratory, SBRI INSERM U846, Université de Lyon, Lyon, France.

Publication

Front Neurorobot. 2012 May 3;6:3. doi: 10.3389/fnbot.2012.00003. eCollection 2012.

DOI: 10.3389/fnbot.2012.00003
PMID: 22563315
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC3342577/
Abstract

Human-human interaction in natural environments relies on a variety of perceptual cues. Humanoid robots are becoming increasingly refined in their sensorimotor capabilities, and thus should now be able to manipulate and exploit these social cues in cooperation with their human partners. Previous studies have demonstrated that people follow human and robot gaze, and that it can help them to cope with spatially ambiguous language. Our goal is to extend these findings into the domain of action, to determine how human and robot gaze can influence the speed and accuracy of human action. We report on results from a human-human cooperation experiment demonstrating that an agent's vision of her/his partner's gaze can significantly improve that agent's performance in a cooperative task. We then implement a heuristic capability to generate such gaze cues by a humanoid robot that engages in the same cooperative interaction. The subsequent human-robot experiments demonstrate that a human agent can indeed exploit the predictive gaze of their robot partner in a cooperative task. This allows us to render the humanoid robot more human-like in its ability to communicate with humans. The long term objectives of the work are thus to identify social cooperation cues, and to validate their pertinence through implementation in a cooperative robot. The current research provides the robot with the capability to produce appropriate speech and gaze cues in the context of human-robot cooperation tasks. Gaze is manipulated in three conditions: Full gaze (coordinated eye and head), eyes hidden with sunglasses, and head fixed. We demonstrate the pertinence of these cues in terms of statistical measures of action times for humans in the context of a cooperative task, as gaze significantly facilitates cooperation as measured by human response times.
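The abstract reports that gaze was manipulated in three conditions (full gaze, eyes hidden by sunglasses, head fixed) and that its effect was assessed through statistical measures of human action times. A minimal sketch of how per-condition response times might be summarized is shown below; the numbers are synthetic illustrations chosen to mirror the reported direction of the effect (faster responses with full gaze), not the paper's data.

```python
# Illustrative only: synthetic per-trial response times (ms) for the three
# gaze conditions named in the abstract. Values are invented for this sketch.
from statistics import mean, stdev

rts = {
    "full_gaze":  [412, 398, 405, 421, 390, 408],
    "sunglasses": [455, 470, 448, 462, 471, 458],
    "head_fixed": [468, 482, 459, 475, 490, 466],
}

def summarize(samples):
    """Return (mean, standard deviation) of one condition's response times."""
    return mean(samples), stdev(samples)

for condition, samples in rts.items():
    m, s = summarize(samples)
    print(f"{condition:>10}: mean={m:.1f} ms, sd={s:.1f} ms")
```

In an actual analysis one would replace the descriptive summary with an inferential test (e.g., an ANOVA over conditions) on the measured trial data; this sketch only shows the shape of the per-condition comparison the abstract describes.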


Figures (fnbot-06-00003, g001–g009):

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0b46/3342577/ce7d311bb377/fnbot-06-00003-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0b46/3342577/e548e2b6ac7a/fnbot-06-00003-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0b46/3342577/ec199ede9fb1/fnbot-06-00003-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0b46/3342577/d3bab27ac579/fnbot-06-00003-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0b46/3342577/8d5bf062a2e0/fnbot-06-00003-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0b46/3342577/c6fcbb3ca5f9/fnbot-06-00003-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0b46/3342577/8c162956bf8d/fnbot-06-00003-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0b46/3342577/980b497ced35/fnbot-06-00003-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0b46/3342577/87a2bcb52539/fnbot-06-00003-g009.jpg

Similar articles

1. I Reach Faster When I See You Look: Gaze Effects in Human-Human and Human-Robot Face-to-Face Cooperation.
Front Neurorobot. 2012 May 3;6:3. doi: 10.3389/fnbot.2012.00003. eCollection 2012.
2. See You See Me: the Role of Eye Contact in Multimodal Human-Robot Interaction.
ACM Trans Interact Intell Syst. 2016 May;6(1). doi: 10.1145/2882970.
3. Toward an Attentive Robotic Architecture: Learning-Based Mutual Gaze Estimation in Human-Robot Interaction.
Front Robot AI. 2022 Mar 7;9:770165. doi: 10.3389/frobt.2022.770165. eCollection 2022.
4. Robot Gaze Behavior Affects Honesty in Human-Robot Interaction.
Front Artif Intell. 2021 May 11;4:663190. doi: 10.3389/frai.2021.663190. eCollection 2021.
5. Robot Faces that Follow Gaze Facilitate Attentional Engagement and Increase Their Likeability.
Front Psychol. 2018 Feb 5;9:70. doi: 10.3389/fpsyg.2018.00070. eCollection 2018.
6. Role of Gaze Cues in Interpersonal Motor Coordination: Towards Higher Affiliation in Human-Robot Interaction.
PLoS One. 2016 Jun 9;11(6):e0156874. doi: 10.1371/journal.pone.0156874. eCollection 2016.
7. Human-like object tracking and gaze estimation with PKD android.
Proc SPIE Int Soc Opt Eng. 2016 May;9859. doi: 10.1117/12.2224382.
8. The understanding of congruent and incongruent referential gaze in 17-month-old infants: an eye-tracking study comparing human and robot.
Sci Rep. 2020 Jul 17;10(1):11918. doi: 10.1038/s41598-020-69140-6.
9. Interaction With Social Robots: Improving Gaze Toward Face but Not Necessarily Joint Attention in Children With Autism Spectrum Disorder.
Front Psychol. 2019 Jul 5;10:1503. doi: 10.3389/fpsyg.2019.01503. eCollection 2019.
10. Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions.
Autism. 2024 Jun;28(6):1565-1581. doi: 10.1177/13623613231211967. Epub 2023 Nov 24.

Cited by

1. Gaze detection as a social cue to initiate natural human-robot collaboration in an assembly task.
Front Robot AI. 2024 Jul 17;11:1394379. doi: 10.3389/frobt.2024.1394379. eCollection 2024.
2. The potential of robot eyes as predictive cues in HRI-an eye-tracking study.
Front Robot AI. 2023 Jul 28;10:1178433. doi: 10.3389/frobt.2023.1178433. eCollection 2023.
3. Humans Can't Resist Robot Eyes - Reflexive Cueing With Pseudo-Social Stimuli.
Front Robot AI. 2022 Mar 23;9:848295. doi: 10.3389/frobt.2022.848295. eCollection 2022.
4. Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction.
Front Robot AI. 2022 Jun 15;9:733954. doi: 10.3389/frobt.2022.733954. eCollection 2022.
5. Enhancing the Sense of Attention from an Assistance Mobile Robot by Improving Eye-Gaze Contact from Its Iconic Face Displayed on a Flat Screen.
Sensors (Basel). 2022 Jun 4;22(11):4282. doi: 10.3390/s22114282.
6. Toward an Attentive Robotic Architecture: Learning-Based Mutual Gaze Estimation in Human-Robot Interaction.
Front Robot AI. 2022 Mar 7;9:770165. doi: 10.3389/frobt.2022.770165. eCollection 2022.
7. Gaze Control of a Robotic Head for Realistic Interaction With Humans.
Front Neurorobot. 2020 Jun 17;14:34. doi: 10.3389/fnbot.2020.00034. eCollection 2020.
8. Directing Attention Through Gaze Hints Improves Task Solving in Human-Humanoid Interaction.
Int J Soc Robot. 2018;10(3):343-355. doi: 10.1007/s12369-018-0473-8. Epub 2018 Apr 6.
9. Contribution of Developmental Psychology to the Study of Social Interactions: Some Factors in Play, Joint Attention and Joint Action and Implications for Robotics.
Front Psychol. 2018 Oct 19;9:1992. doi: 10.3389/fpsyg.2018.01992. eCollection 2018.
10. Preferred Interaction Styles for Human-Robot Collaboration Vary Over Tasks With Different Action Types.
Front Neurorobot. 2018 Jul 4;12:36. doi: 10.3389/fnbot.2018.00036. eCollection 2018.

References

1. Coordinating spatial referencing using shared gaze.
Psychon Bull Rev. 2010 Oct;17(5):718-24. doi: 10.3758/PBR.17.5.718.
2. Linking language with embodied and teleological representations of action for humanoid cognition.
Front Neurorobot. 2010 Jun 3;4:8. doi: 10.3389/fnbot.2010.00008. eCollection 2010.
3. Why bodies? Twelve reasons for including bodily expressions in affective neuroscience.
Philos Trans R Soc Lond B Biol Sci. 2009 Dec 12;364(1535):3475-84. doi: 10.1098/rstb.2009.0190.
4. Social cognitive neuroscience and humanoid robotics.
J Physiol Paris. 2009 Sep-Dec;103(3-5):286-95. doi: 10.1016/j.jphysparis.2009.08.011. Epub 2009 Aug 7.
5. Anticipating intentional actions: the effect of eye gaze direction on the judgment of head rotation.
Cognition. 2009 Sep;112(3):423-34. doi: 10.1016/j.cognition.2009.06.011. Epub 2009 Jul 16.
6. Specific and common brain regions involved in the perception of faces and bodies and the representation of their emotional expressions.
Soc Neurosci. 2009;4(2):101-20. doi: 10.1080/17470910701865367.
7. Is there a direct link between gaze perception and joint attention behaviours? Effects of gaze contrast polarity on oculomotor behaviour.
Exp Brain Res. 2009 Apr;194(3):347-57. doi: 10.1007/s00221-009-1706-8. Epub 2009 Jan 30.
8. Effects of head orientation on gaze perception: how positive congruency effects can be reversed.
Q J Exp Psychol (Hove). 2008 Mar;61(3):491-504. doi: 10.1080/17470210701255457.
9. Spatial coding and central patterns: is there something special about the eyes?
Can J Exp Psychol. 2007 Jun;61(2):79-90. doi: 10.1037/cep2007_2_79.
10. Separate coding of different gaze directions in the superior temporal sulcus and inferior parietal lobule.
Curr Biol. 2007 Jan 9;17(1):20-5. doi: 10.1016/j.cub.2006.10.052.