
Teaching NICO How to Grasp: An Empirical Study on Crossmodal Social Interaction as a Key Factor for Robots Learning From Humans.

Authors

Kerzel Matthias, Pekarek-Rosin Theresa, Strahl Erik, Heinrich Stefan, Wermter Stefan

Affiliation

Knowledge Technology, Department of Informatics, University of Hamburg, Hamburg, Germany.

Publication

Front Neurorobot. 2020 Jun 9;14:28. doi: 10.3389/fnbot.2020.00028. eCollection 2020.

DOI: 10.3389/fnbot.2020.00028
PMID: 32581759
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7297081/
Abstract

To overcome novel challenges in complex domestic environments, humanoid robots can learn from human teachers. We propose that the capability for social interaction should be a key factor in this teaching process and benefits both the subjective experience of the human user and the learning process itself. To support our hypothesis, we present a Human-Robot Interaction study on human-assisted visuomotor learning with the robot NICO, the Neuro-Inspired COmpanion, a child-sized humanoid. NICO is a flexible, social platform with sensing and manipulation abilities. We give a detailed description of NICO's design and a comprehensive overview of studies that use or evaluate NICO. To engage in social interaction, NICO can express stylized facial expressions and utter speech via an Embodied Dialogue System. NICO is characterized in particular by combining these social interaction capabilities with the abilities for human-like object manipulation and crossmodal perception. In the presented study, NICO acquires visuomotor grasping skills by interacting with its environment. In contrast to methods like motor babbling, the learning process is, in part, supported by a human teacher. To begin the learning process, an object is placed into NICO's hand, and if this object is accidentally dropped, the human assistant has to recover it. The study is conducted with 24 participants with little or no prior experience with robots. In the experimental condition, assistance is actively requested by NICO via the Embodied Dialogue System. In the control condition, instructions are given by a human experimenter, while NICO remains silent. Evaluation using established questionnaires like Godspeed, Mind Perception, and Uncanny Valley Indices, along with a structured interview and video analysis of the interaction, shows that the robot's active requests for assistance foster the participants' engagement and benefit the learning process. This result supports the hypothesis that the ability for social interaction is a key factor for companion robots that learn with the help of non-expert teachers, as these robots become capable of communicating active requests or questions that are vital to their learning process. We also show how the design of NICO both enables and is driven by this approach.


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/9c20d63d6a2c/fnbot-14-00028-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/0dd7af0cca72/fnbot-14-00028-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/a908c177ead0/fnbot-14-00028-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/cc6fc6796a25/fnbot-14-00028-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/75b803dd19be/fnbot-14-00028-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/bc5cc5c4881e/fnbot-14-00028-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/9acdf8121fdb/fnbot-14-00028-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/917351b2ea84/fnbot-14-00028-g0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/1b92f4964db8/fnbot-14-00028-g0009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/598af66a3349/fnbot-14-00028-g0010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c29/7297081/04c77c4a442b/fnbot-14-00028-g0011.jpg

Similar Articles

1. Teaching NICO How to Grasp: An Empirical Study on Crossmodal Social Interaction as a Key Factor for Robots Learning From Humans.
Front Neurorobot. 2020 Jun 9;14:28. doi: 10.3389/fnbot.2020.00028. eCollection 2020.
2. The Effects of Stakeholder Perceptions on the Use of Humanoid Robots in Care for Older Adults: Postinteraction Cross-Sectional Study.
J Med Internet Res. 2023 Aug 4;25:e46617. doi: 10.2196/46617.
3. Human-robot interaction: the impact of robotic aesthetics on anticipated human trust.
PeerJ Comput Sci. 2022 Jan 14;8:e837. doi: 10.7717/peerj-cs.837. eCollection 2022.
4. Affect-Driven Learning of Robot Behaviour for Collaborative Human-Robot Interactions.
Front Robot AI. 2022 Feb 21;9:717193. doi: 10.3389/frobt.2022.717193. eCollection 2022.
5. Can a robot teach me that? Children's ability to imitate robots.
J Exp Child Psychol. 2021 Mar;203:105040. doi: 10.1016/j.jecp.2020.105040. Epub 2020 Dec 7.
6. Determinants of Attitude to a Humanoid Social Robot in Care for Older Adults: A Post-Interaction Study.
Med Sci Monit. 2023 Sep 8;29:e941205. doi: 10.12659/MSM.941205.
7. Why Robots Should Be Social: Enhancing Machine Learning through Social Human-Robot Interaction.
PLoS One. 2015 Sep 30;10(9):e0138061. doi: 10.1371/journal.pone.0138061. eCollection 2015.
8. Age-Related Differences in the Uncanny Valley Effect.
Gerontology. 2020;66(4):382-392. doi: 10.1159/000507812. Epub 2020 Jun 11.
9. Understanding the Uncanny: Both Atypical Features and Category Ambiguity Provoke Aversion toward Humanlike Robots.
Front Psychol. 2017 Aug 30;8:1366. doi: 10.3389/fpsyg.2017.01366. eCollection 2017.
10. Robots with display screens: a robot with a more humanlike face display is perceived to have more mind and a better personality.
PLoS One. 2013 Aug 28;8(8):e72589. doi: 10.1371/journal.pone.0072589. eCollection 2013.

Cited By

1. Characterization of Indicators for Adaptive Human-Swarm Teaming.
Front Robot AI. 2022 Feb 17;9:745958. doi: 10.3389/frobt.2022.745958. eCollection 2022.
2. Learning Then, Learning Now, and Every Second in Between: Lifelong Learning With a Simulated Humanoid Robot.
Front Neurorobot. 2021 Jul 1;15:669534. doi: 10.3389/fnbot.2021.669534. eCollection 2021.
3. An Immersive Investment Game to Study Human-Robot Trust.

References

1. Crossmodal Language Grounding in an Embodied Neurocognitive Model.
Front Neurorobot. 2020 Oct 14;14:52. doi: 10.3389/fnbot.2020.00052. eCollection 2020.
2. Reachy, a 3D-Printed Human-Like Robotic Arm as a Testbed for Human-Robot Control Strategies.
Front Neurorobot. 2019 Aug 14;13:65. doi: 10.3389/fnbot.2019.00065. eCollection 2019.
3. Human-level control through deep reinforcement learning.
Nature. 2015 Feb 26;518(7540):529-33. doi: 10.1038/nature14236.
4. Model learning for robot control: a survey.
Cogn Process. 2011 Nov;12(4):319-40. doi: 10.1007/s10339-011-0404-1. Epub 2011 Apr 13.
5. The iCub humanoid robot: an open-systems platform for research in cognitive development.
Neural Netw. 2010 Oct-Nov;23(8-9):1125-34. doi: 10.1016/j.neunet.2010.08.010. Epub 2010 Sep 22.
6. On seeing human: a three-factor theory of anthropomorphism.
Psychol Rev. 2007 Oct;114(4):864-86. doi: 10.1037/0033-295X.114.4.864.
7. Dimensions of mind perception.
Science. 2007 Feb 2;315(5812):619. doi: 10.1126/science.1134475.
8. Hand synergies during reach-to-grasp.
J Neurophysiol. 2001 Dec;86(6):2896-910. doi: 10.1152/jn.2001.86.6.2896.