
A Multimodal Emotional Human-Robot Interaction Architecture for Social Robots Engaged in Bidirectional Communication.

Publication Info

IEEE Trans Cybern. 2021 Dec;51(12):5954-5968. doi: 10.1109/TCYB.2020.2974688. Epub 2021 Dec 22.

DOI: 10.1109/TCYB.2020.2974688
PMID: 32149676
Abstract

For social robots to effectively engage in human-robot interaction (HRI), they need to be able to interpret human affective cues and to respond appropriately via display of their own emotional behavior. In this article, we present a novel multimodal emotional HRI architecture to promote natural and engaging bidirectional emotional communications between a social robot and a human user. User affect is detected using a unique combination of body language and vocal intonation, and multimodal classification is performed using a Bayesian Network. The Emotionally Expressive Robot utilizes the user's affect to determine its own emotional behavior via an innovative two-layer emotional model consisting of deliberative (hidden Markov model) and reactive (rule-based) layers. The proposed architecture has been implemented via a small humanoid robot to perform diet and fitness counseling during HRI. In order to evaluate the Emotionally Expressive Robot's effectiveness, a Neutral Robot that can detect user affect but lacks an emotional display was also developed. A between-subjects HRI experiment was conducted with both types of robots. Extensive results have shown that both robots can effectively detect user affect during real-time HRI. However, the Emotionally Expressive Robot can appropriately determine its own emotional response based on the situation at hand and, therefore, induce more positive user valence and less negative arousal than the Neutral Robot.
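As an illustrative sketch only (not the paper's trained models or parameters), the pipeline the abstract describes can be outlined as two stages: naive-Bayesian fusion of body-language and vocal-intonation cues into a user-affect distribution, then a two-layer response with a deliberative (Markov-style transition) layer and a reactive (rule-based override) layer. All cue labels, probabilities, and rules below are invented placeholders.

```python
# Hedged sketch of the architecture in the abstract: Bayesian fusion of two
# affect modalities, then a deliberative + reactive choice of robot emotion.
# All values here are illustrative assumptions, not the paper's parameters.

AFFECTS = ["positive", "neutral", "negative"]

# Assumed likelihoods P(cue | affect) for each modality.
P_BODY = {
    "open":   {"positive": 0.7, "neutral": 0.2, "negative": 0.1},
    "closed": {"positive": 0.1, "neutral": 0.3, "negative": 0.6},
}
P_VOICE = {
    "warm": {"positive": 0.6, "neutral": 0.3, "negative": 0.1},
    "flat": {"positive": 0.2, "neutral": 0.5, "negative": 0.3},
}

def fuse_affect(body_cue, voice_cue, prior=None):
    """Naive-Bayes fusion: P(affect | body, voice) ∝ P(body|a) P(voice|a) P(a)."""
    prior = prior or {a: 1.0 / len(AFFECTS) for a in AFFECTS}
    post = {a: P_BODY[body_cue][a] * P_VOICE[voice_cue][a] * prior[a]
            for a in AFFECTS}
    z = sum(post.values())
    return {a: p / z for a, p in post.items()}

# Deliberative layer: Markov-style transitions over the robot's own emotions,
# conditioned on the most likely user affect (a stand-in for the paper's HMM).
TRANSITION = {
    ("happy", "positive"): "happy",
    ("happy", "neutral"): "happy",
    ("happy", "negative"): "concerned",
    ("concerned", "positive"): "happy",
    ("concerned", "neutral"): "happy",
    ("concerned", "negative"): "concerned",
}

def robot_emotion(prev_emotion, user_affect_dist):
    """Two-layer response: reactive rule first, else deliberative transition."""
    # Reactive layer: rule-based override when the user is strongly negative.
    if user_affect_dist["negative"] > 0.8:
        return "soothing"
    # Deliberative layer: transition on the most likely user affect.
    user_affect = max(user_affect_dist, key=user_affect_dist.get)
    return TRANSITION[(prev_emotion, user_affect)]

dist = fuse_affect("open", "warm")
print(max(dist, key=dist.get))         # most likely user affect → "positive"
print(robot_emotion("concerned", dist))  # deliberative transition → "happy"
```

The split mirrors the abstract's design choice: fast rule-based reactions handle urgent affective states, while the slower probabilistic layer shapes the overall emotional trajectory of the interaction.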

Similar Articles

1. A Multimodal Emotional Human-Robot Interaction Architecture for Social Robots Engaged in Bidirectional Communication.
   IEEE Trans Cybern. 2021 Dec;51(12):5954-5968. doi: 10.1109/TCYB.2020.2974688. Epub 2021 Dec 22.
2. The Role of Coherent Robot Behavior and Embodiment in Emotion Perception and Recognition During Human-Robot Interaction: Experimental Study.
   JMIR Hum Factors. 2024 Jan 26;11:e45494. doi: 10.2196/45494.
3. Investigating Strategies for Robot Persuasion in Social Human-Robot Interaction.
   IEEE Trans Cybern. 2022 Jan;52(1):641-653. doi: 10.1109/TCYB.2020.2987463. Epub 2022 Jan 11.
4. Real-time emotion generation in human-robot dialogue using large language models.
   Front Robot AI. 2023 Dec 1;10:1271610. doi: 10.3389/frobt.2023.1271610. eCollection 2023.
5. Empathy in Human-Robot Interaction: Designing for Social Robots.
   Int J Environ Res Public Health. 2022 Feb 8;19(3):1889. doi: 10.3390/ijerph19031889.
6. Classifying a Person's Degree of Accessibility From Natural Body Language During Social Human-Robot Interactions.
   IEEE Trans Cybern. 2017 Feb;47(2):524-538. doi: 10.1109/TCYB.2016.2520367. Epub 2016 Feb 12.
7. Toward understanding social cues and signals in human-robot interaction: effects of robot gaze and proxemic behavior.
   Front Psychol. 2013 Nov 27;4:859. doi: 10.3389/fpsyg.2013.00859. eCollection 2013.
8. Emotion attribution to a non-humanoid robot in different social situations.
   PLoS One. 2014 Dec 31;9(12):e114207. doi: 10.1371/journal.pone.0114207. eCollection 2014.
9. Classifying human emotions in HRI: applying global optimization model to EEG brain signals.
   Front Neurorobot. 2023 Oct 10;17:1191127. doi: 10.3389/fnbot.2023.1191127. eCollection 2023.
10. Modelling Multimodal Dialogues for Social Robots Using Communicative Acts.
    Sensors (Basel). 2020 Jun 18;20(12):3440. doi: 10.3390/s20123440.

Cited By

1. Evaluating the effects of active social touch and robot expressiveness on user attitudes and behaviour in human-robot interaction.
   Sci Rep. 2025 May 27;15(1):18483. doi: 10.1038/s41598-025-01490-5.
2. Making social robots adaptable and to some extent educable by a marketplace for the selection and adjustment of different interaction characters living inside a single robot.
   Front Robot AI. 2025 Apr 8;12:1534346. doi: 10.3389/frobt.2025.1534346. eCollection 2025.
3. Multimodal fusion-powered English speaking robot.
   Front Neurorobot. 2024 Nov 15;18:1478181. doi: 10.3389/fnbot.2024.1478181. eCollection 2024.
4. A Bio-Inspired Dopamine Model for Robots with Autonomous Decision-Making.
   Biomimetics (Basel). 2024 Aug 21;9(8):504. doi: 10.3390/biomimetics9080504.
5. Creating Expressive Social Robots That Convey Symbolic and Spontaneous Communication.
   Sensors (Basel). 2024 Jun 5;24(11):3671. doi: 10.3390/s24113671.
6. Recommendations for designing conversational companion robots with older adults through foundation models.
   Front Robot AI. 2024 May 27;11:1363713. doi: 10.3389/frobt.2024.1363713. eCollection 2024.
7. Interactive method research of dual mode information coordination integration for astronaut gesture and eye movement signals based on hybrid model.
   Sci China Technol Sci. 2023;66(6):1717-1733. doi: 10.1007/s11431-022-2368-y. Epub 2023 May 9.
8. An analysis of design recommendations for socially assistive robot helpers for effective human-robot interactions in senior care.
   J Rehabil Assist Technol Eng. 2022 Jun 18;9:20556683221101389. doi: 10.1177/20556683221101389. eCollection 2022 Jan-Dec.
9. Group Emotion Detection Based on Social Robot Perception.
   Sensors (Basel). 2022 May 14;22(10):3749. doi: 10.3390/s22103749.
10. Correlated expression of the body, face, and voice during character portrayal in actors.
    Sci Rep. 2022 May 18;12(1):8253. doi: 10.1038/s41598-022-12184-7.