

Implementation of Engagement Detection for Human-Robot Interaction in Complex Environments.

Affiliations

Mechanical Engineering Department, National Taiwan University, Taipei 10617, Taiwan.

Publication

Sensors (Basel). 2024 May 22;24(11):3311. doi: 10.3390/s24113311.

DOI: 10.3390/s24113311
PMID: 38894102
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11174507/
Abstract

This study develops a comprehensive robotic system, termed the robot cognitive system, for complex environments, integrating three models: the engagement model, the intention model, and the human-robot interaction (HRI) model. The system aims to enhance the naturalness and comfort of HRI by enabling robots to detect human behaviors, intentions, and emotions accurately. A novel dual-arm-hand mobile robot, Mobi, was designed to demonstrate the system's efficacy. The engagement model utilizes eye gaze, head pose, and action recognition to determine the suitable moment for interaction initiation, addressing potential eye contact anxiety. The intention model employs sentiment analysis and emotion classification to infer the interactor's intentions. The HRI model, integrated with Google Dialogflow, facilitates appropriate robot responses based on user feedback. The system's performance was validated in a retail environment scenario, demonstrating its potential to improve the user experience in HRIs.
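As an illustrative sketch only (not the paper's published algorithm), the engagement model's idea of fusing eye gaze, head pose, and action recognition to pick a suitable moment to initiate interaction can be pictured as a weighted score gated by a threshold. All names, weights, and the threshold below are hypothetical:

```python
from dataclasses import dataclass


@dataclass
class Cues:
    """Hypothetical per-person cues a perception stack might report."""
    gaze_on_robot: float  # 0..1, fraction of recent frames with gaze toward the robot
    head_facing: float    # 0..1, alignment of head pose with the robot
    approaching: bool     # action recognition: person is walking toward the robot


def engagement_score(c: Cues,
                     w_gaze: float = 0.5,
                     w_head: float = 0.3,
                     w_action: float = 0.2) -> float:
    """Weighted fusion of the three cues into a score in [0, 1]."""
    return (w_gaze * c.gaze_on_robot
            + w_head * c.head_facing
            + w_action * float(c.approaching))


def should_initiate(c: Cues, threshold: float = 0.6) -> bool:
    """Initiate interaction only once fused engagement clears the threshold,
    so the robot does not force eye contact on a disengaged passer-by."""
    return engagement_score(c) >= threshold


print(should_initiate(Cues(0.9, 0.8, True)))   # attentive, approaching person
print(should_initiate(Cues(0.1, 0.2, False)))  # disengaged passer-by
```

A thresholded fusion like this is one simple way to encode "wait for sufficient engagement before speaking"; the actual system described in the paper may use learned models rather than fixed weights.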


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/4d54ca28344c/sensors-24-03311-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/663a980228f2/sensors-24-03311-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/52f5e5a96b95/sensors-24-03311-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/1d3ffe3d15cb/sensors-24-03311-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/0c4b99c1df28/sensors-24-03311-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/d972a22674a8/sensors-24-03311-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/4fb68849a670/sensors-24-03311-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/d161273ebd98/sensors-24-03311-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/a2a7a7427971/sensors-24-03311-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/f9b957401a73/sensors-24-03311-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/ad761d2c201f/sensors-24-03311-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/3f15b790109e/sensors-24-03311-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/9bb26ed2684b/sensors-24-03311-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/166b0b88b908/sensors-24-03311-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/4d54ca28344c/sensors-24-03311-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/663a980228f2/sensors-24-03311-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/52f5e5a96b95/sensors-24-03311-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/1d3ffe3d15cb/sensors-24-03311-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/0c4b99c1df28/sensors-24-03311-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/d972a22674a8/sensors-24-03311-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/4fb68849a670/sensors-24-03311-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/d161273ebd98/sensors-24-03311-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/a2a7a7427971/sensors-24-03311-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/f9b957401a73/sensors-24-03311-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/ad761d2c201f/sensors-24-03311-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/3f15b790109e/sensors-24-03311-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/9bb26ed2684b/sensors-24-03311-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/166b0b88b908/sensors-24-03311-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/343e/11174507/4d54ca28344c/sensors-24-03311-g014.jpg

Similar Articles

1. Implementation of Engagement Detection for Human-Robot Interaction in Complex Environments. Sensors (Basel). 2024 May 22;24(11):3311. doi: 10.3390/s24113311.
2. Group Emotion Detection Based on Social Robot Perception. Sensors (Basel). 2022 May 14;22(10):3749. doi: 10.3390/s22103749.
3. HRI usability evaluation of interaction modes for a teleoperated agricultural robotic sprayer. Appl Ergon. 2017 Jul;62:237-246. doi: 10.1016/j.apergo.2017.03.008. Epub 2017 Mar 22.
4. Adaptive training algorithm for robot-assisted upper-arm rehabilitation, applicable to individualised and therapeutic human-robot interaction. J Neuroeng Rehabil. 2013 Sep 28;10:102. doi: 10.1186/1743-0003-10-102.
5. Integration of Tracking, Re-Identification, and Gesture Recognition for Facilitating Human-Robot Interaction. Sensors (Basel). 2024 Jul 25;24(15):4850. doi: 10.3390/s24154850.
6. A Multimodal Emotional Human-Robot Interaction Architecture for Social Robots Engaged in Bidirectional Communication. IEEE Trans Cybern. 2021 Dec;51(12):5954-5968. doi: 10.1109/TCYB.2020.2974688. Epub 2021 Dec 22.
7. Design of Service Robot Based on User Emotion Recognition and Environmental Monitoring. J Environ Public Health. 2022 Oct 4;2022:3517995. doi: 10.1155/2022/3517995. eCollection 2022.
8. Using "human state aware" robots to enhance physical human-robot interaction in a cooperative scenario. Comput Methods Programs Biomed. 2013 Nov;112(2):250-9. doi: 10.1016/j.cmpb.2013.02.003. Epub 2013 Mar 20.
9. Promoting Interactions Between Humans and Robots Using Robotic Emotional Behavior. IEEE Trans Cybern. 2016 Dec;46(12):2911-2923. doi: 10.1109/TCYB.2015.2492999. Epub 2015 Nov 2.
10. Intelligent lead: a novel HRI sensor for guide robots. Sensors (Basel). 2012;12(6):8301-18. doi: 10.3390/s120608301. Epub 2012 Jun 14.

Cited By

1. LLM-based robot personality simulation and cognitive system. Sci Rep. 2025 May 16;15(1):16993. doi: 10.1038/s41598-025-01528-8.

References

1. Updating design guidelines for cognitive ergonomics in human-centred collaborative robotics applications: An expert survey. Appl Ergon. 2024 May;117:104246. doi: 10.1016/j.apergo.2024.104246. Epub 2024 Feb 13.
2. Understanding human intention by connecting perception and action learning in artificial agents. Neural Netw. 2017 Aug;92:29-38. doi: 10.1016/j.neunet.2017.01.009. Epub 2017 Feb 11.