
Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface.

Affiliation

Group of Sensors and Actuators, Department of Electrical Engineering and Applied Sciences, Westphalian University of Applied Sciences, 45877 Gelsenkirchen, Germany.

Publication

Sensors (Basel). 2021 Mar 5;21(5):1798. doi: 10.3390/s21051798.

DOI: 10.3390/s21051798
PMID: 33807599
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7962065/
Abstract

This paper presents a lightweight, infrastructureless head-worn interface for robust and real-time robot control in Cartesian space using head- and eye-gaze. The interface comes at a total weight of just 162 g. It combines a state-of-the-art visual simultaneous localization and mapping algorithm (ORB-SLAM 2) for RGB-D cameras with a Magnetic Angular Rate Gravity (MARG)-sensor filter. The data fusion process is designed to dynamically switch between magnetic, inertial and visual heading sources to enable robust orientation estimation under various disturbances, e.g., magnetic disturbances or degraded visual sensor data. The interface furthermore delivers accurate eye- and head-gaze vectors to enable precise robot end effector (EFF) positioning and employs a head motion mapping technique to effectively control the robot's end effector orientation. An experimental proof of concept demonstrates that the proposed interface and its data fusion process generate reliable and robust pose estimation. The three-dimensional head- and eye-gaze position estimation pipeline delivers a mean Euclidean error of 19.0±15.7 mm for head-gaze and 27.4±21.8 mm for eye-gaze at a distance of 0.3-1.1 m to the user. This indicates that the proposed interface offers a precise control mechanism for hands-free and full six degree of freedom (DoF) robot teleoperation in Cartesian space by head- or eye-gaze and head motion.
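The abstract describes the fusion filter dynamically switching between magnetic, inertial and visual heading sources depending on which are currently disturbed. The fallback policy might be sketched as below; this is an illustrative reconstruction, not the paper's implementation, and the function name, disturbance flags and priority order are assumptions.

```python
# Hypothetical sketch of dynamic heading-source selection: prefer the
# magnetometer heading, fall back to the visual (ORB-SLAM 2) heading
# when the magnetic field is disturbed, and fall back to the
# gyro-integrated (inertial) heading when vision is also degraded.
# All names and the priority order are illustrative assumptions.

def select_heading(mag_heading, vis_heading, gyro_heading,
                   mag_disturbed, vis_degraded):
    """Return (source_name, heading_rad) for the current sensor state."""
    if not mag_disturbed:
        return "magnetic", mag_heading
    if not vis_degraded:
        return "visual", vis_heading
    # Dead-reckoned gyro heading drifts, but stays usable short-term.
    return "inertial", gyro_heading
```

A selector like this keeps an absolute (drift-free) reference whenever one is available and only dead-reckons on the gyro as a last resort.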

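The reported accuracy figures are mean ± standard deviation of the Euclidean error between estimated and ground-truth 3-D gaze positions (e.g. 19.0±15.7 mm for head-gaze). A minimal sketch of that metric, using hypothetical point pairs rather than the paper's evaluation data:

```python
import math

# Illustrative mean +/- std Euclidean error over 3-D point pairs.
# The input data here is hypothetical; only the metric matches the
# abstract's reporting convention.

def euclidean_errors(estimates, ground_truth):
    """Per-sample Euclidean distances between paired 3-D points."""
    return [math.dist(e, g) for e, g in zip(estimates, ground_truth)]

def mean_std(errors):
    """Population mean and standard deviation of the error list."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / n
    return mean, math.sqrt(var)
```

For example, estimates `[(0, 0, 0), (3, 4, 0)]` against ground truth at the origin give errors `[0.0, 5.0]`, i.e. 2.5±2.5 in the same units as the input.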

Figures (PMC7962065):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/506b132f7153/sensors-21-01798-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/51b4199c6159/sensors-21-01798-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/818ab11f1ce7/sensors-21-01798-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/f712d590831b/sensors-21-01798-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/593dd342868a/sensors-21-01798-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/a558d2976262/sensors-21-01798-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/f7843f6cb123/sensors-21-01798-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/e9f46ef90fb1/sensors-21-01798-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/3b005a19f662/sensors-21-01798-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/f4f25bec910b/sensors-21-01798-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/6a13e8f50af5/sensors-21-01798-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/d5ac1da6956d/sensors-21-01798-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/75caad91ce22/sensors-21-01798-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/262770d79f67/sensors-21-01798-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85cc/7962065/bc1b65f240d7/sensors-21-01798-g015.jpg

Similar Articles

1. Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface. Sensors (Basel). 2021 Mar 5;21(5):1798. doi: 10.3390/s21051798.
2. Performance Analysis of a Head and Eye Motion-Based Control Interface for Assistive Robots. Sensors (Basel). 2020 Dec 14;20(24):7162. doi: 10.3390/s20247162.
3. SteadEye-Head-Improving MARG-Sensor Based Head Orientation Measurements Through Eye Tracking Data. Sensors (Basel). 2020 May 12;20(10):2759. doi: 10.3390/s20102759.
4. Gaze Point Tracking Based on a Robotic Body-Head-Eye Coordination Method. Sensors (Basel). 2023 Jul 11;23(14):6299. doi: 10.3390/s23146299.
5. Continuous Driver's Gaze Zone Estimation Using RGB-D Camera. Sensors (Basel). 2019 Mar 14;19(6):1287. doi: 10.3390/s19061287.
6. High-Accuracy 3D Gaze Estimation with Efficient Recalibration for Head-Mounted Gaze Tracking Systems. Sensors (Basel). 2022 Jun 8;22(12):4357. doi: 10.3390/s22124357.
7. A novel method for measuring gaze orientation in space in unrestrained head conditions. J Vis. 2013 Jul 31;13(8):28. doi: 10.1167/13.8.28.
8. Gaze strategies during linear motion in head-free humans. J Neurophysiol. 1994 Nov;72(5):2451-66. doi: 10.1152/jn.1994.72.5.2451.
9. Vestibuloocular reflex inhibition and gaze saccade control characteristics during eye-head orientation in humans. J Neurophysiol. 1988 Mar;59(3):997-1013. doi: 10.1152/jn.1988.59.3.997.
10. Head motion-corrected eye gaze tracking with the da Vinci surgical system. Int J Comput Assist Radiol Surg. 2024 Jul;19(7):1459-1467. doi: 10.1007/s11548-024-03173-4. Epub 2024 Jun 18.

Cited By

1. GMM-HMM-Based Eye Movement Classification for Efficient and Intuitive Dynamic Human-Computer Interaction Systems. J Eye Mov Res. 2025 Jul 9;18(4):28. doi: 10.3390/jemr18040028. eCollection 2025 Aug.
2. Collaborative Robot Control Based on Human Gaze Tracking. Sensors (Basel). 2025 May 14;25(10):3103. doi: 10.3390/s25103103.
3. Optimizing DG Handling: Designing an Immersive MRsafe Training Program.
