


Eye Gaze Based 3D Triangulation for Robotic Bionic Eyes.

Affiliations

School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China.

AIPARK, Zhangjiakou 075000, China.

Publication information

Sensors (Basel). 2020 Sep 15;20(18):5271. doi: 10.3390/s20185271.

DOI:10.3390/s20185271
PMID:32942655
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7571035/
Abstract

Three-dimensional (3D) triangulation based on active binocular vision has a growing number of applications in computer vision and robotics. An active binocular vision system with non-fixed cameras must calibrate the stereo extrinsic parameters online to perform 3D triangulation. However, the accuracy of the stereo extrinsic parameters and of the disparity has a significant impact on 3D triangulation precision. To reduce this impact, we propose a novel eye-gaze-based 3D triangulation method that does not use the stereo extrinsic parameters directly. Instead, we drive both cameras through visual servoing to gaze at a 3D spatial point, so that the point projects at each camera's optical center. We then obtain the 3D coordinates of the point from the intersection of the optical axes of the two cameras. We performed experiments comparing our method with previous disparity-based work, the integrated two-pose calibration (ITPC) method, using our robotic bionic eyes. The experiments show that our method achieves results comparable with ITPC.
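The geometric core of the abstract — recovering the 3D point from the intersection of the two optical axes — reduces to intersecting two rays in space; since servo error makes the axes slightly skew in practice, a standard approach takes the midpoint of the shortest segment between them. The sketch below illustrates that generic two-ray triangulation (not the paper's exact formulation); the function name `triangulate_gaze` and the inputs (optical centers `c1`, `c2`, gaze directions `d1`, `d2`) are illustrative, not from the paper.

```python
import numpy as np

def triangulate_gaze(c1, d1, c2, d2):
    """Estimate the fixated 3D point as the midpoint of the shortest
    segment between the two optical axes c1 + t*d1 and c2 + s*d2."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    d1 = np.asarray(d1, float); d1 = d1 / np.linalg.norm(d1)
    d2 = np.asarray(d2, float); d2 = d2 / np.linalg.norm(d2)
    r = c2 - c1                  # baseline vector between optical centers
    b = d1 @ d2                  # cosine of the angle between the axes
    denom = 1.0 - b * b
    if denom < 1e-12:            # axes (nearly) parallel: no finite fixation point
        raise ValueError("optical axes are parallel; gaze point is undefined")
    # Closest-point parameters from the normal equations of
    # min_{t,s} |(c1 + t*d1) - (c2 + s*d2)|^2 with unit directions.
    s = (b * (d1 @ r) - (d2 @ r)) / denom
    t = b * s + (d1 @ r)
    p1 = c1 + t * d1             # closest point on axis 1
    p2 = c2 + s * d2             # closest point on axis 2
    return 0.5 * (p1 + p2)       # midpoint = triangulated gaze point
```

When both axes truly converge on one point, the midpoint equals that point; under small servoing errors it remains a reasonable estimate. Note this sketch still presumes known optical centers and gaze directions in a common frame, which is where the paper's visual-servoing and kinematics machinery comes in.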


Figures 1–15:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/a216a5e346e7/sensors-20-05271-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/276dd0871aef/sensors-20-05271-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/52684f2c11da/sensors-20-05271-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/de79dd831e9e/sensors-20-05271-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/392c56015d8c/sensors-20-05271-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/ed2add403778/sensors-20-05271-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/e859e6c36eb2/sensors-20-05271-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/2ce049afb569/sensors-20-05271-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/af0840883423/sensors-20-05271-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/d8aaae6953d2/sensors-20-05271-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/99f781a6452a/sensors-20-05271-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/cac23c4044ef/sensors-20-05271-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/ede10f32f706/sensors-20-05271-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/d3235498a4f1/sensors-20-05271-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e303/7571035/227a58f666c7/sensors-20-05271-g015.jpg

Similar articles

1
Eye Gaze Based 3D Triangulation for Robotic Bionic Eyes.
Sensors (Basel). 2020 Sep 15;20(18):5271. doi: 10.3390/s20185271.
2
An active system for visually-guided reaching in 3D across binocular fixations.
ScientificWorldJournal. 2014 Feb 4;2014:179391. doi: 10.1155/2014/179391. eCollection 2014.
3
Gaze Point Tracking Based on a Robotic Body-Head-Eye Coordination Method.
Sensors (Basel). 2023 Jul 11;23(14):6299. doi: 10.3390/s23146299.
4
3D reconstruction method based on second-order semiglobal stereo matching and fast point positioning Delaunay triangulation.
PLoS One. 2022 Jan 25;17(1):e0260466. doi: 10.1371/journal.pone.0260466. eCollection 2022.
5
On the tridimensional estimation of the gaze point by a stereoscopic wearable eye tracker.
Annu Int Conf IEEE Eng Med Biol Soc. 2015;2015:2283-6. doi: 10.1109/EMBC.2015.7318848.
6
An auxiliary gaze point estimation method based on facial normal.
Pattern Anal Appl. 2016 Aug;19(3):611-620. doi: 10.1007/s10044-014-0407-5. Epub 2014 Aug 21.
7
A Novel Method for Estimating Free Space 3D Point-of-Regard Using Pupillary Reflex and Line-of-Sight Convergence Points.
Sensors (Basel). 2018 Jul 15;18(7):2292. doi: 10.3390/s18072292.
8
Head-free, remote eye-gaze detection system based on pupil-corneal reflection method with easy calibration using two stereo-calibrated video cameras.
IEEE Trans Biomed Eng. 2013 Oct;60(10):2952-60. doi: 10.1109/TBME.2013.2266478. Epub 2013 Jun 6.
9
A real-time gaze position estimation method based on a 3-D eye model.
IEEE Trans Syst Man Cybern B Cybern. 2007 Feb;37(1):199-212. doi: 10.1109/tsmcb.2006.883426.
10
Underwater Target Detection and 3D Reconstruction System Based on Binocular Vision.
Sensors (Basel). 2018 Oct 21;18(10):3570. doi: 10.3390/s18103570.

Cited by

1
The Design and Control of a Biomimetic Binocular Cooperative Perception System Inspired by the Eye Gaze Mechanism.
Biomimetics (Basel). 2024 Jan 24;9(2):69. doi: 10.3390/biomimetics9020069.
2
Gaze Point Tracking Based on a Robotic Body-Head-Eye Coordination Method.
Sensors (Basel). 2023 Jul 11;23(14):6299. doi: 10.3390/s23146299.
3
Stereo Image Matching Using Adaptive Morphological Correlation.

References

1
A Method for Extrinsic Parameter Calibration of Rotating Binocular Stereo Vision Using a Single Feature Point.
Sensors (Basel). 2018 Oct 29;18(11):3666. doi: 10.3390/s18113666.
2
Precise calibration of binocular vision system used for vision measurement.
Opt Express. 2014 Apr 21;22(8):9134-49. doi: 10.1364/OE.22.009134.
3
Muecas: a multi-sensor robotic head for affective human robot interaction and imitation.
Sensors (Basel). 2022 Nov 22;22(23):9050. doi: 10.3390/s22239050.
Sensors (Basel). 2014 Apr 28;14(5):7711-37. doi: 10.3390/s140507711.
4
Continuous stereo self-calibration by camera parameter tracking.
IEEE Trans Image Process. 2009 Jul;18(7):1536-50. doi: 10.1109/TIP.2009.2017824. Epub 2009 Jun 2.
5
A new active visual system for humanoid robots.
IEEE Trans Syst Man Cybern B Cybern. 2008 Apr;38(2):320-30. doi: 10.1109/TSMCB.2007.912082.