Visual control of robots using range images.

Affiliation

Physics, Systems Engineering and Signal Theory Department, University of Alicante, PO Box 99, Alicante 03080, Spain.

Publication Information

Sensors (Basel). 2010;10(8):7303-22. doi: 10.3390/s100807303. Epub 2010 Aug 4.

DOI: 10.3390/s100807303
PMID: 22163604
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC3231188/
Abstract

In recent years, 3D vision systems based on the time-of-flight (ToF) principle have gained importance as a means of obtaining 3D information about the workspace. This paper analyzes the use of 3D ToF cameras to guide a robot arm. To do so, an adaptive method for simultaneous visual servo control and camera calibration is presented. Using this method, a robot arm is guided by range information obtained from a ToF camera. Furthermore, the self-calibration method determines the appropriate integration time for the range camera so that depth information can be measured precisely.
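The abstract describes the approach only at a high level. As a rough illustration of how range data typically enters such a control loop, the sketch below implements the textbook image-based visual servoing law v = -λ L⁺ (s − s*), with the point depths Z in the interaction matrix read directly from the range image, plus a crude amplitude-driven integration-time adjustment. This is a minimal sketch of the standard technique, not the paper's adaptive method; all function names, the gain, and the amplitude bounds are invented for illustration.

```python
import numpy as np

def interaction_matrix(points, depths):
    """Stack the classic 2x6 image-Jacobian rows for each normalized
    image point (x, y), using the depth Z measured by the range camera."""
    rows = []
    for (x, y), Z in zip(points, depths):
        rows.append([-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y])
        rows.append([0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x])
    return np.array(rows)

def ibvs_twist(features, desired, depths, gain=0.5):
    """Image-based visual servoing law v = -lambda * L^+ * (s - s*);
    returns the 6-DoF camera twist (vx, vy, vz, wx, wy, wz)."""
    error = (np.asarray(features) - np.asarray(desired)).reshape(-1)
    L = interaction_matrix(features, depths)
    return -gain * np.linalg.pinv(L) @ error

def adapt_integration_time(t_int, amplitude, lo=200.0, hi=2000.0, step=1.1):
    """Hypothetical self-calibration step: nudge the ToF integration time
    so the returned signal amplitude stays in a usable band (the bounds
    here are illustrative, not the paper's values)."""
    if amplitude < lo:
        return t_int * step   # signal too weak: integrate longer
    if amplitude > hi:
        return t_int / step   # near saturation: integrate less
    return t_int

# Example: four tracked points, depths taken straight from the range image.
s   = [(0.10, 0.05), (-0.08, 0.04), (0.09, -0.06), (-0.07, -0.05)]
s_d = [(0.15, 0.15), (-0.15, 0.15), (0.15, -0.15), (-0.15, -0.15)]
Z   = [0.9, 1.1, 1.0, 1.2]
print(ibvs_twist(s, s_d, Z))
```

Supplying measured Z is exactly where a ToF camera pays off in this kind of scheme: in monocular image-based servoing, the depths in the interaction matrix must otherwise be estimated online or approximated by their desired values.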

[Figures 1-18 of the article are available via PMC: https://pmc.ncbi.nlm.nih.gov/articles/PMC3231188/]

Similar Articles

1. Visual control of robots using range images.
Sensors (Basel). 2010;10(8):7303-22. doi: 10.3390/s100807303. Epub 2010 Aug 4.
2. An intelligent space for mobile robot localization using a multi-camera system.
Sensors (Basel). 2014 Aug 15;14(8):15039-64. doi: 10.3390/s140815039.
3. Obstacle classification and 3D measurement in unstructured environments based on ToF cameras.
Sensors (Basel). 2014 Jun 18;14(6):10753-82. doi: 10.3390/s140610753.
4. Development of a stereo vision measurement system for a 3D three-axial pneumatic parallel mechanism robot arm.
Sensors (Basel). 2011;11(2):2257-81. doi: 10.3390/s110202257. Epub 2011 Feb 21.
5. HOPIS: hybrid omnidirectional and perspective imaging system for mobile robots.
Sensors (Basel). 2014 Sep 4;14(9):16508-31. doi: 10.3390/s140916508.
6. Real-time tissue tracking with B-mode ultrasound using speckle and visual servoing.
Med Image Comput Comput Assist Interv. 2007;10(Pt 2):1-8. doi: 10.1007/978-3-540-75759-7_1.
7. A single-camera method for three-dimensional video imaging.
J Neurosci Methods. 2002 Oct 15;120(1):65-83. doi: 10.1016/s0165-0270(02)00191-7.
8. Homography-based visual servo regulation of mobile robots.
IEEE Trans Syst Man Cybern B Cybern. 2005 Oct;35(5):1041-50. doi: 10.1109/tsmcb.2005.850155.
9. Dense range map reconstruction from a versatile robotic sensor system with an active trinocular vision and a passive binocular vision.
Appl Opt. 2008 Apr 10;47(11):1927-39. doi: 10.1364/ao.47.001927.
10. Time-of-flight-assisted Kinect camera-based people detection for intuitive human robot cooperation in the surgical operating room.
Int J Comput Assist Radiol Surg. 2016 Jul;11(7):1329-45. doi: 10.1007/s11548-015-1318-7. Epub 2015 Nov 14.

Cited By

1. Robust Kalman filtering cooperated Elman neural network learning for vision-sensing-based robotic manipulation with global stability.
Sensors (Basel). 2013 Oct 8;13(10):13464-86. doi: 10.3390/s131013464.

References

1. Real-time markerless tracking for augmented reality: the virtual visual servoing framework.
IEEE Trans Vis Comput Graph. 2006 Jul-Aug;12(4):615-28. doi: 10.1109/TVCG.2006.78.