Suppr 超能文献


Robust and cooperative image-based visual servoing system using a redundant architecture.

Affiliation

Edificio Quorum V, Miguel Hernandez University, Avda. de la Universidad S/N, 03202 Elche, Spain.

Publication

Sensors (Basel). 2011;11(12):11885-900. doi: 10.3390/s111211885. Epub 2011 Dec 20.

DOI: 10.3390/s111211885
PMID: 22247698
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC3252015/
Abstract

The reliability and robustness of image-based visual servoing systems remain unsolved at present. To address this issue, a redundant and cooperative 2D visual servoing system is proposed, based on the information provided by two cameras in eye-in-hand/eye-to-hand configurations. Its control law is defined to ensure that the whole system is stable if each subsystem is stable, and to avoid typical problems of image-based visual servoing systems such as task singularities, feature extraction errors, disappearance of image features, local minima, etc. Experimental results with an industrial robot manipulator based on Schunk modular motors are presented to demonstrate the stability, performance and robustness of the proposed system.
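The abstract describes combining two image-based visual servoing (IBVS) subsystems. As a rough illustration of the underlying idea, the sketch below implements the classic IBVS law v = -λ L⁺ (s - s*) for one camera and a simple weighted blend of two subsystem commands. This is a minimal generic sketch, not the paper's actual control law: the function names, the point-feature interaction matrix, and the weighted combination are assumptions for illustration only (the paper defines its own cooperative law with stability guarantees).

```python
import numpy as np

def interaction_matrix(x, y, Z):
    # Standard interaction (image Jacobian) matrix of one normalized
    # image point (x, y) at depth Z, mapping camera velocity (6-vector)
    # to the point's image-plane velocity (2-vector).
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x ** 2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y ** 2, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    # Classic IBVS control law for one camera: v = -gain * L^+ * (s - s*).
    # features, desired: (N, 2) arrays of current/desired point features.
    e = (np.asarray(features) - np.asarray(desired)).reshape(-1)
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    return -gain * np.linalg.pinv(L) @ e

def cooperative_velocity(v_eye_in_hand, v_eye_to_hand, w=0.5):
    # Hypothetical cooperation step: blend the commands computed from
    # the eye-in-hand and eye-to-hand subsystems with a fixed weight.
    return w * v_eye_in_hand + (1.0 - w) * v_eye_to_hand
```

When the current features already equal the desired ones, the error is zero and the commanded camera velocity is the zero vector, so the scheme is at rest at the goal.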


Figures (PMC, f1–f14):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/96607364f0b7/sensors-11-11885f1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/f55ad322e4a2/sensors-11-11885f2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/b1069750bd97/sensors-11-11885f3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/f8dead038699/sensors-11-11885f4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/62612bad4e4a/sensors-11-11885f5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/fd0057f23959/sensors-11-11885f6.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/2fcbda292858/sensors-11-11885f7.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/cfbe76db81de/sensors-11-11885f8.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/060ff2c463b3/sensors-11-11885f9.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/1435b5f93c78/sensors-11-11885f10.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/d78169a36c48/sensors-11-11885f11.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/04991e371cb5/sensors-11-11885f12.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/9300f538702e/sensors-11-11885f13.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/53e7/3252015/e17e76e7798d/sensors-11-11885f14.jpg

Similar Articles

1. Robust and cooperative image-based visual servoing system using a redundant architecture.
   Sensors (Basel). 2011;11(12):11885-900. doi: 10.3390/s111211885. Epub 2011 Dec 20.
2. Robust uncalibrated visual servoing control based on disturbance observer.
   ISA Trans. 2015 Nov;59:193-204. doi: 10.1016/j.isatra.2015.07.003. Epub 2015 Aug 29.
3. A data-driven acceleration-level scheme for image-based visual servoing of manipulators with unknown structure.
   Front Neurorobot. 2024 Mar 20;18:1380430. doi: 10.3389/fnbot.2024.1380430. eCollection 2024.
4. Robot manipulator visual servoing based on image moments and improved firefly optimization algorithm-based extreme learning machine.
   ISA Trans. 2023 Dec;143:188-204. doi: 10.1016/j.isatra.2023.10.010. Epub 2023 Oct 18.
5. Design of a Gough-Stewart Platform Based on Visual Servoing Controller.
   Sensors (Basel). 2022 Mar 25;22(7):2523. doi: 10.3390/s22072523.
6. Hybrid Visual-Ranging Servoing for Positioning Based on Image and Measurement Features.
   IEEE Trans Cybern. 2023 Jul;53(7):4270-4279. doi: 10.1109/TCYB.2022.3160758. Epub 2023 Jun 15.
7. Predictive Control-Based Completeness Analysis and Global Calibration of Robot Vision Features.
   Comput Intell Neurosci. 2021 Dec 9;2021:7241659. doi: 10.1155/2021/7241659. eCollection 2021.
8. CLFs-based optimization control for a class of constrained visual servoing systems.
   ISA Trans. 2017 Mar;67:507-514. doi: 10.1016/j.isatra.2016.11.018. Epub 2016 Dec 7.
9. Model predictive control for constrained robot manipulator visual servoing tuned by reinforcement learning.
   Math Biosci Eng. 2023 Apr 10;20(6):10495-10513. doi: 10.3934/mbe.2023463.
10. Robust Hybrid Visual Servoing of Omnidirectional Mobile Manipulator With Kinematic Uncertainties Using a Single Camera.
    IEEE Trans Cybern. 2024 May;54(5):2824-2837. doi: 10.1109/TCYB.2023.3238820. Epub 2024 Apr 16.

Cited By

1. Perception-Action Coupling Target Tracking Control for a Snake Robot via Reinforcement Learning.
   Front Neurorobot. 2020 Oct 20;14:591128. doi: 10.3389/fnbot.2020.591128. eCollection 2020.
2. Adaptive Kinematic Control of a Robotic Venipuncture Device Based on Stereo Vision, Ultrasound, and Force Guidance.
   IEEE Trans Ind Electron. 2017 Feb;64(2):1626-1635. doi: 10.1109/TIE.2016.2557306. Epub 2016 Apr 21.
3. Robust Kalman filtering cooperated Elman neural network learning for vision-sensing-based robotic manipulation with global stability.
   Sensors (Basel). 2013 Oct 8;13(10):13464-86. doi: 10.3390/s131013464.