


An intelligent space for mobile robot localization using a multi-camera system.

Authors

Rampinelli Mariana, Covre Vitor Buback, de Queiroz Felippe Mendonça, Vassallo Raquel Frizera, Bastos-Filho Teodiano Freire, Mazo Manuel

Affiliations

Department of Electro-Mechanics, Federal Institute of Education, Science and Technology of Espirito Santo (IFES), Estrada da Tartaruga, s/n, 29215-090, Guarapari, Espirito Santo, Brazil.

Department of Electrical Engineering, Federal University of Espirito Santo (UFES), Av. Fernando Ferrari, s/n, 29075-910, Vitoria, Espirito Santo, Brazil.

Publication

Sensors (Basel). 2014 Aug 15;14(8):15039-64. doi: 10.3390/s140815039.

DOI: 10.3390/s140815039
PMID: 25196009
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC4179071/
Abstract

This paper describes an intelligent space, whose objective is to localize and control robots or robotic wheelchairs to help people. Such an intelligent space has 11 cameras distributed in two laboratories and a corridor. The cameras are fixed in the environment, and image capturing is done synchronously. The system was programmed as a client/server with TCP/IP connections, and a communication protocol was defined. The client coordinates the activities inside the intelligent space, and the servers provide the information needed for that. Once the cameras are used for localization, they have to be properly calibrated. Therefore, a calibration method for a multi-camera network is also proposed in this paper. A robot is used to move a calibration pattern throughout the field of view of the cameras. Then, the captured images and the robot odometry are used for calibration. As a result, the proposed algorithm provides a solution for multi-camera calibration and robot localization at the same time. The intelligent space and the calibration method were evaluated under different scenarios using computer simulations and real experiments. The results demonstrate the proper functioning of the intelligent space and validate the multi-camera calibration method, which also improves robot localization.

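The calibration step described in the abstract — driving a robot-mounted pattern through each camera's field of view and combining the captured images with the robot's odometry — amounts to solving for each camera's pose from point correspondences. A minimal 2D sketch of that idea (a hypothetical simplification; the paper works with full 3D camera models, and `calibrate_camera_2d` and the synthetic data below are illustrative, not the authors' implementation) is a least-squares rigid alignment between the pattern positions reported by odometry (world frame) and the same positions as seen by one camera (camera frame):

```python
import math

def calibrate_camera_2d(cam_pts, world_pts):
    """Least-squares 2D rigid alignment (Procrustes): find theta, t such
    that R(theta) @ p_cam + t best matches p_world over all point pairs."""
    n = len(cam_pts)
    # Centroids of both point sets.
    cx = sum(x for x, _ in cam_pts) / n
    cy = sum(y for _, y in cam_pts) / n
    wx = sum(x for x, _ in world_pts) / n
    wy = sum(y for _, y in world_pts) / n
    # Accumulate cross-covariance terms of the centered points.
    s_cos = s_sin = 0.0
    for (ax, ay), (bx, by) in zip(cam_pts, world_pts):
        ax, ay, bx, by = ax - cx, ay - cy, bx - wx, by - wy
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)  # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    # Translation maps the rotated camera-frame centroid onto the world one.
    t = (wx - (c * cx - s * cy), wy - (s * cx + c * cy))
    return theta, t

# Synthetic run: "odometry" gives pattern positions in the world frame;
# the camera observes the same pattern in its own frame.
theta_true, t_true = 0.5, (2.0, -1.0)
world = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 3.0), (-1.0, 2.0)]
c, s = math.cos(theta_true), math.sin(theta_true)
cam = [(c * (x - t_true[0]) + s * (y - t_true[1]),
        -s * (x - t_true[0]) + c * (y - t_true[1])) for x, y in world]
theta, t = calibrate_camera_2d(cam, world)
```

With noise-free correspondences the closed-form solution recovers the camera pose exactly; the paper's contribution is doing this jointly for a network of cameras while simultaneously refining the robot's own localization.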

Figures 1–18 (PMC4179071):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/3d90299ce7a6/sensors-14-15039f1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/52422f986459/sensors-14-15039f2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/7b1d8705982f/sensors-14-15039f3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/dc91046ffbf9/sensors-14-15039f4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/c1c3b8162444/sensors-14-15039f5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/0673504c0755/sensors-14-15039f6.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/d34599c0b4b3/sensors-14-15039f7.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/c1201dbe5637/sensors-14-15039f8.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/873a880ff39e/sensors-14-15039f9.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/a2fcf6d92fe9/sensors-14-15039f10.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/14c99a064411/sensors-14-15039f11.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/8f6affea5417/sensors-14-15039f12.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/a4a620ff3721/sensors-14-15039f13.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/b277070b3868/sensors-14-15039f14.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/e153520b9afe/sensors-14-15039f15.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/922c95786a1a/sensors-14-15039f16.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/287ccc6cab34/sensors-14-15039f17.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8937/4179071/6d86196d967c/sensors-14-15039f18.jpg

Similar articles

1
An intelligent space for mobile robot localization using a multi-camera system.
Sensors (Basel). 2014 Aug 15;14(8):15039-64. doi: 10.3390/s140815039.
2
Visual control of robots using range images.
Sensors (Basel). 2010;10(8):7303-22. doi: 10.3390/s100807303. Epub 2010 Aug 4.
3
Robot evolutionary localization based on attentive visual short-term memory.
Sensors (Basel). 2013 Jan 21;13(1):1268-99. doi: 10.3390/s130101268.
4
Real-time multiple human perception with color-depth cameras on a mobile robot.
IEEE Trans Cybern. 2013 Oct;43(5):1429-41. doi: 10.1109/TCYB.2013.2275291. Epub 2013 Aug 21.
5
Multi-Robot 2.5D Localization and Mapping Using a Monte Carlo Algorithm on a Multi-Level Surface.
Sensors (Basel). 2021 Jul 4;21(13):4588. doi: 10.3390/s21134588.
6
An inexpensive method for kinematic calibration of a parallel robot by using one hand-held camera as main sensor.
Sensors (Basel). 2013 Aug 5;13(8):9941-65. doi: 10.3390/s130809941.
7
Localization of mobile robots using odometry and an external vision sensor.
Sensors (Basel). 2010;10(4):3655-80. doi: 10.3390/s100403655. Epub 2010 Apr 13.
8
Multi-camera sensor system for 3D segmentation and localization of multiple mobile robots.
Sensors (Basel). 2010;10(4):3261-79. doi: 10.3390/s100403261. Epub 2010 Apr 1.
9
SLAM algorithm applied to robotics assistance for navigation in unknown environments.
J Neuroeng Rehabil. 2010 Feb 17;7:10. doi: 10.1186/1743-0003-7-10.
10
Calibration of an outdoor distributed camera network with a 3D point cloud.
Sensors (Basel). 2014 Jul 29;14(8):13708-29. doi: 10.3390/s140813708.

Cited by

1
Usability, occupational performance and satisfaction evaluation of a smart environment controlled by infrared oculography by people with severe motor disabilities.
PLoS One. 2021 Aug 13;16(8):e0256062. doi: 10.1371/journal.pone.0256062. eCollection 2021.
2
An Optimized, Data Distribution Service-Based Solution for Reliable Data Exchange Among Autonomous Underwater Vehicles.
Sensors (Basel). 2017 Aug 5;17(8):1802. doi: 10.3390/s17081802.
3
On-Board Event-Based State Estimation for Trajectory Approaching and Tracking of a Vehicle.
Sensors (Basel). 2015 Jun 19;15(6):14569-90. doi: 10.3390/s150614569.

References

1
Self-organized multi-camera network for a fast and easy deployment of ubiquitous robots in unknown environments.
Sensors (Basel). 2012 Dec 27;13(1):426-54. doi: 10.3390/s130100426.
2
Localization of mobile robots using odometry and an external vision sensor.
Sensors (Basel). 2010;10(4):3655-80. doi: 10.3390/s100403655. Epub 2010 Apr 13.
3
Multi-camera sensor system for 3D segmentation and localization of multiple mobile robots.
Sensors (Basel). 2010;10(4):3261-79. doi: 10.3390/s100403261. Epub 2010 Apr 1.
4
Decentralized sensor fusion for Ubiquitous Networking Robotics in Urban Areas.
Sensors (Basel). 2010;10(3):2274-314. doi: 10.3390/s100302274. Epub 2010 Mar 19.