Suppr 超能文献



Dense RGB-D SLAM with Multiple Cameras.

Affiliations

National Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China.

School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China.

Publication

Sensors (Basel). 2018 Jul 2;18(7):2118. doi: 10.3390/s18072118.

DOI: 10.3390/s18072118
PMID: 30004420
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6068657/
Abstract

A multi-camera dense RGB-D SLAM (simultaneous localization and mapping) system has the potential both to speed up scene reconstruction and to improve localization accuracy, thanks to multiple mounted sensors and an enlarged effective field of view. To effectively tap the potential of such a system, two issues must be addressed: first, how to calibrate a system in which the sensors share little or no common field of view, so as to maximize the effective field of view; second, how to fuse the location information from the different sensors. In this work, a three-Kinect system is reported. For system calibration, two methods are proposed: one suitable for systems with an inertial measurement unit (IMU), using an improved hand-eye calibration method; the other for pure visual SLAM without any auxiliary sensors. In the RGB-D SLAM stage, we extend and improve a state-of-the-art single-camera RGB-D SLAM method to the multi-camera setting. We track the multiple cameras' poses independently and, at each moment, select the one with the minimal pose error as the reference pose to correct the other cameras' poses. To optimize the initial estimated pose, we improve the deformation graph by adding a device-number attribute that distinguishes surfels built by different cameras, and perform deformations according to the device number. We verify the accuracy of our extrinsic calibration methods in the experiment section and show satisfactory models reconstructed by our multi-camera dense RGB-D SLAM. The RMSE (root-mean-square error) of the lengths measured in our reconstructed model is 1.55 cm, similar to state-of-the-art single-camera RGB-D SLAM systems.
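The reference-pose selection and correction step described in the abstract can be sketched as follows. This is a minimal illustration under assumed conventions: the function name, the scalar error metric, and the use of fixed calibrated extrinsics `T_ref_cam` to chain poses are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def correct_poses(poses, errors, extrinsics):
    """Pick the camera with the smallest tracking error as the reference
    and re-derive the other cameras' world poses through the calibrated
    extrinsics (illustrative sketch, not the paper's exact method).

    poses:      dict cam_id -> 4x4 world-from-camera pose T_w_cam
    errors:     dict cam_id -> scalar tracking error at this moment
    extrinsics: dict (ref_id, cam_id) -> 4x4 rigid transform T_ref_cam
                mapping camera `cam_id`'s frame into camera `ref_id`'s frame
    """
    # Camera with the minimal pose error becomes the reference.
    ref = min(errors, key=errors.get)
    corrected = {ref: poses[ref]}
    for cam in poses:
        if cam == ref:
            continue
        # Chain through the reference: T_w_cam = T_w_ref @ T_ref_cam.
        corrected[cam] = poses[ref] @ extrinsics[(ref, cam)]
    return ref, corrected
```

In this sketch the independently tracked pose of a non-reference camera is simply replaced; a real system would more likely blend or optimize the corrected and tracked poses rather than overwrite them.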


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/49e6/6068657/09c577962172/sensors-18-02118-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/49e6/6068657/9637aa491189/sensors-18-02118-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/49e6/6068657/20baf1635e8f/sensors-18-02118-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/49e6/6068657/dad84ecc78d1/sensors-18-02118-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/49e6/6068657/88cb1c12833d/sensors-18-02118-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/49e6/6068657/5ad872f09a67/sensors-18-02118-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/49e6/6068657/cead3673cdf6/sensors-18-02118-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/49e6/6068657/e554c88f5b12/sensors-18-02118-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/49e6/6068657/42b797ceea9e/sensors-18-02118-g009.jpg

Similar Articles

1
Dense RGB-D SLAM with Multiple Cameras.
Sensors (Basel). 2018 Jul 2;18(7):2118. doi: 10.3390/s18072118.
2
TIMA SLAM: Tracking Independently and Mapping Altogether for an Uncalibrated Multi-Camera System.
Sensors (Basel). 2021 Jan 8;21(2):409. doi: 10.3390/s21020409.
3
A New Model of RGB-D Camera Calibration Based On 3D Control Field.
Sensors (Basel). 2019 Nov 21;19(23):5082. doi: 10.3390/s19235082.
4
Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling.
Sensors (Basel). 2016 Sep 27;16(10):1589. doi: 10.3390/s16101589.
5
Real-Time Large-Scale Dense Mapping with Surfels.
Sensors (Basel). 2018 May 9;18(5):1493. doi: 10.3390/s18051493.
6
Robust RGB-D SLAM Using Point and Line Features for Low Textured Scene.
Sensors (Basel). 2020 Sep 2;20(17):4984. doi: 10.3390/s20174984.
7
A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks.
Sensors (Basel). 2018 Jan 15;18(1):235. doi: 10.3390/s18010235.
8
A Novel Method for Extrinsic Calibration of Multiple RGB-D Cameras Using Descriptor-Based Patterns.
Sensors (Basel). 2019 Jan 16;19(2):349. doi: 10.3390/s19020349.
9
CVIDS: A Collaborative Localization and Dense Mapping Framework for Multi-Agent Based Visual-Inertial SLAM.
IEEE Trans Image Process. 2022;31:6562-6576. doi: 10.1109/TIP.2022.3213189. Epub 2022 Oct 21.
10
SLAM-based dense surface reconstruction in monocular Minimally Invasive Surgery and its application to Augmented Reality.
Comput Methods Programs Biomed. 2018 May;158:135-146. doi: 10.1016/j.cmpb.2018.02.006. Epub 2018 Feb 8.

Cited By

1
A review of visual SLAM for robotics: evolution, properties, and future applications.
Front Robot AI. 2024 Apr 10;11:1347985. doi: 10.3389/frobt.2024.1347985. eCollection 2024.
2
Point-Plane SLAM Using Supposed Planes for Indoor Environments.
Sensors (Basel). 2019 Sep 2;19(17):3795. doi: 10.3390/s19173795.
3
An Orthogonal Weighted Occupancy Likelihood Map with IMU-Aided Laser Scan Matching for 2D Indoor Mapping.

References

1
Real-Time Large-Scale Dense Mapping with Surfels.
Sensors (Basel). 2018 May 9;18(5):1493. doi: 10.3390/s18051493.
2
A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks.
Sensors (Basel). 2018 Jan 15;18(1):235. doi: 10.3390/s18010235.
3
A Quantitative Comparison of Calibration Methods for RGB-D Sensors Using Different Technologies.
Sensors (Basel). 2019 Apr 11;19(7):1742. doi: 10.3390/s19071742.
4
RGB-D SLAM with Manhattan Frame Estimation Using Orientation Relevance.
Sensors (Basel). 2019 Mar 1;19(5):1050. doi: 10.3390/s19051050.
5
Fast and Automatic Reconstruction of Semantically Rich 3D Indoor Maps from Low-quality RGB-D Sequences.
Sensors (Basel). 2019 Jan 27;19(3):533. doi: 10.3390/s19030533.
6
Robust and Efficient CPU-Based RGB-D Scene Reconstruction.
Sensors (Basel). 2018 Oct 28;18(11):3652. doi: 10.3390/s18113652.
7
Sensors (Basel). 2017 Jan 27;17(2):243. doi: 10.3390/s17020243.
8
Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling.
Sensors (Basel). 2016 Sep 27;16(10):1589. doi: 10.3390/s16101589.
9
RGB-D SLAM Based on Extended Bundle Adjustment with 2D and 3D Information.
Sensors (Basel). 2016 Aug 13;16(8):1285. doi: 10.3390/s16081285.
10
Very high frame rate volumetric integration of depth images on mobile devices.
IEEE Trans Vis Comput Graph. 2015 Nov;21(11):1241-50. doi: 10.1109/TVCG.2015.2459891.
11
Least-squares fitting of two 3-d point sets.
IEEE Trans Pattern Anal Mach Intell. 1987 May;9(5):698-700. doi: 10.1109/tpami.1987.4767965.