

Geometric Integration of Hybrid Correspondences for RGB-D Unidirectional Tracking.

Affiliations

Research Institute for Smart Cities, School of Architecture and Urban Planning, Shenzhen University, Shenzhen 518060, China.

Department of Land Surveying & Geo-Informatics, The Hong Kong Polytechnic University, Hung Hom 999077, Hong Kong, China.

Publication Information

Sensors (Basel). 2018 May 1;18(5):1385. doi: 10.3390/s18051385.

DOI: 10.3390/s18051385
PMID: 29723974
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5982696/
Abstract

Traditionally, visual-based RGB-D SLAM systems only use correspondences with valid depth values for camera tracking, thus ignoring the regions without 3D information. Due to the strict limitation on measurement distance and view angle, such systems adopt only short-range constraints which may introduce larger drift errors during long-distance unidirectional tracking. In this paper, we propose a novel geometric integration method that makes use of both 2D and 3D correspondences for RGB-D tracking. Our method handles the problem by exploring visual features both when depth information is available and when it is unknown. The system comprises two parts: coarse pose tracking with 3D correspondences, and geometric integration with hybrid correspondences. First, the coarse pose tracking generates the initial camera pose using 3D correspondences with frame-by-frame registration. The initial camera poses are then used as inputs for the geometric integration model, along with 3D correspondences, 2D-3D correspondences and 2D correspondences identified from frame pairs. The initial 3D location of the correspondence is determined in two ways, from depth image and by using the initial poses to triangulate. The model improves the camera poses and decreases drift error during long-distance RGB-D tracking iteratively. Experiments were conducted using data sequences collected by commercial Structure Sensors. The results verify that the geometric integration of hybrid correspondences effectively decreases the drift error and improves mapping accuracy. Furthermore, the model enables a comparative and synergistic use of datasets, including both 2D and 3D features.
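The core idea of the abstract — jointly constraining the camera pose with 3D-3D correspondences (point-to-point alignment) and 2D-3D correspondences (reprojection error) — can be sketched as a small least-squares refinement. This is an illustrative sketch only, not the authors' implementation: the function name `refine_pose`, the rotation-vector parameterization, and the scalar weight `w2d` are assumptions for the example.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def refine_pose(x0, pts3d_src, pts3d_dst, pts3d_map, pts2d_obs, K, w2d=1.0):
    """Refine a camera pose (rotation vector + translation, 6 params)
    by jointly minimizing 3D-3D point residuals and 2D reprojection
    residuals -- the "hybrid correspondences" idea in the abstract.

    pts3d_src / pts3d_dst : Nx3 matched points with valid depth.
    pts3d_map / pts2d_obs : Mx3 map points and their Mx2 image observations
                            (depth unknown at the observation).
    K                     : 3x3 camera intrinsics.
    w2d                   : relative weight of the pixel residuals.
    """
    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        t = x[3:]
        # 3D-3D term: transformed source points should coincide with targets.
        r3 = (pts3d_src @ R.T + t) - pts3d_dst
        # 2D-3D term: map points projected into the image should land on
        # their observed pixel locations.
        cam = pts3d_map @ R.T + t
        proj = cam @ K.T
        proj = proj[:, :2] / proj[:, 2:3]
        r2 = w2d * (proj - pts2d_obs)
        return np.concatenate([r3.ravel(), r2.ravel()])

    sol = least_squares(residuals, x0, method="lm")
    return sol.x
```

In a full system the two residual classes would be iterated with outlier rejection and per-point weighting; the sketch only shows how both short-range (3D) and long-range (2D) constraints enter one objective.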

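The abstract also notes that when a correspondence has no depth, its initial 3D location is obtained by triangulating with the initial camera poses. A minimal sketch of two-view linear (DLT) triangulation follows; the helper name `triangulate` and the specific formulation are illustrative, not taken from the paper.

```python
import numpy as np


def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 2D-2D correspondence.

    P1, P2   : 3x4 projection matrices (K [R | t]) of the two frames.
    uv1, uv2 : matched pixel coordinates in each frame.
    Returns the 3D point minimizing the algebraic error.
    """
    # Each view contributes two rows: u * p3 - p1 = 0 and v * p3 - p2 = 0.
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

Points initialized this way can then feed the hybrid optimization as 2D-3D correspondences.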

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/18064fd7d8e4/sensors-18-01385-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/030fcd40f237/sensors-18-01385-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/eac3821c6578/sensors-18-01385-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/ba760d88fa51/sensors-18-01385-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/f2414b6024cb/sensors-18-01385-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/e600f1e8d7d2/sensors-18-01385-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/496513e97c66/sensors-18-01385-g006a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/330e0c0ac198/sensors-18-01385-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/8be4b75a4e43/sensors-18-01385-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/46dd69278e5c/sensors-18-01385-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/442af41e0a3a/sensors-18-01385-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/c74ec073cf28/sensors-18-01385-g011a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/36fb25b7cc51/sensors-18-01385-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/c0821fb27a00/sensors-18-01385-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/18064fd7d8e4/sensors-18-01385-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/030fcd40f237/sensors-18-01385-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/eac3821c6578/sensors-18-01385-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/ba760d88fa51/sensors-18-01385-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/f2414b6024cb/sensors-18-01385-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/e600f1e8d7d2/sensors-18-01385-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/496513e97c66/sensors-18-01385-g006a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/330e0c0ac198/sensors-18-01385-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/8be4b75a4e43/sensors-18-01385-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/46dd69278e5c/sensors-18-01385-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/442af41e0a3a/sensors-18-01385-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/c74ec073cf28/sensors-18-01385-g011a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/36fb25b7cc51/sensors-18-01385-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/c0821fb27a00/sensors-18-01385-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06c2/5982696/18064fd7d8e4/sensors-18-01385-g014.jpg

Similar Articles

1. Geometric Integration of Hybrid Correspondences for RGB-D Unidirectional Tracking. Sensors (Basel). 2018 May 1;18(5):1385. doi: 10.3390/s18051385.
2. Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling. Sensors (Basel). 2016 Sep 27;16(10):1589. doi: 10.3390/s16101589.
3. RGB-D SLAM Based on Extended Bundle Adjustment with 2D and 3D Information. Sensors (Basel). 2016 Aug 13;16(8):1285. doi: 10.3390/s16081285.
4. Fast and Accurate Pose Estimation with Unknown Focal Length Using Line Correspondences. Sensors (Basel). 2022 Oct 28;22(21):8253. doi: 10.3390/s22218253.
5. A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks. Sensors (Basel). 2018 Jan 15;18(1):235. doi: 10.3390/s18010235.
6. Robust RGB-D SLAM Using Point and Line Features for Low Textured Scene. Sensors (Basel). 2020 Sep 2;20(17):4984. doi: 10.3390/s20174984.
7. Dense RGB-D SLAM with Multiple Cameras. Sensors (Basel). 2018 Jul 2;18(7):2118. doi: 10.3390/s18072118.
8. Point Cloud Registration Method Based on Geometric Constraint and Transformation Evaluation. Sensors (Basel). 2024 Mar 14;24(6):1853. doi: 10.3390/s24061853.
9. Robust Fusion of Color and Depth Data for RGB-D Target Tracking Using Adaptive Range-Invariant Depth Models and Spatio-Temporal Consistency Constraints. IEEE Trans Cybern. 2018 Aug;48(8):2485-2499. doi: 10.1109/TCYB.2017.2740952. Epub 2017 Sep 6.
10. Geometric calibration for LiDAR-camera system fusing 3D-2D and 3D-3D point correspondences. Opt Express. 2020 Jan 20;28(2):2122-2141. doi: 10.1364/OE.381176.

Cited By

1. Robust RGB-D SLAM Using Point and Line Features for Low Textured Scene. Sensors (Basel). 2020 Sep 2;20(17):4984. doi: 10.3390/s20174984.
2. Emergency Response Using Volunteered Passenger Aircraft Remote Sensing Data: A Case Study on Flood Damage Mapping. Sensors (Basel). 2019 Sep 25;19(19):4163. doi: 10.3390/s19194163.
3. Fast and Automatic Reconstruction of Semantically Rich 3D Indoor Maps from Low-quality RGB-D Sequences.

References

1. A New Calibration Method for Commercial RGB-D Sensors. Sensors (Basel). 2017 May 24;17(6):1204. doi: 10.3390/s17061204.
2. Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling. Sensors (Basel). 2016 Sep 27;16(10):1589. doi: 10.3390/s16101589.
3. Accuracy and resolution of Kinect depth data for indoor mapping applications. Sensors (Basel). 2012;12(2):1437-54. doi: 10.3390/s120201437. Epub 2012 Feb 1.
4. Sensors (Basel). 2019 Jan 27;19(3):533. doi: 10.3390/s19030533.