

SLAM-Based Self-Calibration of a Binocular Stereo Vision Rig in Real-Time.

Affiliations

State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150001, China.

Industrial Research Institute of Robotics and Intelligent Equipment, Harbin Institute of Technology, Weihai 264209, China.

Publication information

Sensors (Basel). 2020 Jan 22;20(3):621. doi: 10.3390/s20030621.

DOI: 10.3390/s20030621
PMID: 31979170
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7038334/
Abstract

The calibration problem of binocular stereo vision rig is critical for its practical application. However, most existing calibration methods are based on manual off-line algorithms for specific reference targets or patterns. In this paper, we propose a novel simultaneous localization and mapping (SLAM)-based self-calibration method designed to achieve real-time, automatic and accurate calibration of the binocular stereo vision (BSV) rig's extrinsic parameters in a short period without auxiliary equipment and special calibration markers, assuming the intrinsic parameters of the left and right cameras are known in advance. The main contribution of this paper is to use the SLAM algorithm as our main tool for the calibration method. The method mainly consists of two parts: SLAM-based construction of 3D scene point map and extrinsic parameter calibration. In the first part, the SLAM mainly constructs a 3D feature point map of the natural environment, which is used as a calibration area map. To improve the efficiency of calibration, a lightweight, real-time visual SLAM is built. In the second part, extrinsic parameters are calibrated through the 3D scene point map created by the SLAM. Ultimately, field experiments are performed to evaluate the feasibility, repeatability, and efficiency of our self-calibration method. The experimental data shows that the average absolute error of the Euler angles and translation vectors obtained by our method relative to the reference values obtained by Zhang's calibration method does not exceed 0.5˚ and 2 mm, respectively. The distribution range of the most widely spread parameter in Euler angles is less than 0.2˚ while that in translation vectors does not exceed 2.15 mm. Under the general texture scene and the normal driving speed of the mobile robot, the calibration time can be generally maintained within 10 s. The above results prove that our proposed method is reliable and has practical value.
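The extrinsic-calibration step described above boils down to recovering the rigid transform (rotation plus translation) between the two cameras from corresponding 3D points, and reporting the rotation as Euler angles for the error metrics. The paper's full SLAM pipeline is not reproduced here; the following is only a minimal sketch of that geometric core, using a standard least-squares rigid alignment (the Kabsch algorithm) and a ZYX Euler-angle conversion — the function names and conventions are illustrative, not taken from the paper.

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) with Q ≈ (R @ P.T).T + t.

    P, Q: (N, 3) arrays of corresponding 3D points.
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction so R is a proper rotation (det = +1), not a reflection.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

def euler_zyx(R):
    """Rotation matrix -> (yaw, pitch, roll) in degrees, ZYX convention."""
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    pitch = np.degrees(np.arcsin(-R[2, 0]))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return yaw, pitch, roll
```

The SVD sign correction matters in practice: without it, noisy or near-planar point sets can yield a reflection instead of a rotation. Comparing the recovered Euler angles and translation against a reference (e.g. from Zhang's checkerboard method) gives exactly the kind of absolute-error figures quoted in the abstract.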


Figures 1–15 (PMC full text):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/9857adf46a2d/sensors-20-00621-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/232b61065846/sensors-20-00621-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/85e48f67bfef/sensors-20-00621-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/cb13926df36a/sensors-20-00621-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/b249d20e5f6d/sensors-20-00621-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/100e6057d77c/sensors-20-00621-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/ade60451f943/sensors-20-00621-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/8ef84d1292b6/sensors-20-00621-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/602bad9e22a3/sensors-20-00621-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/f0eeb8e1745d/sensors-20-00621-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/0bae83df0e7a/sensors-20-00621-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/e93ccb87cbbe/sensors-20-00621-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/13b799b4b0d6/sensors-20-00621-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/6ef78f6fc276/sensors-20-00621-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9878/7038334/483ffeee06b5/sensors-20-00621-g015.jpg

Similar articles

1
SLAM-Based Self-Calibration of a Binocular Stereo Vision Rig in Real-Time.
Sensors (Basel). 2020 Jan 22;20(3):621. doi: 10.3390/s20030621.
2
A Method for Extrinsic Parameter Calibration of Rotating Binocular Stereo Vision Using a Single Feature Point.
Sensors (Basel). 2018 Oct 29;18(11):3666. doi: 10.3390/s18113666.
3
Estimation of extrinsic parameters for dynamic binocular stereo vision using unknown-sized rectangle images.
Rev Sci Instrum. 2019 Jun;90(6):065108. doi: 10.1063/1.5086352.
4
Real-Time Dense Reconstruction with Binocular Endoscopy Based on StereoNet and ORB-SLAM.
Sensors (Basel). 2023 Feb 12;23(4):2074. doi: 10.3390/s23042074.
5
A Stable, Efficient, and High-Precision Non-Coplanar Calibration Method: Applied for Multi-Camera-Based Stereo Vision Measurements.
Sensors (Basel). 2023 Oct 14;23(20):8466. doi: 10.3390/s23208466.
6
Research on 3D Reconstruction of Binocular Vision Based on Thermal Infrared.
Sensors (Basel). 2023 Aug 24;23(17):7372. doi: 10.3390/s23177372.
7
NMC3D: Non-Overlapping Multi-Camera Calibration Based on Sparse 3D Map.
Sensors (Basel). 2024 Aug 13;24(16):5228. doi: 10.3390/s24165228.
8
TIMA SLAM: Tracking Independently and Mapping Altogether for an Uncalibrated Multi-Camera System.
Sensors (Basel). 2021 Jan 8;21(2):409. doi: 10.3390/s21020409.
9
Planar self-calibration for stereo cameras with radial distortion.
Appl Opt. 2017 Nov 20;56(33):9257-9267. doi: 10.1364/AO.56.009257.
10
Dense RGB-D SLAM with Multiple Cameras.
Sensors (Basel). 2018 Jul 2;18(7):2118. doi: 10.3390/s18072118.

Cited by

1
A Systematic Stereo Camera Calibration Strategy: Leveraging Latin Hypercube Sampling and 2 Full-Factorial Design of Experiment Methods.
Sensors (Basel). 2023 Oct 3;23(19):8240. doi: 10.3390/s23198240.
2
High Precision Calibration Algorithm for Binocular Stereo Vision Camera using Deep Reinforcement Learning.
Comput Intell Neurosci. 2022 Mar 31;2022:6596868. doi: 10.1155/2022/6596868. eCollection 2022.
3
Real-Time Plane Detection with Consistency from Point Cloud Sequences.
Sensors (Basel). 2020 Dec 28;21(1):140. doi: 10.3390/s21010140.

References

1
Stereo calibration with absolute phase target.
Opt Express. 2019 Aug 5;27(16):22254-22267. doi: 10.1364/OE.27.022254.
2
A Method for Extrinsic Parameter Calibration of Rotating Binocular Stereo Vision Using a Single Feature Point.
Sensors (Basel). 2018 Oct 29;18(11):3666. doi: 10.3390/s18113666.
3
Motorcycles that See: Multifocal Stereo Vision Sensor for Advanced Safety Systems in Tilting Vehicles.
Sensors (Basel). 2018 Jan 19;18(1):295. doi: 10.3390/s18010295.
4
A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks.
Sensors (Basel). 2018 Jan 15;18(1):235. doi: 10.3390/s18010235.
5
Effective Data-Driven Calibration for a Galvanometric Laser Scanning System Using Binocular Stereo Vision.
Sensors (Basel). 2018 Jan 12;18(1):197. doi: 10.3390/s18010197.
6
A High Precision Approach to Calibrate a Structured Light Vision Sensor in a Robot-Based Three-Dimensional Measurement System.
Sensors (Basel). 2016 Aug 30;16(9):1388. doi: 10.3390/s16091388.
7
Structural Parameters Calibration for Binocular Stereo Vision Sensors Using a Double-Sphere Target.
Sensors (Basel). 2016 Jul 12;16(7):1074. doi: 10.3390/s16071074.
8
Feature-based Lucas-Kanade and active appearance models.
IEEE Trans Image Process. 2015 Sep;24(9):2617-32. doi: 10.1109/TIP.2015.2431445. Epub 2015 May 8.
9
n-SIFT: n-dimensional scale invariant feature transform.
IEEE Trans Image Process. 2009 Sep;18(9):2012-21. doi: 10.1109/TIP.2009.2024578. Epub 2009 Jun 5.
10
Image intensifier distortion correction.
Med Phys. 1987 Mar-Apr;14(2):249-52. doi: 10.1118/1.596078.