Suppr 超能文献



Mobile Robot Indoor Positioning Based on a Combination of Visual and Inertial Sensors

Authors

Gao Mingjing, Yu Min, Guo Hang, Xu Yuan

Affiliations

Institute of Space Science and Technology, Nanchang University, Nanchang 330031, China.

College of Computer Information and Engineering, Jiangxi Normal University, Nanchang 330022, China.

Publication

Sensors (Basel). 2019 Apr 13;19(8):1773. doi: 10.3390/s19081773.

DOI: 10.3390/s19081773
PMID: 31013897
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6515221/
Abstract

Multi-sensor integrated navigation technology has been applied to the indoor navigation and positioning of robots. To address the low navigation accuracy and error accumulation of mobile robots that rely on a single sensor, this paper presents an indoor mobile robot positioning method based on a combination of visual and inertial sensors. First, the visual sensor (Kinect) is used to obtain a color image and a depth image, and feature matching is performed with an improved scale-invariant feature transform (SIFT) algorithm. Then, an absolute orientation algorithm is used to calculate the rotation matrix and translation vector of the robot between two consecutive image frames. An inertial measurement unit (IMU) offers high-frequency updates and rapid, accurate positioning, and can compensate for the Kinect's low update rate and limited precision. Three-axis acceleration, angular velocity, and magnetic field strength, as well as temperature, can be obtained from the IMU in real time. The data from the visual sensor are loosely coupled with the IMU data: the differences between the positions and attitudes output by the two sensors are optimally combined by an adaptive fade-out extended Kalman filter to estimate the errors. Finally, several experiments show that this method significantly improves the accuracy of indoor positioning for mobile robots based on visual and inertial sensors.
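Per the abstract, the per-frame rotation matrix and translation vector come from an absolute orientation solution over matched 3-D feature points. A minimal sketch, assuming the standard SVD-based (Kabsch/Umeyama) closed form rather than the authors' exact formulation:

```python
import numpy as np

def absolute_orientation(P, Q):
    """Closed-form rigid motion (R, t) such that Q ≈ R @ P + t.

    P, Q: (3, N) arrays of matched 3-D points from two consecutive
    frames (e.g., SIFT matches back-projected with the depth image).
    Uses the SVD-based Kabsch/Umeyama solution.
    """
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    H = (Q - cq) @ (P - cp).T            # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(U @ Vt))   # guard against a reflection solution
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    t = cq - R @ cp
    return R, t

# Synthetic check: recover a known rigid motion from noiseless points.
rng = np.random.default_rng(0)
P = rng.standard_normal((3, 20))
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([[0.5], [-0.2], [0.1]])
Q = R_true @ P + t_true
R_est, t_est = absolute_orientation(P, Q)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # prints: True True
```

With real matches the point pairs are noisy and contain outliers, so in practice this solve is usually wrapped in a RANSAC loop over the SIFT correspondences.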

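The loosely coupled, adaptive fade-out filtering described in the abstract can be illustrated with a minimal scalar error-state filter; the fading heuristic, variable names, and tuning values below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def fading_kf_step(x, P, z, F, Q, H, R):
    """One predict/update step of a fading-memory (adaptive fade-out) Kalman filter.

    When the measured innovation exceeds what the filter predicts, a
    fading factor lam > 1 inflates the predicted covariance, so stale
    state information is discounted and the filter re-converges faster.
    """
    x_pred = F @ x
    innov = z - H @ x_pred
    S0 = H @ (F @ P @ F.T + Q) @ H.T + R             # nominal innovation covariance
    lam = max(1.0, (innov.T @ innov).item() / np.trace(S0))
    P_pred = lam * (F @ P @ F.T) + Q                 # faded prediction covariance
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ innov
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy loosely coupled setup: the measurement z is the difference between
# the IMU dead-reckoned position and the vision (Kinect) position fix;
# the state x is the slowly growing IMU position error being estimated.
rng = np.random.default_rng(1)
F = np.eye(1); H = np.eye(1)
Q = np.array([[1e-4]]); R = np.array([[1e-2]])
x = np.zeros((1, 1)); P = np.eye(1)
true_err = 0.0
for k in range(200):
    true_err += 0.01                                 # IMU drift ramp
    z = np.array([[true_err + 0.1 * rng.standard_normal()]])
    x, P = fading_kf_step(x, P, z, F, Q, H, R)
err = abs(x[0, 0] - true_err)
print(err)
```

Here lam exceeds 1 whenever the squared innovation is larger than its nominal expectation, so fresh vision fixes are weighted more heavily when the IMU error model drifts; a plain EKF would lag the ramp more.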

Figures (g001-g012):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/98437e272ee8/sensors-19-01773-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/5ba3b36d0992/sensors-19-01773-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/827935745c06/sensors-19-01773-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/e8b1abd5470b/sensors-19-01773-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/03d0f5b35a6a/sensors-19-01773-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/ca9221e195da/sensors-19-01773-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/9f4a5c45ff6b/sensors-19-01773-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/d37ac9f4102c/sensors-19-01773-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/c4d706fcd308/sensors-19-01773-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/0eca095f9bbf/sensors-19-01773-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/f3553b2cf871/sensors-19-01773-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3feb/6515221/7f7d0e5d4ce3/sensors-19-01773-g012.jpg

Similar Articles

1
Mobile Robot Indoor Positioning Based on a Combination of Visual and Inertial Sensors.
Sensors (Basel). 2019 Apr 13;19(8):1773. doi: 10.3390/s19081773.
2
Research into Kinect/Inertial Measurement Units Based on Indoor Robots.
Sensors (Basel). 2018 Mar 12;18(3):839. doi: 10.3390/s18030839.
3
An Enhanced Hybrid Visual-Inertial Odometry System for Indoor Mobile Robot.
Sensors (Basel). 2022 Apr 11;22(8):2930. doi: 10.3390/s22082930.
4
Pose Estimation of a Mobile Robot Based on Fusion of IMU Data and Vision Data Using an Extended Kalman Filter.
Sensors (Basel). 2017 Sep 21;17(10):2164. doi: 10.3390/s17102164.
5
A Loosely Coupled Extended Kalman Filter Algorithm for Agricultural Scene-Based Multi-Sensor Fusion.
Front Plant Sci. 2022 Apr 25;13:849260. doi: 10.3389/fpls.2022.849260. eCollection 2022.
6
Integrated Indoor Positioning System of Greenhouse Robot Based on UWB/IMU/ODOM/LIDAR.
Sensors (Basel). 2022 Jun 25;22(13):4819. doi: 10.3390/s22134819.
7
Improved Pedestrian Dead Reckoning Based on a Robust Adaptive Kalman Filter for Indoor Inertial Location System.
Sensors (Basel). 2019 Jan 12;19(2):294. doi: 10.3390/s19020294.
8
Robust Stereo Visual Inertial Navigation System Based on Multi-Stage Outlier Removal in Dynamic Environments.
Sensors (Basel). 2020 May 21;20(10):2922. doi: 10.3390/s20102922.
9
A Robust Indoor/Outdoor Navigation Filter Fusing Data from Vision and Magneto-Inertial Measurement Unit.
Sensors (Basel). 2017 Dec 4;17(12):2795. doi: 10.3390/s17122795.
10
3D Indoor Position Estimation Based on a UDU Factorization Extended Kalman Filter Structure Using Beacon Distance and Inertial Measurement Unit Data.
Sensors (Basel). 2024 May 11;24(10):3048. doi: 10.3390/s24103048.

Cited By

1
Multicooperation of Turtle-inspired amphibious spherical robots.
Sci Rep. 2025 Jan 23;15(1):2932. doi: 10.1038/s41598-025-85423-2.
2
Robotics Perception and Control: Key Technologies and Applications.
Micromachines (Basel). 2024 Apr 15;15(4):531. doi: 10.3390/mi15040531.
3
Adaptive Expectation-Maximization-Based Kalman Filter/Finite Impulse Response Filter for MEMS-INS-Based Posture Capture of Human Upper Limbs.
Micromachines (Basel). 2024 Mar 26;15(4):440. doi: 10.3390/mi15040440.

References

1
A Kinect-based real-time compressive tracking prototype system for amphibious spherical robots.
Sensors (Basel). 2015 Apr 8;15(4):8232-52. doi: 10.3390/s150408232.
2
MonoSLAM: real-time single camera SLAM.
IEEE Trans Pattern Anal Mach Intell. 2007 Jun;29(6):1052-67. doi: 10.1109/TPAMI.2007.1049.
4
A New Positioning Method for Climbing Robots Based on 3D Model of Transmission Tower and Visual Sensor.
Sensors (Basel). 2022 Sep 26;22(19):7288. doi: 10.3390/s22197288.
5
Control System for Vertical Take-Off and Landing Vehicle's Adaptive Landing Based on Multi-Sensor Data Fusion.
Sensors (Basel). 2020 Aug 7;20(16):4411. doi: 10.3390/s20164411.
6
A Comprehensive Survey of Indoor Localization Methods Based on Computer Vision.
Sensors (Basel). 2020 May 6;20(9):2641. doi: 10.3390/s20092641.
7
An Improved Method for Spot Position Detection of a Laser Tracking and Positioning System Based on a Four-Quadrant Detector.
Sensors (Basel). 2019 Oct 30;19(21):4722. doi: 10.3390/s19214722.
8
A Meta-Review of Indoor Positioning Systems.
Sensors (Basel). 2019 Oct 17;19(20):4507. doi: 10.3390/s19204507.
9
Measurement Method Based on Multispectral Three-Dimensional Imaging for the Chlorophyll Contents of Greenhouse Tomato Plants.
Sensors (Basel). 2019 Jul 30;19(15):3345. doi: 10.3390/s19153345.
10
Extrinsic Parameter Calibration Method for a Visual/Inertial Integrated System with a Predefined Mechanical Interface.
Sensors (Basel). 2019 Jul 12;19(14):3086. doi: 10.3390/s19143086.