

Stereo Event-Based Visual-Inertial Odometry.

Author Information

Wang Kunfeng, Zhao Kaichun, Lu Wenshuai, You Zheng

Affiliation

Department of Precision Instrument, Tsinghua University, Beijing 100080, China.

Publication Information

Sensors (Basel). 2025 Jan 31;25(3):887. doi: 10.3390/s25030887.

Abstract

Event-based cameras are a new type of vision sensor whose pixels operate independently and respond asynchronously to brightness changes with microsecond resolution, instead of providing standard intensity frames. Compared with traditional cameras, event-based cameras offer low latency, no motion blur, and high dynamic range (HDR), properties that allow robots to cope with challenging scenes. We propose a visual-inertial odometry method for stereo event-based cameras based on the Error-State Kalman Filter (ESKF). The vision module updates the pose by aligning the edges of a semi-dense 3D map to a 2D image, while the IMU module updates the pose using median integration. We evaluate our method on public datasets with general 6-DoF motion (three-axis translation and three-axis rotation), compare the estimates against the ground truth, and benchmark our results against other methods, demonstrating the effectiveness of our approach.
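To make the IMU propagation step concrete, the Python sketch below shows one midpoint ("median") integration step between two consecutive IMU samples, as commonly used in the prediction stage of an ESKF-style pipeline. It is a minimal sketch under standard conventions (Hamilton quaternions, world-frame gravity, bias-compensated measurements), not the authors' implementation; all function and variable names (propagate_median, quat_mul, etc.) are illustrative.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def small_angle_quat(dtheta):
    """Quaternion for a small rotation vector dtheta (first-order approximation)."""
    return np.concatenate(([1.0], 0.5 * dtheta))

def quat_to_rot(q):
    """Rotation matrix from a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def propagate_median(p, v, q, bg, ba, gyro0, gyro1, acc0, acc1, dt,
                     g=np.array([0.0, 0.0, -9.81])):
    """One midpoint ("median") integration step between consecutive IMU samples.

    p, v  : position and velocity in the world frame
    q     : body-to-world orientation quaternion [w, x, y, z]
    bg, ba: gyroscope and accelerometer biases
    gyro0/gyro1, acc0/acc1: IMU samples at the start and end of the interval
    """
    # Average the two bias-compensated gyro readings and rotate the orientation.
    w_mid = 0.5 * (gyro0 + gyro1) - bg
    q_new = quat_mul(q, small_angle_quat(w_mid * dt))
    q_new /= np.linalg.norm(q_new)

    # Average the world-frame accelerations at the old and new orientation.
    a_mid = 0.5 * (quat_to_rot(q) @ (acc0 - ba)
                   + quat_to_rot(q_new) @ (acc1 - ba)) + g

    # Standard kinematic update with the midpoint acceleration.
    p_new = p + v * dt + 0.5 * a_mid * dt**2
    v_new = v + a_mid * dt
    return p_new, v_new, q_new
```

In an ESKF, this nominal-state propagation would be paired with a corresponding error-state covariance prediction, and the vision module's edge-alignment residuals would then correct the propagated pose in the update step.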


Figure: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0065/11819757/c5acd2d34cf1/sensors-25-00887-g001.jpg
