Suppr 超能文献


An Experimental Assessment of Depth Estimation in Transparent and Translucent Scenes for Intel RealSense D415, SR305 and L515

Affiliation

Institute of Systems and Robotics, Department of Electrical and Computer Engineering, University of Coimbra, 3030-290 Coimbra, Portugal.

Publication

Sensors (Basel). 2022 Sep 28;22(19):7378. doi: 10.3390/s22197378.

DOI: 10.3390/s22197378
PMID: 36236472
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9572012/
Abstract

RGB-D cameras have become common in many research fields, since these inexpensive devices provide dense 3D information about the observed scene. Over the past few years, Intel's RealSense™ range has introduced new, cost-effective RGB-D sensors with different technologies, more sophisticated in both hardware and software. Models D415, SR305, and L515, launched between 2018 and 2020, are examples of successful cameras in this range. Because these three cameras rely on distinct operating principles, their depth-estimation behavior in the presence of common error sources is also specific to each device. Semi-transparent and scattering media, for instance, are expected error sources for any RGB-D sensor. The main new contribution of this paper is a full evaluation and comparison of the three Intel RealSense cameras in scenarios with transparency and translucency, using a proposed experimental setup involving an aquarium and liquids. The evaluation, based on the repeatability/precision and the statistical distribution of the acquired depth, shows that the Intel RealSense D415 has the best overall behavior, both in terms of statistical variability (also known as precision or repeatability) and in terms of the fraction of valid measurements.
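The two comparison criteria named in the abstract — per-pixel repeatability (statistical variability over repeated captures of a static scene) and the fraction of valid depth returns — can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code; the function name is invented here, and the use of 0 as the invalid-depth marker is an assumption (it matches the usual RealSense convention, where a depth of zero denotes a failed measurement).

```python
import numpy as np

def depth_precision_stats(frames, invalid=0.0):
    """Per-pixel repeatability and valid-return ratio for a static scene.

    frames  : (N, H, W) array of N repeated depth maps of the same scene.
    invalid : sensor value marking a failed measurement (0 for RealSense).

    Returns (std, valid_ratio): std is the per-pixel sample standard
    deviation over valid samples (NaN where fewer than 2 are valid);
    valid_ratio is the overall fraction of valid depth returns.
    """
    frames = np.asarray(frames, dtype=np.float64)
    valid = frames != invalid                 # (N, H, W) validity mask
    n_valid = valid.sum(axis=0)               # valid samples per pixel

    # Masked per-pixel mean and unbiased variance; suppress 0/0 warnings
    # for pixels that never (or only once) returned a valid depth.
    with np.errstate(invalid="ignore", divide="ignore"):
        mean = np.where(n_valid > 0,
                        np.where(valid, frames, 0.0).sum(axis=0) / n_valid,
                        np.nan)
        var = np.where(n_valid > 1,
                       np.where(valid, (frames - mean) ** 2, 0.0).sum(axis=0)
                       / (n_valid - 1),
                       np.nan)
    std = np.sqrt(var)

    valid_ratio = valid.mean()                # share of valid returns
    return std, valid_ratio
```

A lower per-pixel std means better precision/repeatability, and a higher valid ratio means fewer dropped measurements — the two axes along which the paper ranks the D415 best overall.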


[Figures: sensors-22-07378 g001–g012, g014–g016, g018, and Appendix figures g0A1–g0A21; full-resolution images available via PMC (PMC9572012).]

Similar Articles

1. An Experimental Assessment of Depth Estimation in Transparent and Translucent Scenes for Intel RealSense D415, SR305 and L515. Sensors (Basel). 2022 Sep 28;22(19):7378. doi: 10.3390/s22197378.
2. Metrological Characterization and Comparison of D415, D455, L515 RealSense Devices in the Close Range. Sensors (Basel). 2021 Nov 22;21(22):7770. doi: 10.3390/s21227770.
3. Metrological and Critical Characterization of the Intel D415 Stereo Depth Camera. Sensors (Basel). 2019 Jan 25;19(3):489. doi: 10.3390/s19030489.
4. Analysis of Depth Cameras for Proximal Sensing of Grapes. Sensors (Basel). 2022 May 31;22(11):4179. doi: 10.3390/s22114179.
5. Utilising the Intel RealSense Camera for Measuring Health Outcomes in Clinical Research. J Med Syst. 2018 Feb 5;42(3):53. doi: 10.1007/s10916-018-0905-x.
6. Grape Maturity Estimation Using Time-of-Flight and LiDAR Depth Cameras. Sensors (Basel). 2024 Aug 7;24(16):5109. doi: 10.3390/s24165109.
7. Pigs: A stepwise RGB-D novel pig carcass cutting dataset. Data Brief. 2022 Feb 15;41:107945. doi: 10.1016/j.dib.2022.107945. eCollection 2022 Apr.
8. Experimental Setup for Evaluating Depth Sensors in Augmented Reality Technologies Used in Medical Devices. Sensors (Basel). 2024 Jun 17;24(12):3916. doi: 10.3390/s24123916.
9. Expanding the Detection of Traversable Area with RealSense for the Visually Impaired. Sensors (Basel). 2016 Nov 21;16(11):1954. doi: 10.3390/s16111954.
10. Intel® RealSense™ SR300 Coded Light Depth Camera. IEEE Trans Pattern Anal Mach Intell. 2020 Oct;42(10):2333-2345. doi: 10.1109/TPAMI.2019.2915841. Epub 2019 May 10.

Cited By

1. Indoor Mapping with Entertainment Devices: Evaluating the Impact of Different Mapping Strategies for Microsoft HoloLens 2 and Apple iPhone 14 Pro. Sensors (Basel). 2024 Feb 6;24(4):1062. doi: 10.3390/s24041062.
2. Comparative evaluation of three commercially available markerless depth sensors for close-range use in surgical simulation. Int J Comput Assist Radiol Surg. 2023 Jun;18(6):1109-1118. doi: 10.1007/s11548-023-02887-1. Epub 2023 May 4.

References

1. Metrological Characterization and Comparison of D415, D455, L515 RealSense Devices in the Close Range. Sensors (Basel). 2021 Nov 22;21(22):7770. doi: 10.3390/s21227770.
2. Translucency perception: A review. J Vis. 2021 Aug 2;21(8):4. doi: 10.1167/jov.21.8.4.
3. Collaborative VR-Based 3D Labeling of Live-Captured Scenes by Remote Users. IEEE Comput Graph Appl. 2021 Jul-Aug;41(4):90-98. doi: 10.1109/MCG.2021.3082267. Epub 2021 Jul 15.
4. Intel® RealSense™ SR300 Coded Light Depth Camera. IEEE Trans Pattern Anal Mach Intell. 2020 Oct;42(10):2333-2345. doi: 10.1109/TPAMI.2019.2915841. Epub 2019 May 10.
5. Metrological and Critical Characterization of the Intel D415 Stereo Depth Camera. Sensors (Basel). 2019 Jan 25;19(3):489. doi: 10.3390/s19030489.
6. Target enhanced 3D reconstruction based on polarization-coded structured light. Opt Express. 2017 Jan 23;25(2):1173-1184. doi: 10.1364/OE.25.001173.
7. Application of lidar techniques to time-of-flight range imaging. Appl Opt. 2015 Nov 20;54(33):9654-64. doi: 10.1364/AO.54.009654.
8. Optical Sensors and Methods for Underwater 3D Reconstruction. Sensors (Basel). 2015 Dec 15;15(12):31525-57. doi: 10.3390/s151229864.
9. Visual perception of materials and surfaces. Curr Biol. 2011 Dec 20;21(24):R978-83. doi: 10.1016/j.cub.2011.11.022.