Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2.

Affiliation

Institute of Robotics and Cybernetics, Faculty of Electrical Engineering and Information Technology STU in Bratislava, Ilkovičova 3, 812 19 Bratislava, Slovakia.

Publication information

Sensors (Basel). 2021 Jan 8;21(2):413. doi: 10.3390/s21020413.

DOI: 10.3390/s21020413
PMID: 33430149
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7827245/
Abstract

The Azure Kinect is the successor of Kinect v1 and Kinect v2. In this paper we perform brief data analysis and comparison of all Kinect versions with focus on precision (repeatability) and various aspects of noise of these three sensors. Then we thoroughly evaluate the new Azure Kinect; namely its warm-up time, precision (and sources of its variability), accuracy (thoroughly, using a robotic arm), reflectivity (using 18 different materials), and the multipath and flying pixel phenomenon. Furthermore, we validate its performance in both indoor and outdoor environments, including direct and indirect sun conditions. We conclude with a discussion on its improvements in the context of the evolution of the Kinect sensor. It was shown that it is crucial to choose well designed experiments to measure accuracy, since the RGB and depth camera are not aligned. Our measurements confirm the officially stated values, namely standard deviation ≤17 mm, and distance error <11 mm in up to 3.5 meters distance from the sensor in all four supported modes. The device, however, has to be warmed up for at least 40-50 min to give stable results. Due to the time-of-flight technology, the Azure Kinect cannot be reliably used in direct sunlight. Therefore, it is convenient mostly for indoor applications.
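
The precision and accuracy figures quoted in the abstract (per-pixel standard deviation of repeated depth measurements, and distance error against a known reference) can be reproduced on raw depth frames with a few lines of NumPy. The sketch below is illustrative only and is not the authors' evaluation pipeline; it assumes depth frames of a static planar target have already been captured (for example with the Azure Kinect Sensor SDK) and saved as 16-bit depth maps in millimetres, and that the reference distance, the file pattern, and the ROI window are hypothetical placeholders.

```python
# Minimal sketch (not the authors' pipeline): estimate depth-camera precision
# (per-pixel standard deviation over repeated frames) and distance error
# against a known reference distance, from pre-captured Azure Kinect depth maps.
# Assumptions: frames are 16-bit images in millimetres of a static planar target,
# and TRUE_DISTANCE_MM was measured independently (e.g. with a robotic arm).
import glob

import numpy as np
import imageio.v3 as iio

TRUE_DISTANCE_MM = 2000.0                   # hypothetical ground-truth distance
ROI = (slice(236, 276), slice(300, 340))    # central 40x40 px window, arbitrary

# Stack N repeated frames of the same static scene: shape (N, H, W), in mm.
frames = np.stack([iio.imread(p).astype(np.float64)
                   for p in sorted(glob.glob("depth_*.png"))])

roi = frames[(slice(None),) + ROI]          # (N, 40, 40) depth samples in mm
valid = roi > 0                             # a value of 0 marks invalid pixels

# Precision (repeatability): std of each pixel over time, averaged in the ROI.
per_pixel_std = np.std(roi, axis=0, where=valid)
print(f"mean per-pixel std: {np.nanmean(per_pixel_std):.2f} mm")

# Accuracy: offset of the temporal-mean depth from the reference distance.
mean_depth = np.mean(roi, axis=0, where=valid)
print(f"distance error: {np.nanmean(mean_depth) - TRUE_DISTANCE_MM:.2f} mm")
```

Logging the mean ROI depth against wall-clock time in the same way is one simple way to visualise the 40-50 min warm-up drift the authors report.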


[Figures 1-31 of the article are available in the PMC full text (PMC7827245).]

Similar articles

1. Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2. Sensors (Basel). 2021 Jan 8;21(2):413. doi: 10.3390/s21020413.
2. Evaluating the Accuracy of the Azure Kinect and Kinect v2. Sensors (Basel). 2022 Mar 23;22(7):2469. doi: 10.3390/s22072469.
3. Indoor 3D Reconstruction of Buildings via Azure Kinect RGB-D Camera. Sensors (Basel). 2022 Nov 27;22(23):9222. doi: 10.3390/s22239222.
4. Evaluating Automatic Body Orientation Detection for Indoor Location from Skeleton Tracking Data to Detect Socially Occupied Spaces Using the Kinect v2, Azure Kinect and Zed 2i. Sensors (Basel). 2022 May 17;22(10):3798. doi: 10.3390/s22103798.
5. Microsoft Azure Kinect Calibration for Three-Dimensional Dense Point Clouds and Reliable Skeletons. Sensors (Basel). 2022 Jul 1;22(13):4986. doi: 10.3390/s22134986.
6. 3D object detection through fog and occlusion: passive integral imaging vs active (LiDAR) sensing. Opt Express. 2023 Jan 2;31(1):479-491. doi: 10.1364/OE.478125.
7. SU-E-I-92: Accuracy Evaluation of Depth Data in Microsoft Kinect. Med Phys. 2012 Jun;39(6Part5):3646. doi: 10.1118/1.4734809.
8. Evaluation of the Pose Tracking Performance of the Azure Kinect and Kinect v2 for Gait Analysis in Comparison with a Gold Standard: A Pilot Study. Sensors (Basel). 2020 Sep 8;20(18):5104. doi: 10.3390/s20185104.
9. Ground reaction force and joint moment estimation during gait using an Azure Kinect-driven musculoskeletal modeling approach. Gait Posture. 2022 Jun;95:49-55. doi: 10.1016/j.gaitpost.2022.04.005. Epub 2022 Apr 9.
10. Postural control assessment via Microsoft Azure Kinect DK: An evaluation study. Comput Methods Programs Biomed. 2021 Sep;209:106324. doi: 10.1016/j.cmpb.2021.106324. Epub 2021 Aug 4.

Cited by

1. The State of the Art and Potentialities of UAV-Based 3D Measurement Solutions in the Monitoring and Fault Diagnosis of Quasi-Brittle Structures. Sensors (Basel). 2025 Aug 19;25(16):5134. doi: 10.3390/s25165134.
2. Validity and reliability of single camera markerless motion capture systems with RGB-D sensors for measuring shoulder range-of-motion: a systematic review. Front Bioeng Biotechnol. 2025 May 23;13:1570637. doi: 10.3389/fbioe.2025.1570637. eCollection 2025.
3. Analysis of Kinect-Based Human Motion Capture Accuracy Using Skeletal Cosine Similarity Metrics. Sensors (Basel). 2025 Feb 10;25(4):1047. doi: 10.3390/s25041047.
4. Improving 3D Reconstruction Through RGB-D Sensor Noise Modeling. Sensors (Basel). 2025 Feb 5;25(3):950. doi: 10.3390/s25030950.
5. Estimation of gait parameters in healthy and hemiplegic individuals using Azure Kinect: a comparative study with the optoelectronic system. Front Bioeng Biotechnol. 2024 Nov 25;12:1449680. doi: 10.3389/fbioe.2024.1449680. eCollection 2024.
6. HoLLiECares - Development of a multi-functional robot for professional care. Front Robot AI. 2024 Oct 9;11:1325143. doi: 10.3389/frobt.2024.1325143. eCollection 2024.
7. A Study on the 3D Reconstruction Strategy of a Sheep Body Based on a Kinect v2 Depth Camera Array. Animals (Basel). 2024 Aug 23;14(17):2457. doi: 10.3390/ani14172457.
8. Towards comparable quality-assured Azure Kinect body tracking results in a study setting-Influence of light. PLoS One. 2024 Aug 9;19(8):e0308416. doi: 10.1371/journal.pone.0308416. eCollection 2024.
9. 3D object reconstruction: A comprehensive view-dependent dataset. Data Brief. 2024 Jun 2;55:110569. doi: 10.1016/j.dib.2024.110569. eCollection 2024 Aug.
10. Visual Feedback and Guided Balance Training in an Immersive Virtual Reality Environment for Lower Extremity Rehabilitation. Comput Graph. 2024 Apr;119. doi: 10.1016/j.cag.2024.01.007. Epub 2024 Feb 1.

References

1. Using New Camera-Based Technologies for Gait Analysis in Older Adults in Comparison to the Established GAITRite System. Sensors (Basel). 2019 Dec 24;20(1):125. doi: 10.3390/s20010125.
2. Suitability of the Kinect Sensor and Leap Motion Controller-A Literature Review. Sensors (Basel). 2019 Mar 2;19(5):1072. doi: 10.3390/s19051072.
3. Calibration of Kinect for Xbox One and Comparison between the Two Generations of Microsoft Sensors. Sensors (Basel). 2015 Oct 30;15(11):27569-89. doi: 10.3390/s151127569.
4. Statistical analysis-based error models for the Microsoft Kinect(TM) depth sensor. Sensors (Basel). 2014 Sep 18;14(9):17430-50. doi: 10.3390/s140917430.