Suppr 超能文献


How the Processing Mode Influences Azure Kinect Body Tracking Results.

Affiliations

Assistance Systems and Medical Device Technology, Department for Health Services Research, School of Medicine and Health Sciences, Carl von Ossietzky University, Ammerländer Heerstraße 114-118, 26129 Oldenburg, Germany.

Geriatric Medicine, Department for Health Services Research, School of Medicine and Health Sciences, Carl von Ossietzky University, Ammerländer Heerstraße 114-118, 26129 Oldenburg, Germany.

Publication Information

Sensors (Basel). 2023 Jan 12;23(2):878. doi: 10.3390/s23020878.

DOI: 10.3390/s23020878
PMID: 36679675
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9860777/
Abstract

The Azure Kinect DK is an RGB-D camera popular in research and in studies with humans. For good scientific practice, it is relevant that Azure Kinect yields consistent and reproducible results. We noticed that the yielded results were inconsistent. Therefore, we examined 100 body tracking runs per processing mode provided by the Azure Kinect Body Tracking SDK on two different computers using a prerecorded video. We compared those runs with respect to spatiotemporal progression (spatial distribution of joint positions per processing mode and run), derived parameters (bone length), and differences between the computers. We found a previously undocumented converging behavior of joint positions at the start of the body tracking. Euclidean distances of joint positions varied between runs by clinically relevant amounts of up to 87 mm for CUDA and TensorRT; CPU and DirectML showed no differences on the same computer. Additionally, we found noticeable differences between the two computers. Therefore, we recommend choosing the processing mode carefully, reporting the processing mode, and performing all analyses on the same computer to ensure reproducible results when using Azure Kinect and its body tracking in research. Consequently, results from previous studies with Azure Kinect should be reevaluated, and until then, their findings should be interpreted with caution.
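As an illustration of the abstract's inter-run comparison, the sketch below computes per-joint Euclidean distances between two body-tracking runs of the same recording and flags joints that deviate beyond a chosen threshold. The joint names, coordinates, and threshold are invented for the example and are not actual SDK output or values from the paper.

```python
import math

# Hypothetical joint positions (mm) for the same frame from two body-tracking
# runs of the same prerecorded video (example data, not real SDK output).
run_a = {"PELVIS": (10.0, 500.0, 2000.0), "KNEE_LEFT": (120.0, 900.0, 2010.0)}
run_b = {"PELVIS": (12.0, 503.0, 1998.0), "KNEE_LEFT": (180.0, 960.0, 2040.0)}

def euclidean(p, q):
    """3D Euclidean distance between two joint positions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Flag joints whose inter-run deviation exceeds a chosen threshold; the paper
# observed deviations of up to 87 mm between runs for CUDA and TensorRT.
THRESHOLD_MM = 10.0
for joint in run_a:
    d = euclidean(run_a[joint], run_b[joint])
    status = "INCONSISTENT" if d > THRESHOLD_MM else "ok"
    print(f"{joint}: {d:.1f} mm ({status})")
```

Running the full per-mode comparison would repeat this over all joints, frames, and 100 runs, as the authors did.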

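The derived bone-length parameter mentioned in the abstract can be sketched as follows: a real bone has constant length, so frame-to-frame spread in the computed segment length is a proxy for tracking inconsistency. The joint coordinates below are made up for the example.

```python
import math
import statistics

def bone_length(a, b):
    """Length (mm) of the bone segment spanned by two 3D joint positions."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical per-frame left hip/knee positions (mm); in reality the femur
# length is constant, so any spread reflects body-tracking noise.
frames = [
    ((100.0, 800.0, 2000.0), (105.0, 1198.0, 2003.0)),
    ((101.0, 802.0, 1999.0), (98.0, 1205.0, 2001.0)),
    ((99.0, 799.0, 2001.0), (110.0, 1190.0, 2005.0)),
]
lengths = [bone_length(hip, knee) for hip, knee in frames]
print(f"mean {statistics.mean(lengths):.1f} mm, stdev {statistics.stdev(lengths):.1f} mm")
```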

Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/1fe2f484ad81/sensors-23-00878-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/8519b6d58aee/sensors-23-00878-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/724113ac9eab/sensors-23-00878-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/592213067828/sensors-23-00878-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/b245b409b739/sensors-23-00878-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/d9b6e799ff08/sensors-23-00878-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/dd9f7685eca6/sensors-23-00878-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/899f7f6b4d9b/sensors-23-00878-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/c786a4d73d79/sensors-23-00878-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/6624ba0c2fd1/sensors-23-00878-g010a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/65c561ded8e4/sensors-23-00878-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/31341e379a1b/sensors-23-00878-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/4c95c795cd72/sensors-23-00878-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/085e8b7b34c9/sensors-23-00878-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/c7abb615b2a4/sensors-23-00878-g0A1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/5f1501f19db5/sensors-23-00878-g0A2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/4caf247765bd/sensors-23-00878-g0A3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/6de5ecc889b9/sensors-23-00878-g0A4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07a4/9860777/d2d527466fd8/sensors-23-00878-g0A5.jpg

Similar Articles

1
How the Processing Mode Influences Azure Kinect Body Tracking Results.
Sensors (Basel). 2023 Jan 12;23(2):878. doi: 10.3390/s23020878.
2
Comparison of Azure Kinect overground gait spatiotemporal parameters to marker based optical motion capture.
Gait Posture. 2022 Jul;96:130-136. doi: 10.1016/j.gaitpost.2022.05.021. Epub 2022 May 21.
3
Towards comparable quality-assured Azure Kinect body tracking results in a study setting-Influence of light.
PLoS One. 2024 Aug 9;19(8):e0308416. doi: 10.1371/journal.pone.0308416. eCollection 2024.
4
Postural control assessment via Microsoft Azure Kinect DK: An evaluation study.
Comput Methods Programs Biomed. 2021 Sep;209:106324. doi: 10.1016/j.cmpb.2021.106324. Epub 2021 Aug 4.
5
Evaluating Automatic Body Orientation Detection for Indoor Location from Skeleton Tracking Data to Detect Socially Occupied Spaces Using the Kinect v2, Azure Kinect and Zed 2i.
Sensors (Basel). 2022 May 17;22(10):3798. doi: 10.3390/s22103798.
6
Evaluation of the Pose Tracking Performance of the Azure Kinect and Kinect v2 for Gait Analysis in Comparison with a Gold Standard: A Pilot Study.
Sensors (Basel). 2020 Sep 8;20(18):5104. doi: 10.3390/s20185104.
7
A dataset of human body tracking of walking actions captured using two Azure Kinect sensors.
Data Brief. 2023 Jun 22;49:109334. doi: 10.1016/j.dib.2023.109334. eCollection 2023 Aug.
8
Effects of camera viewing angles on tracking kinematic gait patterns using Azure Kinect, Kinect v2 and Orbbec Astra Pro v2.
Gait Posture. 2021 Jun;87:19-26. doi: 10.1016/j.gaitpost.2021.04.005. Epub 2021 Apr 5.
9
Wrist motion assessment using Microsoft Azure Kinect DK: A reliability study in healthy individuals.
Adv Clin Exp Med. 2023 Feb;32(2):203-209. doi: 10.17219/acem/152884.
10
Ground reaction force and joint moment estimation during gait using an Azure Kinect-driven musculoskeletal modeling approach.
Gait Posture. 2022 Jun;95:49-55. doi: 10.1016/j.gaitpost.2022.04.005. Epub 2022 Apr 9.

Cited By

1
Analysis of Kinect-Based Human Motion Capture Accuracy Using Skeletal Cosine Similarity Metrics.
Sensors (Basel). 2025 Feb 10;25(4):1047. doi: 10.3390/s25041047.
2
Analysis of Factors Influencing the Precision of Body Tracking Outcomes in Industrial Gesture Control.
Sensors (Basel). 2024 Sep 12;24(18):5919. doi: 10.3390/s24185919.
3
Towards comparable quality-assured Azure Kinect body tracking results in a study setting-Influence of light.
PLoS One. 2024 Aug 9;19(8):e0308416. doi: 10.1371/journal.pone.0308416. eCollection 2024.
4
Evaluating Desk-Assisted Standing Techniques for Simulated Pregnant Conditions: An Experimental Study Using a Maternity-Simulation Jacket.
Healthcare (Basel). 2024 May 1;12(9):931. doi: 10.3390/healthcare12090931.

References

1
Evaluation of Arm Swing Features and Asymmetry during Gait in Parkinson's Disease Using the Azure Kinect Sensor.
Sensors (Basel). 2022 Aug 21;22(16):6282. doi: 10.3390/s22166282.
2
Comparison of Azure Kinect overground gait spatiotemporal parameters to marker based optical motion capture.
Gait Posture. 2022 Jul;96:130-136. doi: 10.1016/j.gaitpost.2022.05.021. Epub 2022 May 21.
3
Evaluating the Accuracy of the Azure Kinect and Kinect v2.
Sensors (Basel). 2022 Mar 23;22(7):2469. doi: 10.3390/s22072469.
4
Placement Recommendations for Single Kinect-Based Motion Capture System in Unilateral Dynamic Motion Analysis.
Healthcare (Basel). 2021 Aug 21;9(8):1076. doi: 10.3390/healthcare9081076.
5
Effects of camera viewing angles on tracking kinematic gait patterns using Azure Kinect, Kinect v2 and Orbbec Astra Pro v2.
Gait Posture. 2021 Jun;87:19-26. doi: 10.1016/j.gaitpost.2021.04.005. Epub 2021 Apr 5.
6
Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2.
Sensors (Basel). 2021 Jan 8;21(2):413. doi: 10.3390/s21020413.
7
Evaluation of the Pose Tracking Performance of the Azure Kinect and Kinect v2 for Gait Analysis in Comparison with a Gold Standard: A Pilot Study.
Sensors (Basel). 2020 Sep 8;20(18):5104. doi: 10.3390/s20185104.
8
Confidence analysis of standard deviational ellipse and its extension into higher dimensional euclidean space.
PLoS One. 2015 Mar 13;10(3):e0118537. doi: 10.1371/journal.pone.0118537. eCollection 2015.