


A dataset of human body tracking of walking actions captured using two Azure Kinect sensors.

Authors

Posner Charli, Sánchez-Mompó Adrián, Mavromatis Ioannis, Al-Ani Mustafa

Affiliation

Bristol Research and Innovation Laboratory, Toshiba Europe Ltd., 32 Queen Square, Bristol, BS1 4ND, United Kingdom.

Publication

Data Brief. 2023 Jun 22;49:109334. doi: 10.1016/j.dib.2023.109334. eCollection 2023 Aug.

DOI: 10.1016/j.dib.2023.109334
PMID: 37600140
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10439293/
Abstract

A dataset of body tracking information is presented. The dataset consists of 315 captured walking sequences. Each sequence is simultaneously captured by two Azure Kinect devices, and the two captures are interleaved to effectively double the frame rate. Fifteen participants took part in the experiment. Each experiment consists of seven walking actions with three predefined trajectories, resulting in 21 sequences per participant. The data were collected using the Azure Kinect Sensor SDK and later processed using the official tools and libraries provided by Microsoft. For each sequence and trajectory, the positions and orientations of thirty-two tracked joints were obtained and saved. The dataset is structured as follows. The experiments from each subject are saved in a single directory. Each directory contains multiple JSON files of timestamped body tracking information to enable the fusion of the two device streams. A calibration file is also provided, enabling the mapping of coordinates between the two Azure Kinect devices capturing the data (mapping the coordinates of the device known as the Subordinate device to the Master device coordinate system). These data can be used to train neural networks for human motion prediction tasks or to test pre-existing algorithms on Azure Kinect data. The dataset could also aid in gait recognition and analysis, as well as in action recognition and other surveillance applications. The dataset can be found at https://zenodo.org/record/7997856.
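The abstract describes per-subject directories of timestamped JSON body-tracking files from two devices, plus a calibration file mapping Subordinate-device coordinates into the Master device's frame. The following is a minimal sketch of how those pieces could be consumed: the JSON key names and the frame/joint layout are assumptions for illustration, not the dataset's documented schema.

```python
import json


def load_frames(path):
    """Load one device's body-tracking JSON: assumed here to be a list of
    frames, each carrying a capture timestamp and 32 joints (position and
    orientation). The actual field names in the dataset may differ."""
    with open(path) as f:
        return json.load(f)


def interleave(master_frames, sub_frames, key="timestamp_usec"):
    """Fuse the two device streams by sorting frames on their timestamps,
    which is how interleaving effectively doubles the frame rate."""
    merged = master_frames + sub_frames
    merged.sort(key=lambda frame: frame[key])
    return merged


def apply_calibration(joint_xyz, rotation, translation):
    """Map one Subordinate-device joint position into the Master device's
    coordinate system with a rigid transform: x' = R @ x + t, where R is a
    3x3 rotation matrix and t a 3-vector from the calibration file."""
    return [
        sum(rotation[i][j] * v for j, v in enumerate(joint_xyz)) + translation[i]
        for i in range(3)
    ]
```

For example, with an identity rotation and a translation of (10, 0, -1), a joint at (1, 2, 3) in the Subordinate frame maps to (11, 2, 2) in the Master frame. Sorting on a shared timestamp, rather than alternating frames blindly, keeps the fused stream correct even if either device drops frames.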


Figures (gr1–gr11):

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/d9e8706f86f5/gr1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/5e118f623d8b/gr2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/55ad4a896344/gr3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/d7ec1aad8f77/gr4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/e1c5da55d3de/gr5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/2d02adaa42a5/gr6.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/0006ad6c2d7e/gr7.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/5e9fdb6e7354/gr8.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/40464cbed2be/gr9.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/c012236d4f68/gr10.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/41f7/10439293/d2c3e3136943/gr11.jpg

Similar Articles

1. A dataset of human body tracking of walking actions captured using two Azure Kinect sensors. Data Brief. 2023 Jun 22;49:109334. doi: 10.1016/j.dib.2023.109334. eCollection 2023 Aug.
2. Comparison of Azure Kinect overground gait spatiotemporal parameters to marker based optical motion capture. Gait Posture. 2022 Jul;96:130-136. doi: 10.1016/j.gaitpost.2022.05.021. Epub 2022 May 21.
3. How the Processing Mode Influences Azure Kinect Body Tracking Results. Sensors (Basel). 2023 Jan 12;23(2):878. doi: 10.3390/s23020878.
4. Microsoft Azure Kinect Calibration for Three-Dimensional Dense Point Clouds and Reliable Skeletons. Sensors (Basel). 2022 Jul 1;22(13):4986. doi: 10.3390/s22134986.
5. Evaluation of the Pose Tracking Performance of the Azure Kinect and Kinect v2 for Gait Analysis in Comparison with a Gold Standard: A Pilot Study. Sensors (Basel). 2020 Sep 8;20(18):5104. doi: 10.3390/s20185104.
6. Effects of camera viewing angles on tracking kinematic gait patterns using Azure Kinect, Kinect v2 and Orbbec Astra Pro v2. Gait Posture. 2021 Jun;87:19-26. doi: 10.1016/j.gaitpost.2021.04.005. Epub 2021 Apr 5.
7. Ground reaction force and joint moment estimation during gait using an Azure Kinect-driven musculoskeletal modeling approach. Gait Posture. 2022 Jun;95:49-55. doi: 10.1016/j.gaitpost.2022.04.005. Epub 2022 Apr 9.
8. Evaluating Automatic Body Orientation Detection for Indoor Location from Skeleton Tracking Data to Detect Socially Occupied Spaces Using the Kinect v2, Azure Kinect and Zed 2i. Sensors (Basel). 2022 May 17;22(10):3798. doi: 10.3390/s22103798.
9. Postural control assessment via Microsoft Azure Kinect DK: An evaluation study. Comput Methods Programs Biomed. 2021 Sep;209:106324. doi: 10.1016/j.cmpb.2021.106324. Epub 2021 Aug 4.
10. Accuracy of the Microsoft Kinect for measuring gait parameters during treadmill walking. Gait Posture. 2015 Jul;42(2):145-51. doi: 10.1016/j.gaitpost.2015.05.002. Epub 2015 May 14.