

Evaluating Automatic Body Orientation Detection for Indoor Location from Skeleton Tracking Data to Detect Socially Occupied Spaces Using the Kinect v2, Azure Kinect and Zed 2i.

Affiliations

Spatial Intelligence Lab, Institute for Geoinformatics, University of Münster, 48149 Muenster, Germany.

Publication Info

Sensors (Basel). 2022 May 17;22(10):3798. doi: 10.3390/s22103798.

DOI: 10.3390/s22103798
PMID: 35632211
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9146255/
Abstract

Analysing the dynamics in social interactions in indoor spaces entails evaluating spatial-temporal variables from the event, such as location and time. Additionally, social interactions include invisible spaces that we unconsciously acknowledge due to social constraints, e.g., space between people having a conversation with each other. Nevertheless, current sensor arrays focus on detecting the physically occupied spaces from social interactions, i.e., areas inhabited by physically measurable objects. Our goal is to detect the socially occupied spaces, i.e., spaces not physically occupied by subjects and objects but inhabited by the interaction they sustain. We evaluate the social representation of the space structure between two or more active participants, so-called F-Formation for small gatherings. We propose calculating body orientation and location from skeleton joint data sets by integrating depth cameras. The body orientation is derived by integrating the shoulders and spine joint data with head/face rotation data and spatial-temporal information from trajectories. From the physically occupied measurements, we can detect socially occupied spaces. In our user study implementing the system, we compared the capabilities and skeleton tracking datasets from three depth camera sensors, the Kinect v2, Azure Kinect, and Zed 2i. We collected 32 walking patterns for individual and dyad configurations and evaluated the system's accuracy regarding the intended and socially accepted orientations. Experimental results show accuracy above 90% for the Kinect v2, 96% for the Azure Kinect, and 89% for the Zed 2i for assessing socially relevant body orientation. Our algorithm contributes to the anonymous and automated assessment of socially occupied spaces. The depth sensor system is promising in detecting more complex social structures. These findings impact research areas that study group interactions within complex indoor settings.
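The orientation step described in the abstract lends itself to a short illustration. Below is a minimal sketch in Python of deriving a torso yaw from two shoulder joints and fusing it with the walking direction from the trajectory. The function names, the (x, z) ground-plane coordinate convention, the rotation sign, and the min_speed threshold are illustrative assumptions, not the paper's implementation; the paper additionally fuses head/face rotation data, which this sketch omits.

```python
import math

def shoulder_yaw_deg(l_shoulder, r_shoulder):
    """Torso facing direction (yaw, degrees) from two shoulder joints.

    l_shoulder / r_shoulder: assumed (x, z) ground-plane positions in
    metres, taken from a depth camera's skeleton frame. The facing
    direction is taken as the normal of the left->right shoulder
    vector; the sign of the 90-degree rotation is convention-dependent.
    """
    dx = r_shoulder[0] - l_shoulder[0]
    dz = r_shoulder[1] - l_shoulder[1]
    fx, fz = dz, -dx  # shoulder vector rotated 90 deg on the ground plane
    return math.degrees(math.atan2(fx, fz)) % 360.0

def fused_yaw_deg(shoulder_yaw, prev_pos, cur_pos, dt, min_speed=0.3):
    """Blend torso yaw with the heading of motion from the trajectory.

    When the subject moves faster than min_speed (m/s, an assumed
    threshold), trust the direction of travel; otherwise keep the
    skeleton-derived yaw.
    """
    dx = cur_pos[0] - prev_pos[0]
    dz = cur_pos[1] - prev_pos[1]
    speed = math.hypot(dx, dz) / dt
    if speed < min_speed:
        return shoulder_yaw
    return math.degrees(math.atan2(dx, dz)) % 360.0
```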

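Given per-person positions and yaws, detecting a dyadic F-formation amounts to checking that the two facing directions converge on a shared o-space. The sketch below is a hypothetical decision rule under assumed distance (reach) and angle (tol_deg) tolerances; the abstract does not spell out the paper's actual criteria.

```python
import math

def o_space_center(p1, yaw1, p2, yaw2, reach=1.2, tol_deg=45.0):
    """Rough check for a dyadic F-formation (vis-a-vis arrangement).

    p1/p2 are (x, z) ground-plane positions in metres; yaw1/yaw2 are
    facing angles in degrees (same convention as shoulder_yaw_deg
    above). reach and tol_deg are assumed tolerances. Returns the
    o-space centre, or None if the pair is not in a facing formation.
    """
    def bearing(a, b):
        # Direction from a to b, in the same atan2(dx, dz) convention.
        return math.degrees(math.atan2(b[0] - a[0], b[1] - a[1])) % 360.0

    def ang_diff(a, b):
        # Smallest absolute difference between two angles, in degrees.
        return abs((a - b + 180.0) % 360.0 - 180.0)

    dist = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    facing_each_other = (
        ang_diff(yaw1, bearing(p1, p2)) <= tol_deg
        and ang_diff(yaw2, bearing(p2, p1)) <= tol_deg
    )
    if dist <= 2 * reach and facing_each_other:
        # Midpoint of the dyad as the centre of the socially occupied space.
        return ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)
    return None
```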

Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/8aa16cd29be5/sensors-22-03798-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/66b43600c3d8/sensors-22-03798-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/541ccaaf91d2/sensors-22-03798-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/662a81abc942/sensors-22-03798-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/307bdca14542/sensors-22-03798-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/ef592694ba2c/sensors-22-03798-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/f9e929ac0707/sensors-22-03798-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/d84996b24bb0/sensors-22-03798-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/fdf70bcc5bea/sensors-22-03798-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/7c28cb43256b/sensors-22-03798-g010a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/2096099f21c4/sensors-22-03798-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/1d13a3d250be/sensors-22-03798-g012a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/3530eee20422/sensors-22-03798-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/b6366c1e600a/sensors-22-03798-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/3e08dbe08eae/sensors-22-03798-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/807f/9146255/81c81241044f/sensors-22-03798-g016.jpg

Similar Articles

1. Evaluating Automatic Body Orientation Detection for Indoor Location from Skeleton Tracking Data to Detect Socially Occupied Spaces Using the Kinect v2, Azure Kinect and Zed 2i.
Sensors (Basel). 2022 May 17;22(10):3798. doi: 10.3390/s22103798.
2. Effects of camera viewing angles on tracking kinematic gait patterns using Azure Kinect, Kinect v2 and Orbbec Astra Pro v2.
Gait Posture. 2021 Jun;87:19-26. doi: 10.1016/j.gaitpost.2021.04.005. Epub 2021 Apr 5.
3. Evaluation of the Pose Tracking Performance of the Azure Kinect and Kinect v2 for Gait Analysis in Comparison with a Gold Standard: A Pilot Study.
Sensors (Basel). 2020 Sep 8;20(18):5104. doi: 10.3390/s20185104.
4. Evaluating the Accuracy of the Azure Kinect and Kinect v2.
Sensors (Basel). 2022 Mar 23;22(7):2469. doi: 10.3390/s22072469.
5. Ground reaction force and joint moment estimation during gait using an Azure Kinect-driven musculoskeletal modeling approach.
Gait Posture. 2022 Jun;95:49-55. doi: 10.1016/j.gaitpost.2022.04.005. Epub 2022 Apr 9.
6. Comparison of Azure Kinect overground gait spatiotemporal parameters to marker based optical motion capture.
Gait Posture. 2022 Jul;96:130-136. doi: 10.1016/j.gaitpost.2022.05.021. Epub 2022 May 21.
7. Markerless Knee Joint Position Measurement Using Depth Data during Stair Walking.
Sensors (Basel). 2017 Nov 22;17(11):2698. doi: 10.3390/s17112698.
8. How the Processing Mode Influences Azure Kinect Body Tracking Results.
Sensors (Basel). 2023 Jan 12;23(2):878. doi: 10.3390/s23020878.
9. Three-dimensional cameras and skeleton pose tracking for physical function assessment: A review of uses, validity, current developments and Kinect alternatives.
Gait Posture. 2019 Feb;68:193-200. doi: 10.1016/j.gaitpost.2018.11.029. Epub 2018 Nov 22.
10. Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2.
Sensors (Basel). 2021 Jan 8;21(2):413. doi: 10.3390/s21020413.

Cited By

1. Analysis of Kinect-Based Human Motion Capture Accuracy Using Skeletal Cosine Similarity Metrics.
Sensors (Basel). 2025 Feb 10;25(4):1047. doi: 10.3390/s25041047.
2. On the Evaluation of Diverse Vision Systems towards Detecting Human Pose in Collaborative Robot Applications.
Sensors (Basel). 2024 Jan 17;24(2):578. doi: 10.3390/s24020578.
3. The Performance of Inertial Measurement Unit Sensors on Various Hardware Platforms for Binaural Head-Tracking Applications.
Sensors (Basel). 2023 Jan 12;23(2):872. doi: 10.3390/s23020872.
4. Reliability of 3D Depth Motion Sensors for Capturing Upper Body Motions and Assessing the Quality of Wheelchair Transfers.
Sensors (Basel). 2022 Jun 30;22(13):4977. doi: 10.3390/s22134977.
5. Intelligent Sensors for Human Motion Analysis.
Sensors (Basel). 2022 Jun 30;22(13):4952. doi: 10.3390/s22134952.

References

1. Effects of camera viewing angles on tracking kinematic gait patterns using Azure Kinect, Kinect v2 and Orbbec Astra Pro v2.
Gait Posture. 2021 Jun;87:19-26. doi: 10.1016/j.gaitpost.2021.04.005. Epub 2021 Apr 5.
2. Improved Action Recognition with Separable Spatio-Temporal Attention Using Alternative Skeletal and Video Pre-Processing.
Sensors (Basel). 2021 Feb 2;21(3):1005. doi: 10.3390/s21031005.
3. Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2.
Sensors (Basel). 2021 Jan 8;21(2):413. doi: 10.3390/s21020413.
4. Evaluation of the Pose Tracking Performance of the Azure Kinect and Kinect v2 for Gait Analysis in Comparison with a Gold Standard: A Pilot Study.
Sensors (Basel). 2020 Sep 8;20(18):5104. doi: 10.3390/s20185104.
5. A multi-camera dataset for depth estimation in an indoor scenario.
Data Brief. 2019 Oct 7;27:104619. doi: 10.1016/j.dib.2019.104619. eCollection 2019 Dec.
6. A social interaction field model accurately identifies static and dynamic social groupings.
Nat Hum Behav. 2019 Aug;3(8):847-855. doi: 10.1038/s41562-019-0618-2. Epub 2019 Jun 10.