Suppr 超能文献



Automated classification of movement quality using the Microsoft Kinect V2 sensor.

Authors

Dajime Peter Fermin, Smith Heather, Zhang Yanxin

Affiliations

Department of Exercise Sciences, University of Auckland, New Zealand.

Publication

Comput Biol Med. 2020 Oct;125:104021. doi: 10.1016/j.compbiomed.2020.104021. Epub 2020 Sep 29.

DOI:10.1016/j.compbiomed.2020.104021
PMID:33011646
Abstract

Practitioners commonly perform movement quality assessment through qualitative assessment protocols, which can be time-intensive and prone to inter-rater measurement bias. The advent of portable and inexpensive marker-less motion capture systems can improve assessment through objective joint kinematic analysis. The current study aimed to evaluate various machine learning models that used kinematic features from Kinect position data to classify a performer's Movement Competency Screen (MCS) score. A Kinect V2 sensor collected position data from 31 physically active males as they performed bilateral squat, forward lunge, and single-leg squat; and the movement quality was rated according to the MCS criteria. Features were extracted and selected from domain knowledge-based kinematic variables as model input. Multiclass logistic regression (MLR) was then performed to translate joint kinematics into MCS score. Performance indicators were calculated after a 10-fold cross validation of each model developed from Kinect-based kinematic variables. The analyses revealed that the models' sensitivity, specificity, and accuracy ranged from 0.66 to 0.89, 0.58 to 0.86, and 0.74 to 0.85, respectively. In conclusion, the Kinect-based automated movement quality assessment is a suitable, novel, and practical approach to movement quality assessment.
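The modelling pipeline described in the abstract — multiclass logistic regression over kinematic features, evaluated with 10-fold cross-validation — can be sketched at a high level as follows. This is an illustrative assumption-laden sketch, not the study's code: the synthetic "kinematic feature" data, function names, and hyperparameters are all stand-ins.

```python
# Minimal sketch: multiclass logistic (softmax) regression with 10-fold
# cross-validation, mirroring the abstract's pipeline in outline only.
# All data here is synthetic; it is NOT the study's Kinect data.
import numpy as np

rng = np.random.default_rng(0)

def make_synthetic(n_per_class=30, n_features=4, n_classes=3):
    """Generate separable synthetic stand-in 'kinematic feature' data."""
    X, y = [], []
    for c in range(n_classes):
        center = rng.normal(size=n_features) * 3.0
        X.append(center + rng.normal(scale=0.8, size=(n_per_class, n_features)))
        y.extend([c] * n_per_class)
    return np.vstack(X), np.array(y)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_mlr(X, y, n_classes, lr=0.1, epochs=300):
    """Batch gradient descent on the softmax cross-entropy loss."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias column
    W = np.zeros((Xb.shape[1], n_classes))
    Y = np.eye(n_classes)[y]                     # one-hot labels
    for _ in range(epochs):
        P = softmax(Xb @ W)
        W -= lr * Xb.T @ (P - Y) / len(X)
    return W

def predict(W, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return softmax(Xb @ W).argmax(axis=1)

def cross_val_accuracy(X, y, n_classes=3, k=10):
    """k-fold CV: mean accuracy over held-out folds."""
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        W = fit_mlr(X[train], y[train], n_classes)
        accs.append(np.mean(predict(W, X[test]) == y[test]))
    return float(np.mean(accs))

X, y = make_synthetic()
acc = cross_val_accuracy(X, y)
```

The study additionally reports per-class sensitivity and specificity, which would be computed from the fold-level confusion matrices rather than the plain accuracy shown here.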


Similar articles

1. Automated classification of movement quality using the Microsoft Kinect V2 sensor.
Comput Biol Med. 2020 Oct;125:104021. doi: 10.1016/j.compbiomed.2020.104021. Epub 2020 Sep 29.

2. The validity of the first and second generation Microsoft Kinect™ for identifying joint center locations during static postures.
Appl Ergon. 2015 Jul;49:47-54. doi: 10.1016/j.apergo.2015.01.005. Epub 2015 Feb 17.

3. Kinect v2 tracked Body Joint Smoothing for Kinematic Analysis in Musculoskeletal Disorders.
Annu Int Conf IEEE Eng Med Biol Soc. 2020 Jul;2020:5769-5772. doi: 10.1109/EMBC44109.2020.9175492.

4. Gait assessment using the Microsoft Xbox One Kinect: Concurrent validity and inter-day reliability of spatiotemporal and kinematic variables.
J Biomech. 2015 Jul 16;48(10):2166-70. doi: 10.1016/j.jbiomech.2015.05.021. Epub 2015 May 28.

5. Digital data acquisition of shoulder range of motion and arm motion smoothness using Kinect v2.
J Shoulder Elbow Surg. 2017 May;26(5):895-901. doi: 10.1016/j.jse.2016.10.026. Epub 2017 Jan 25.

6. Agreement between Azure Kinect and Marker-Based Motion Analysis during Functional Movements: A Feasibility Study.
Sensors (Basel). 2022 Dec 14;22(24):9819. doi: 10.3390/s22249819.

7. Validation of the Microsoft Kinect® camera system for measurement of lower extremity jump landing and squatting kinematics.
Sports Biomech. 2016;15(1):89-102. doi: 10.1080/14763141.2015.1123766. Epub 2016 Feb 2.

8. Reliability and validity of the Kinect V2 for the assessment of lower extremity rehabilitation exercises.
Gait Posture. 2019 May;70:330-335. doi: 10.1016/j.gaitpost.2019.03.020. Epub 2019 Mar 26.

9. Validity of time series kinematical data as measured by a markerless motion capture system on a flatland for gait assessment.
J Biomech. 2018 Apr 11;71:281-285. doi: 10.1016/j.jbiomech.2018.01.035. Epub 2018 Feb 8.

10. Development and Validation of a Portable and Inexpensive Tool to Measure the Drop Vertical Jump Using the Microsoft Kinect V2.
Sports Health. 2017 Nov/Dec;9(6):537-544. doi: 10.1177/1941738117726323. Epub 2017 Aug 28.

Cited by

1. Development and validation of machine learning models for classifying cancer-related sarcopenia using Kinect-based mixed-reality exercises in breast cancer survivors.
Transl Cancer Res. 2025 Jul 30;14(7):4208-4218. doi: 10.21037/tcr-2024-2337. Epub 2025 Jul 22.

2. Azure Kinect performance evaluation for human motion and upper limb biomechanical analysis.
Heliyon. 2023 Nov 4;9(11):e21606. doi: 10.1016/j.heliyon.2023.e21606. eCollection 2023 Nov.

3. Technical aspects of virtual augmented reality-based rehabilitation systems for musculoskeletal disorders of the lower limbs: a systematic review.
BMC Musculoskelet Disord. 2023 Jan 3;24(1):4. doi: 10.1186/s12891-022-06062-6.

4. Human Movement Quality Assessment Using Sensor Technologies in Recreational and Professional Sports: A Scoping Review.
Sensors (Basel). 2022 Jun 24;22(13):4786. doi: 10.3390/s22134786.