Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video.

Affiliations

Department of Mechanical Engineering, University of Ottawa, Ottawa, ON K1N 6N5, Canada.

The Ottawa Hospital Research Institute, Ottawa, ON K1H 8M2, Canada.

Publication Information

Sensors (Basel). 2023 May 17;23(10):4839. doi: 10.3390/s23104839.

Abstract

Recent advancements in computing and artificial intelligence (AI) make it possible to quantitatively evaluate human movement from digital video, opening the possibility of more accessible gait analysis. The Edinburgh Visual Gait Score (EVGS) is an effective tool for observational gait analysis, but human scoring of videos can take over 20 min and requires experienced observers. This research developed an algorithmic implementation of the EVGS from handheld smartphone video to enable automatic scoring. Participants' walking was video-recorded at 60 Hz with a smartphone, and body keypoints were identified using the OpenPose BODY25 pose estimation model. An algorithm was developed to identify foot events and strides, and EVGS parameters were determined at the relevant gait events. Stride detection was accurate to within two to five frames. Agreement between the algorithmic and human-reviewer EVGS results was strong for 14 of the 17 parameters, and the algorithmic EVGS results were highly correlated (Pearson r > 0.80) with the ground-truth values for 8 of the 17 parameters. This approach could make gait analysis more accessible and cost-effective, particularly in areas without gait-assessment expertise. These findings pave the way for future studies exploring the use of smartphone video and AI algorithms for remote gait analysis.
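
The abstract outlines a pipeline of OpenPose BODY25 pose estimation, foot-event/stride detection, and EVGS parameter extraction, but does not state the event-detection rules used. As a rough illustration of how foot events might be found from BODY25 keypoints, the Python sketch below applies a common coordinate-based heuristic (heel strike where the heel is farthest ahead of the mid-hip, toe-off where the toe is farthest behind it). The function names, 0.6 s minimum event spacing, and walking-direction assumption are illustrative only and should not be read as the paper's published algorithm.

```python
# Minimal sketch: coordinate-based gait-event detection from OpenPose BODY25
# keypoints. Indices follow the standard BODY25 ordering; everything else
# (thresholds, function names) is an illustrative assumption.
import numpy as np
from scipy.signal import find_peaks

MID_HIP, R_BIG_TOE, R_HEEL = 8, 22, 24  # BODY25 joint indices


def detect_right_foot_events(keypoints: np.ndarray, fps: int = 60):
    """Return (heel_strike_frames, toe_off_frames) for the right foot.

    keypoints: array of shape (n_frames, 25, 3) holding (x, y, confidence)
    per BODY25 joint per frame; the subject is assumed to walk in the
    +x image direction.
    """
    hip_x = keypoints[:, MID_HIP, 0]
    heel_rel = keypoints[:, R_HEEL, 0] - hip_x     # heel position relative to hip
    toe_rel = keypoints[:, R_BIG_TOE, 0] - hip_x   # toe position relative to hip

    # Heel strike ~ heel maximally ahead of the hip; toe-off ~ toe maximally
    # behind the hip. A minimum spacing of ~0.6 s between events suppresses
    # spurious peaks caused by pose-estimation jitter.
    min_gap = int(0.6 * fps)
    heel_strikes, _ = find_peaks(heel_rel, distance=min_gap)
    toe_offs, _ = find_peaks(-toe_rel, distance=min_gap)
    return heel_strikes, toe_offs


def strides_from_heel_strikes(heel_strikes: np.ndarray):
    """A stride spans two consecutive ipsilateral heel strikes."""
    return list(zip(heel_strikes[:-1], heel_strikes[1:]))
```

Strides defined this way give the frames at which joint-angle and EVGS-style parameters could be sampled; agreement between algorithmic and human scores could then be checked with, for example, scipy.stats.pearsonr.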

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3da3/10220686/06133eec664e/sensors-23-04839-g001.jpg
