Cheng Qiushuo, Morgan Catherine, Sikdar Arindam, Masullo Alessandro, Whone Alan, Mirmehdi Majid
Faculty of Engineering, University of Bristol, UK.
Translational Health Sciences, University of Bristol, UK; North Bristol NHS Trust, Southmead Hospital, Bristol, UK.
Artif Intell Med. 2025 Sep;167:103194. doi: 10.1016/j.artmed.2025.103194. Epub 2025 Jun 18.
People with Parkinson's Disease (PD) often experience progressively worsening gait, including changes in how they turn, as the disease progresses. Existing clinical rating tools cannot capture hour-by-hour variations of PD symptoms, as they are confined to brief assessments within clinic settings, leaving gait performance outside these controlled environments unaccounted for. Measuring turning angles continuously and passively is a component step towards using gait characteristics as sensitive indicators of disease progression in PD. This paper presents a deep learning-based approach to automatically quantify turning angles by extracting 3D skeletons from videos and calculating the rotation of hip and knee joints. We apply advanced human pose estimation models, Fastpose and Strided Transformer, to a total of 1386 turning video clips from 24 subjects (12 people with PD and 12 healthy control volunteers), trimmed from a PD dataset of unscripted free-living videos recorded in a home-like setting (Turn-REMAP). We also curate a turning video dataset, Turn-H3.6M, from the public Human3.6M human pose benchmark with 3D groundtruth, to further validate our method. Previous gait research has primarily taken place in clinics or laboratories evaluating scripted gait outcomes, whereas this work focuses on free-living home settings, where complexities such as baggy clothing and poor lighting arise. Because accurate groundtruth is difficult to obtain in a free-living setting, we quantise each turning angle to the nearest 45° bin based on manual labelling by expert clinicians. Our method achieves a turning calculation accuracy of 41.6%, a Mean Absolute Error (MAE) of 34.7°, and a weighted precision (WPrec) of 68.3% on Turn-REMAP. On Turn-H3.6M, it achieves an accuracy of 73.5%, an MAE of 18.5°, and a WPrec of 86.2%. This is the first work to explore the use of single monocular camera data to quantify turns by PD patients in a home setting. All data and models are publicly available, providing a baseline for turning parameter measurement to promote future PD gait research.
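As an illustration of the pipeline the abstract describes, the sketch below shows one plausible way to derive a turning angle from estimated 3D hip joints, quantise it to the nearest 45° bin, and score predictions with accuracy, MAE, and a support-weighted precision. The joint layout, the ground-plane axis convention, and the exact definition of WPrec are assumptions made for illustration only, not the authors' released implementation.

import numpy as np

def turning_angle_from_hips(left_hip_xyz, right_hip_xyz):
    # Hypothetical sketch: total rotation of the hip line in the ground (x-z)
    # plane over a clip, given per-frame 3D hip positions of shape (T, 3)
    # as produced by a 3D pose estimator such as Strided Transformer.
    hip_vec = left_hip_xyz - right_hip_xyz               # (T, 3) hip-line vectors
    heading = np.arctan2(hip_vec[:, 2], hip_vec[:, 0])   # (T,) orientation in radians
    heading = np.unwrap(heading)                          # remove ±180° wrap-around jumps
    return np.degrees(abs(heading[-1] - heading[0]))      # total turn over the clip, in degrees

def quantise_45(angle_deg):
    # Snap a continuous angle to the nearest 45° bin, matching the
    # clinician labelling scheme described in the abstract.
    return 45 * round(angle_deg / 45)

def evaluate(pred_deg, true_deg):
    # Accuracy on the 45° bins, MAE on the continuous angles, and a
    # support-weighted precision over bins (one plausible reading of WPrec).
    pred = np.asarray(pred_deg, float)
    true = np.asarray(true_deg, float)
    pred_bin = np.array([quantise_45(a) for a in pred])
    true_bin = np.array([quantise_45(a) for a in true])
    acc = np.mean(pred_bin == true_bin)
    mae = np.mean(np.abs(pred - true))
    wprec = 0.0
    for b in np.unique(true_bin):
        support = np.mean(true_bin == b)         # weight of this bin in the groundtruth
        predicted_b = pred_bin == b
        if predicted_b.any():
            wprec += support * np.mean(true_bin[predicted_b] == b)
    return acc, mae, wprec

A usage example would pass per-frame left/right hip trajectories for each trimmed turning clip to turning_angle_from_hips, then compare the quantised predictions against the clinicians' 45°-bin labels with evaluate.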