Le2i, FRE CNRS 2005, Univ. Bourgogne Franche-Comté, France.
Kotelnikov Institute of Radio Engineering and Electronics of RAS, Moscow 125009, Russia.
Artif Intell Med. 2019 Mar;94:54-66. doi: 10.1016/j.artmed.2018.12.007. Epub 2019 Jan 11.
Computer vision-based clinical gait analysis is an active research topic. However, very few datasets are publicly available, so comparing existing methods against one another is not straightforward. Even when test data are openly accessible, existing databases contain very few test subjects and single-modality measurements, which limits their usefulness. The contributions of this paper are three-fold. First, we propose a new open-access multi-modal database, acquired with the Kinect v.2 camera, for the task of gait analysis. Second, we adapt the skeleton joint orientation data to calculate kinematic gait parameters that match those of gold-standard MOCAP systems, and we propose a new set of features based on 3D lower-limb flexion dynamics to analyze gait symmetry. Third, we design a Long Short-Term Memory (LSTM) ensemble model to create an unsupervised gait classification tool. The results show that the joint orientation data provided by Kinect can be successfully used in an inexpensive clinical gait monitoring system, with results moderately better than the reported state of the art for three normal/pathological gait classes.
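As a rough illustration of the kind of lower-limb flexion quantity the abstract refers to (this is not the paper's implementation; the function name and angle convention are assumptions), a per-frame knee flexion angle can be estimated from three 3D joint positions reported by a skeleton tracker such as Kinect:

```python
import numpy as np

def knee_flexion_angle(hip, knee, ankle):
    """Estimate the knee flexion angle in degrees from 3D joint positions.

    Convention (an assumption for this sketch): 0 degrees means a fully
    extended leg; larger values mean more flexion.
    """
    thigh = np.asarray(hip, dtype=float) - np.asarray(knee, dtype=float)
    shank = np.asarray(ankle, dtype=float) - np.asarray(knee, dtype=float)
    cosang = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    # A 180-degree angle between the two segment vectors corresponds to a
    # straight (fully extended) leg, hence the subtraction from 180.
    return 180.0 - np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# A straight leg (joints collinear) gives ~0 degrees of flexion;
# a right-angle bend at the knee gives ~90 degrees.
straight = knee_flexion_angle([0, 1, 0], [0, 0.5, 0], [0, 0, 0])
bent = knee_flexion_angle([0, 1, 0], [0, 0, 0], [0.5, 0, 0])
```

Time series of such angles for the left and right legs could then be compared to assess gait symmetry, which is the spirit of the flexion-dynamics features described above.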