Usami Takuya, Kisohara Masaya, Nishida Kazuki, Koboyashi Daishiro, Ida Ruido, Matsubara Kohki, Tokuda Haruhiko, Suzuki Nobuyuki, Murakami Hideki, Kuroyanagi Gen
Department of Orthopedic Surgery, Nagoya City University Graduate School of Medical Sciences, Nagoya, JPN.
Department of Radiology, Nagoya City University Graduate School of Medical Sciences, Nagoya, JPN.
Cureus. 2025 Jul 13;17(7):e87837. doi: 10.7759/cureus.87837. eCollection 2025 Jul.
Motion capture is widely used to analyze human gait and enables measurement of various biomechanical parameters. However, conventional infrared-based motion-capture systems are expensive and require a large amount of space, making them difficult to implement in many facilities. Recently, artificial intelligence (AI) has been applied in numerous medical fields, including gait analysis. This study aimed to evaluate the effectiveness of an AI-based motion capture system using a single smartphone camera compared to a conventional infrared-based motion capture system.
Twenty-two straight walks of healthy volunteers were simultaneously captured using a smartphone (iPhone X, Apple Inc., Cupertino, CA) placed on the right side of the participants (Group AI) and an infrared-based motion capture system (Group M). In Group AI, gait videos were evaluated by the Sportip Motion 3D AI-based motion capture system (Sportip Inc., Tokyo, Japan). The same walking cycles were analyzed for both methods. Gait parameters, including gait velocity, gait cycle time, step length, and flexion angles of the hip and knee joints, were compared between the two groups.
The shapes of the hip and knee flexion angle graphs in Group AI were similar to those in Group M. Variables such as gait velocity, bilateral step length, and the maximum flexion angles of the hip and knee joints showed high accuracy. Most variables showed high correlation coefficients (gait velocity, r = 0.94; right and left step lengths, r = 0.91 and 0.93; right and left maximum flexion angles of the hip joint, r = 0.87 and 0.71; knee joint, r = 0.84 and 0.93; right and left minimum flexion angles of the hip joint, r = 0.73 and 0.75). However, lower correlation coefficients were observed for gait cycle time (r = 0.68) and the minimum knee flexion angle (right and left, r = 0.30 and 0.47).
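The agreement statistics reported above are Pearson correlation coefficients between paired measurements from the two systems. As a minimal illustrative sketch (the variable names and the sample values below are hypothetical, not data from this study), the coefficient for any one gait parameter can be computed like this:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired measurements of one gait parameter (e.g. step length, m)
# from the smartphone AI system and the infrared reference system.
ai_system = [0.62, 0.58, 0.65, 0.60, 0.63, 0.57]
infrared = [0.63, 0.57, 0.66, 0.59, 0.64, 0.58]

r = pearson_r(ai_system, infrared)
print(f"r = {r:.2f}")
```

A high r (close to 1) indicates that the two systems rank and scale the measurements consistently, which is how the parameter-level agreement in this study is summarized.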
Our findings suggest that an AI-based motion capture system using a single smartphone camera may provide reliable gait parameters for certain applications.