Institute of Biomaterials & Biomedical Engineering, University of Toronto, Toronto, Ontario, Canada.
KITE, Toronto Rehabilitation Institute, University Health Network, Toronto, Ontario, Canada.
J Neuroeng Rehabil. 2019 Jul 5;16(1):83. doi: 10.1186/s12984-019-0557-1.
Current upper extremity outcome measures for persons with cervical spinal cord injury (cSCI) lack the ability to directly collect quantitative information in home and community environments. A wearable first-person (egocentric) camera system is presented that aims to monitor functional hand use outside of clinical settings.
The system is based on computer vision algorithms that detect the hand, segment the hand outline, distinguish the user's left or right hand, and detect functional interactions of the hand with objects during activities of daily living. The algorithm was evaluated using egocentric video recordings from 9 participants with cSCI, obtained in a home simulation laboratory. The system produces a binary hand-object interaction decision for each video frame, based on features reflecting motion cues of the hand, hand shape and colour characteristics of the scene.
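The per-frame decision step can be sketched abstractly as a fusion of normalised cues into a binary output. This is an illustrative assumption only: the weighted-score fusion, the cue names, and the thresholds below are not the paper's classifier, which the abstract does not specify beyond the feature types (motion, hand shape, scene colour).

```python
# Hypothetical sketch: fusing per-frame cues into a binary hand-object
# interaction decision. Weights, threshold, and function names are
# illustrative, not from the published system.

def frame_decision(motion_cue, shape_cue, colour_cue,
                   weights=(0.5, 0.3, 0.2), threshold=0.5):
    """Fuse normalised per-frame cues (each in [0, 1]) into a binary
    hand-object interaction decision for one video frame."""
    score = (weights[0] * motion_cue
             + weights[1] * shape_cue
             + weights[2] * colour_cue)
    return int(score >= threshold)

# A still hand with weak cues falls below the threshold:
print(frame_decision(0.1, 0.6, 0.5))  # 0
# A moving hand with supporting shape and colour cues exceeds it:
print(frame_decision(0.9, 0.7, 0.6))  # 1
```

Applied frame by frame, such a rule yields the binary interaction sequence that the subsequent hand-use measures are computed from.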
The output from the algorithm was compared with a manual labelling of the video, yielding F1-scores of 0.74 ± 0.15 for the left hand and 0.73 ± 0.15 for the right hand. From the resulting frame-by-frame binary data, functional hand use measures were extracted: the amount of total interaction as a percentage of testing time, the average duration of interactions in seconds, and the number of interactions per hour. Moderate and significant correlations were found when comparing these output measures to the results of the manual labelling, with ρ = 0.40, 0.54 and 0.55 respectively.
These results demonstrate the potential of a wearable egocentric camera for capturing quantitative measures of hand use at home.