


Interaction Detection in Egocentric Video: Toward a Novel Outcome Measure for Upper Extremity Function.

Publication Info

IEEE J Biomed Health Inform. 2018 Mar;22(2):561-569. doi: 10.1109/JBHI.2016.2636748. Epub 2016 Dec 7.

DOI: 10.1109/JBHI.2016.2636748
PMID: 28114045
Abstract

In order to develop effective interventions for restoring upper extremity function after cervical spinal cord injury, tools are needed to accurately measure hand function throughout the rehabilitation process. However, there is currently no suitable method to collect information about hand function in the community, when patients are not under direct observation of a clinician. We propose a wearable system that can monitor functional hand use using computer vision techniques applied to egocentric camera videos. To this end, in this study we demonstrate the feasibility of detecting interactions of the hand with objects in the environment from egocentric video. The system consists of a preprocessing step where the hand is segmented out from the background. The algorithm then extracts features associated with hand-object interactions. This includes comparing motion cues in the region near the hand (i.e., where the object is most likely to be located) to the motion of the hand itself, as well as to the motion of the background. Features representing hand shape are also extracted. The features serve as inputs to a random forest classifier, which was tested with a dataset of 14 activities of daily living as well as noninteractive tasks in five environments (total video duration of 44.16 min). The average F-score for the classifier was 0.85 for leave-one-activity-out cross-validation on our dataset and 0.91 for a publicly available set (1.72 min) when filtered with a moving average. These results suggest that using egocentric video to monitor functional hand use at home is feasible.
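The abstract describes post-filtering the classifier's per-frame interaction predictions with a moving average before computing an F-score. A minimal sketch of that final stage, assuming hypothetical per-frame scores standing in for random forest outputs (the paper's feature extraction and classifier are not reproduced here):

```python
def moving_average(scores, window=5):
    """Smooth per-frame interaction scores with a centered moving average,
    shrinking the window at the clip boundaries."""
    half = window // 2
    smoothed = []
    for i in range(len(scores)):
        lo, hi = max(0, i - half), min(len(scores), i + half + 1)
        smoothed.append(sum(scores[lo:hi]) / (hi - lo))
    return smoothed

def f_score(pred, truth):
    """Binary F1 score over per-frame interaction labels."""
    tp = sum(1 for p, t in zip(pred, truth) if p and t)
    fp = sum(1 for p, t in zip(pred, truth) if p and not t)
    fn = sum(1 for p, t in zip(pred, truth) if t and not p)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative per-frame scores for a 20-frame clip (hypothetical values,
# standing in for the random forest's interaction probabilities).
scores = [0.1, 0.2, 0.9, 0.1, 0.8, 0.9, 0.85, 0.2, 0.9, 0.8,
          0.9, 0.1, 0.7, 0.3, 0.2, 0.1, 0.15, 0.05, 0.1, 0.2]
truth = [0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0]

pred = [s > 0.5 for s in moving_average(scores, window=5)]
print(round(f_score(pred, truth), 2))
```

Smoothing suppresses isolated frame-level flips (e.g., a single low-score frame inside a sustained interaction), which is why the filtered predictions score higher than raw thresholding would on noisy per-frame output.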


Similar Articles

1
Interaction Detection in Egocentric Video: Toward a Novel Outcome Measure for Upper Extremity Function.
IEEE J Biomed Health Inform. 2018 Mar;22(2):561-569. doi: 10.1109/JBHI.2016.2636748. Epub 2016 Dec 7.
2
Egocentric video: a new tool for capturing hand use of individuals with spinal cord injury at home.
J Neuroeng Rehabil. 2019 Jul 5;16(1):83. doi: 10.1186/s12984-019-0557-1.
3
Views of individuals with spinal cord injury on the use of wearable cameras to monitor upper limb function in the home and community.
J Spinal Cord Med. 2017 Nov;40(6):706-714. doi: 10.1080/10790268.2017.1349856. Epub 2017 Jul 24.
4
Automated Hand Prehension Assessment From Egocentric Video After Spinal Cord Injury.
IEEE Trans Neural Syst Rehabil Eng. 2024;32:2864-2872. doi: 10.1109/TNSRE.2024.3438436. Epub 2024 Aug 12.
5
Generalizability of Hand-Object Interaction Detection in Egocentric Video across Populations with Hand Impairment.
Annu Int Conf IEEE Eng Med Biol Soc. 2020 Jul;2020:3228-3231. doi: 10.1109/EMBC44109.2020.9176154.
6
Measuring Hand Use in the Home after Cervical Spinal Cord Injury Using Egocentric Video.
J Neurotrauma. 2022 Dec;39(23-24):1697-1707. doi: 10.1089/neu.2022.0156. Epub 2022 Jul 21.
7
An Effective and Efficient Method for Detecting Hands in Egocentric Videos for Rehabilitation Applications.
IEEE Trans Neural Syst Rehabil Eng. 2020 Mar;28(3):748-755. doi: 10.1109/TNSRE.2020.2968912. Epub 2020 Jan 23.
8
A wearable vision-based system for detecting hand-object interactions in individuals with cervical spinal cord injury: First results in the home environment.
Annu Int Conf IEEE Eng Med Biol Soc. 2020 Jul;2020:2159-2162. doi: 10.1109/EMBC44109.2020.9176274.
9
Tenodesis Grasp Detection in Egocentric Video.
IEEE J Biomed Health Inform. 2021 May;25(5):1463-1470. doi: 10.1109/JBHI.2020.3003643. Epub 2021 May 11.
10
Capturing hand use of individuals with spinal cord injury at home using egocentric video: a feasibility study.
Spinal Cord Ser Cases. 2021 Mar 5;7(1):17. doi: 10.1038/s41394-021-00382-w.

Cited By

1
Designing an Egocentric Video-Based Dashboard to Report Hand Performance Measures for Outpatient Rehabilitation of Cervical Spinal Cord Injury.
Top Spinal Cord Inj Rehabil. 2023 Fall;29(Suppl):75-87. doi: 10.46292/sci23-00015S. Epub 2023 Nov 17.
2
Perspectives and recommendations of individuals with tetraplegia regarding wearable cameras for monitoring hand function at home: Insights from a community-based study.
J Spinal Cord Med. 2021;44(sup1):S173-S184. doi: 10.1080/10790268.2021.1920787. Epub 2021 May 7.
3
Capturing hand use of individuals with spinal cord injury at home using egocentric video: a feasibility study.
Spinal Cord Ser Cases. 2021 Mar 5;7(1):17. doi: 10.1038/s41394-021-00382-w.
4
Egocentric video: a new tool for capturing hand use of individuals with spinal cord injury at home.
J Neuroeng Rehabil. 2019 Jul 5;16(1):83. doi: 10.1186/s12984-019-0557-1.