

Analysis of the Hands in Egocentric Vision: A Survey.

Publication Information

IEEE Trans Pattern Anal Mach Intell. 2023 Jun;45(6):6846-6866. doi: 10.1109/TPAMI.2020.2986648. Epub 2023 May 5.

DOI: 10.1109/TPAMI.2020.2986648
PMID: 32286958
Abstract

Egocentric vision (a.k.a. first-person vision-FPV) applications have thrived over the past few years, thanks to the availability of affordable wearable cameras and large annotated datasets. The position of the wearable camera (usually mounted on the head) allows recording exactly what the camera wearers have in front of them, in particular hands and manipulated objects. This intrinsic advantage enables the study of the hands from multiple perspectives: localizing hands and their parts within the images; understanding what actions and activities the hands are involved in; and developing human-computer interfaces that rely on hand gestures. In this survey, we review the literature that focuses on the hands using egocentric vision, categorizing the existing approaches into: localization (where are the hands or parts of them?); interpretation (what are the hands doing?); and application (e.g., systems that used egocentric hand cues for solving a specific problem). Moreover, a list of the most prominent datasets with hand-based annotations is provided.

Similar Articles

1
Analysis of the Hands in Egocentric Vision: A Survey.
IEEE Trans Pattern Anal Mach Intell. 2023 Jun;45(6):6846-6866. doi: 10.1109/TPAMI.2020.2986648. Epub 2023 May 5.
2
Egocentric Action Recognition by Automatic Relation Modeling.
IEEE Trans Pattern Anal Mach Intell. 2023 Jan;45(1):489-507. doi: 10.1109/TPAMI.2022.3148790. Epub 2022 Dec 5.
3
A Survey on Hand Pose Estimation with Wearable Sensors and Computer-Vision-Based Methods.
Sensors (Basel). 2020 Feb 16;20(4):1074. doi: 10.3390/s20041074.
4
Desktop Action Recognition From First-Person Point-of-View.
IEEE Trans Cybern. 2019 May;49(5):1616-1628. doi: 10.1109/TCYB.2018.2806381. Epub 2018 Feb 27.
5
Viewpoint Integration for Hand-Based Recognition of Social Interactions from a First-Person View.
Proc ACM Int Conf Multimodal Interact. 2015 Nov;2015:351-354. doi: 10.1145/2818346.2820771.
6
Egocentric video: a new tool for capturing hand use of individuals with spinal cord injury at home.
J Neuroeng Rehabil. 2019 Jul 5;16(1):83. doi: 10.1186/s12984-019-0557-1.
7
Identifying Hand Use and Hand Roles After Stroke Using Egocentric Video.
IEEE J Transl Eng Health Med. 2021 Apr 9;9:2100510. doi: 10.1109/JTEHM.2021.3072347. eCollection 2021.
8
Exploring Architectural Details Through a Wearable Egocentric Vision Device.
Sensors (Basel). 2016 Feb 17;16(2):237. doi: 10.3390/s16020237.
9
Emerging Wearable Interfaces and Algorithms for Hand Gesture Recognition: A Survey.
IEEE Rev Biomed Eng. 2022;15:85-102. doi: 10.1109/RBME.2021.3078190. Epub 2022 Jan 20.
10
Hands-Free User Interface for VR Headsets Based on In Situ Facial Gesture Sensing.
Sensors (Basel). 2020 Dec 16;20(24):7206. doi: 10.3390/s20247206.

Cited By

1
Video-Based Plastic Bag Grabbing Action Recognition: A New Video Dataset and a Comparative Study of Baseline Models.
Sensors (Basel). 2025 Jan 4;25(1):255. doi: 10.3390/s25010255.
2
Designing an Egocentric Video-Based Dashboard to Report Hand Performance Measures for Outpatient Rehabilitation of Cervical Spinal Cord Injury.
Top Spinal Cord Inj Rehabil. 2023 Fall;29(Suppl):75-87. doi: 10.46292/sci23-00015S. Epub 2023 Nov 17.
3
Recognizing hand use and hand role at home after stroke from egocentric video.
PLOS Digit Health. 2023 Oct 11;2(10):e0000361. doi: 10.1371/journal.pdig.0000361. eCollection 2023 Oct.
4
YOLO Series for Human Hand Action Detection and Classification from Egocentric Videos.
Sensors (Basel). 2023 Mar 20;23(6):3255. doi: 10.3390/s23063255.
5
Impacts of Image Obfuscation on Fine-grained Activity Recognition in Egocentric Video.
Proc IEEE Int Conf Pervasive Comput Commun Workshops. 2022 Mar;2022:341-346. doi: 10.1109/percomworkshops53856.2022.9767447. Epub 2022 May 6.
6
Group Emotion Detection Based on Social Robot Perception.
Sensors (Basel). 2022 May 14;22(10):3749. doi: 10.3390/s22103749.
7
Perspectives and recommendations of individuals with tetraplegia regarding wearable cameras for monitoring hand function at home: Insights from a community-based study.
J Spinal Cord Med. 2021;44(sup1):S173-S184. doi: 10.1080/10790268.2021.1920787. Epub 2021 May 7.
8
Identifying Hand Use and Hand Roles After Stroke Using Egocentric Video.
IEEE J Transl Eng Health Med. 2021 Apr 9;9:2100510. doi: 10.1109/JTEHM.2021.3072347. eCollection 2021.
9
A Gesture Elicitation Study of Nose-Based Gestures.
Sensors (Basel). 2020 Dec 11;20(24):7118. doi: 10.3390/s20247118.