A Model-Based System for Real-Time Articulated Hand Tracking Using a Simple Data Glove and a Depth Camera.

Affiliations

Beijing Key Laboratory of Network System Architecture and Convergence, School of Information and Communication Engineering, Beijing University of Posts and Telecommunications, Beijing 100876, China.

Beijing Laboratory of Advanced Information Networks, Beijing 100876, China.

Publication Information

Sensors (Basel). 2019 Oct 28;19(21):4680. doi: 10.3390/s19214680.

DOI: 10.3390/s19214680
PMID: 31661877
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6865016/
Abstract

Tracking detailed hand motion is a fundamental research topic in the area of human-computer interaction (HCI) and has been widely studied for decades. Existing solutions with single-model inputs either require tedious calibration, are expensive, or lack sufficient robustness and accuracy due to occlusions. In this study, we present a real-time system that reconstructs the exact hand motion by iteratively fitting a triangular mesh model to the absolute measurement of the hand from a depth camera, under the robust restriction of a simple data glove. We redefine and simplify the function of the data glove to mitigate its limitations, i.e., tedious calibration, cumbersome equipment, and hampered movement, and to keep our system lightweight. For accurate hand tracking, we introduce a new set of degrees of freedom (DoFs), a shape adjustment term for personalizing the triangular mesh model, and an adaptive collision term to prevent self-intersection. For efficiency, we extract a strong pose-space prior from the data glove to narrow the pose search space. We also present a simplified approach for computing tracking correspondences without loss of accuracy, reducing computation cost. Quantitative experiments show the comparable or increased accuracy of our system over the state of the art, with about 40% improvement in robustness. Moreover, our system runs independently of the Graphics Processing Unit (GPU) and reaches 40 frames per second (FPS) at about 25% Central Processing Unit (CPU) usage.
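To make the abstract's pipeline concrete, the toy sketch below shows how the three ingredients it names — a data term fitting an articulated model to depth-camera points, a glove-derived pose prior that narrows the search space, and a collision penalty against self-intersection — can be combined in one iterative energy minimization. This is a minimal illustration of the general technique, not the authors' implementation: the 2D kinematic chain, function names, weights, and the use of numerical gradients are all my own simplifying assumptions.

```python
# Hypothetical sketch: iterative model fitting to depth points with a
# glove pose prior and a collision penalty. Illustrative only.
import numpy as np

def forward_kinematics(theta, bone_lengths):
    """Toy 2D finger chain: joint angles -> joint positions."""
    pts, pos, ang = [np.zeros(2)], np.zeros(2), 0.0
    for t, l in zip(theta, bone_lengths):
        ang += t
        pos = pos + l * np.array([np.cos(ang), np.sin(ang)])
        pts.append(pos)
    return np.stack(pts)

def energy(theta, depth_pts, theta_glove, bone_lengths,
           w_data=1.0, w_glove=0.1, w_coll=10.0):
    model_pts = forward_kinematics(theta, bone_lengths)
    # Data term: fit model joints to their (pre-associated) depth points.
    e_data = np.sum((model_pts - depth_pts) ** 2)
    # Glove prior: stay near the pose suggested by the data glove.
    e_glove = np.sum((theta - theta_glove) ** 2)
    # Collision term: penalize non-adjacent joints that come too close.
    e_coll = 0.0
    for i in range(len(model_pts)):
        for j in range(i + 2, len(model_pts)):
            d = np.linalg.norm(model_pts[i] - model_pts[j])
            e_coll += max(0.0, 0.2 - d) ** 2
    return w_data * e_data + w_glove * e_glove + w_coll * e_coll

def fit(theta0, depth_pts, theta_glove, bone_lengths,
        iters=200, lr=0.01, eps=1e-5):
    """Iterative fit via numerical gradient descent. The per-frame
    correspondence search of a real tracker is omitted for brevity."""
    theta = theta0.copy()
    for _ in range(iters):
        grad = np.zeros_like(theta)
        for k in range(len(theta)):
            dt = np.zeros_like(theta)
            dt[k] = eps
            grad[k] = (energy(theta + dt, depth_pts, theta_glove, bone_lengths)
                       - energy(theta - dt, depth_pts, theta_glove, bone_lengths)) / (2 * eps)
        theta -= lr * grad
    return theta

if __name__ == "__main__":
    bones = np.array([1.0, 0.8, 0.6])
    theta_true = np.array([0.3, 0.4, 0.5])
    depth_pts = forward_kinematics(theta_true, bones)  # stand-in for camera data
    theta_glove = theta_true + 0.05                    # noisy glove reading
    theta_hat = fit(np.zeros(3), depth_pts, theta_glove, bones)
    print("recovered angles:", np.round(theta_hat, 3))
```

In the actual system the model is a personalized triangular mesh, correspondences are recomputed each frame, and the energy additionally includes the shape-adjustment DoFs; the sketch only illustrates how the three terms interact in a single descent loop.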


[Article figures (sensors-19-04680, g001–g028) are available with the full text at the PMC link above.]

Similar Articles

1
A Model-Based System for Real-Time Articulated Hand Tracking Using a Simple Data Glove and a Depth Camera.
Sensors (Basel). 2019 Oct 28;19(21):4680. doi: 10.3390/s19214680.
2
Real-Time Simultaneous Pose and Shape Estimation for Articulated Objects Using a Single Depth Camera.
IEEE Trans Pattern Anal Mach Intell. 2016 Aug;38(8):1517-32. doi: 10.1109/TPAMI.2016.2557783. Epub 2016 Apr 21.
3
Visual-inertial hand motion tracking with robustness against occlusion, interference, and contact.
Sci Robot. 2021 Sep 29;6(58):eabe1315. doi: 10.1126/scirobotics.abe1315.
4
Overall design and implementation of the virtual glove.
Comput Biol Med. 2013 Nov;43(11):1927-40. doi: 10.1016/j.compbiomed.2013.08.026. Epub 2013 Sep 25.
5
Measurements by A LEAP-Based Virtual Glove for the Hand Rehabilitation.
Sensors (Basel). 2018 Mar 10;18(3):834. doi: 10.3390/s18030834.
6
An instrumented glove for monitoring hand function.
Rev Sci Instrum. 2018 Oct;89(10):105001. doi: 10.1063/1.5038601.
7
SLAM-based dense surface reconstruction in monocular Minimally Invasive Surgery and its application to Augmented Reality.
Comput Methods Programs Biomed. 2018 May;158:135-146. doi: 10.1016/j.cmpb.2018.02.006. Epub 2018 Feb 8.
8
Temporally guided articulated hand pose tracking in surgical videos.
Int J Comput Assist Radiol Surg. 2023 Jan;18(1):117-125. doi: 10.1007/s11548-022-02761-6. Epub 2022 Oct 3.
9
Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements.
Sensors (Basel). 2015 Dec 15;15(12):31644-71. doi: 10.3390/s151229868.
10
3D Hand Tracking in the Presence of Excessive Motion Blur.
IEEE Trans Vis Comput Graph. 2020 May;26(5):1891-1901. doi: 10.1109/TVCG.2020.2973057. Epub 2020 Feb 13.

Cited By

1
Single-camera motion capture of finger joint mobility as a digital biomarker for disease activity in rheumatoid arthritis.
Rheumatol Adv Pract. 2025 Apr 18;9(2):rkae143. doi: 10.1093/rap/rkae143. eCollection 2025.
2
British Sign Language Recognition via Late Fusion of Computer Vision and Leap Motion with Transfer Learning to American Sign Language.
Sensors (Basel). 2020 Sep 9;20(18):5151. doi: 10.3390/s20185151.
3
Device Development for Detecting Thumb Opposition Impairment Using Carbon Nanotube-Based Strain Sensors.
Sensors (Basel). 2020 Jul 18;20(14):3998. doi: 10.3390/s20143998.

References

1
Generalized Feedback Loop for Joint Hand-Object Pose Estimation.
IEEE Trans Pattern Anal Mach Intell. 2020 Aug;42(8):1898-1912. doi: 10.1109/TPAMI.2019.2907951. Epub 2019 Mar 27.
2
A New Multi-Sensor Fusion Scheme to Improve the Accuracy of Knee Flexion Kinematics for Functional Rehabilitation Movements.
Sensors (Basel). 2016 Nov 15;16(11):1914. doi: 10.3390/s16111914.
3
Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements.
Sensors (Basel). 2015 Dec 15;15(12):31644-71. doi: 10.3390/s151229868.