


Enhanced 2D Hand Pose Estimation for Gloved Medical Applications: A Preliminary Model.

Affiliations

Department of Exercise and Sport Science, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA.

Human Movement Science Curriculum, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA.

Publication information

Sensors (Basel). 2024 Sep 17;24(18):6005. doi: 10.3390/s24186005.

DOI: 10.3390/s24186005
PMID: 39338750
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11435464/
Abstract

(1) Background: As digital health technology evolves, the role of accurate medical-gloved hand tracking is becoming more important for the assessment and training of practitioners to reduce procedural errors in clinical settings. (2) Method: This study utilized computer vision for hand pose estimation to model skeletal hand movements during in situ aseptic drug compounding procedures. High-definition video cameras recorded hand movements while practitioners wore medical gloves of different colors. Hand poses were manually annotated, and machine learning models were developed and trained using the DeepLabCut interface via an 80/20 training/testing split. (3) Results: The developed model achieved an average root mean square error (RMSE) of 5.89 pixels across the training data set and 10.06 pixels across the test set. When excluding keypoints with a confidence value below 60%, the test set RMSE improved to 7.48 pixels, reflecting high accuracy in hand pose tracking. (4) Conclusions: The developed hand pose estimation model effectively tracks hand movements across both controlled and in situ drug compounding contexts, offering a first-of-its-kind medical glove hand tracking method. This model holds potential for enhancing clinical training and ensuring procedural safety, particularly in tasks requiring high precision such as drug compounding.
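As a rough illustration of the evaluation described above (a sketch, not the authors' code), the confidence-filtered pixel RMSE over predicted keypoints can be computed as follows. The array layout, function name, and the 0.6 cutoff are assumptions based only on the abstract:

```python
import numpy as np

def keypoint_rmse(pred, truth, confidence, min_conf=None):
    """RMSE (in pixels) between predicted and ground-truth 2D keypoints.

    pred, truth: (N, 2) arrays of x/y pixel coordinates.
    confidence:  (N,) per-keypoint confidence scores in [0, 1].
    min_conf:    if given, keypoints below this confidence are excluded,
                 mirroring the 60% cutoff used in the paper's evaluation.
    """
    pred = np.asarray(pred, dtype=float)
    truth = np.asarray(truth, dtype=float)
    conf = np.asarray(confidence, dtype=float)
    mask = np.ones(conf.shape, dtype=bool) if min_conf is None else conf >= min_conf
    # Euclidean pixel error per kept keypoint, then root-mean-square.
    err = np.linalg.norm(pred[mask] - truth[mask], axis=1)
    return float(np.sqrt(np.mean(err ** 2)))

# Toy data: the low-confidence third keypoint carries the largest error,
# so filtering at 0.6 lowers the reported RMSE, as in the paper's
# 10.06 px -> 7.48 px result on its own data.
pred = [[10, 10], [20, 20], [30, 40]]
truth = [[10, 12], [21, 20], [50, 40]]
conf = [0.9, 0.8, 0.3]
print(keypoint_rmse(pred, truth, conf))
print(keypoint_rmse(pred, truth, conf, min_conf=0.6))
```

Filtering by confidence before computing RMSE changes what the metric measures: it reports accuracy only on keypoints the model itself considers reliable, which is why the filtered and unfiltered figures should always be reported together.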


Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e7d6/11435464/c8ba5675f1a4/sensors-24-06005-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e7d6/11435464/3f4deb364618/sensors-24-06005-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e7d6/11435464/cbf94e852b57/sensors-24-06005-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e7d6/11435464/85fbc4b44259/sensors-24-06005-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e7d6/11435464/6981298381b2/sensors-24-06005-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e7d6/11435464/785390e2112b/sensors-24-06005-g006.jpg

Similar articles

1
Enhanced 2D Hand Pose Estimation for Gloved Medical Applications: A Preliminary Model.
Sensors (Basel). 2024 Sep 17;24(18):6005. doi: 10.3390/s24186005.
2
Estimating Ground Reaction Forces from Two-Dimensional Pose Data: A Biomechanics-Based Comparison of AlphaPose, BlazePose, and OpenPose.
Sensors (Basel). 2022 Dec 21;23(1):78. doi: 10.3390/s23010078.
3
Video-based quantification of human movement frequency using pose estimation: A pilot study.
PLoS One. 2021 Dec 20;16(12):e0261450. doi: 10.1371/journal.pone.0261450. eCollection 2021.
4
Temporally guided articulated hand pose tracking in surgical videos.
Int J Comput Assist Radiol Surg. 2023 Jan;18(1):117-125. doi: 10.1007/s11548-022-02761-6. Epub 2022 Oct 3.
5
Wearable high-density EMG sleeve for complex hand gesture classification and continuous joint angle estimation.
Sci Rep. 2024 Aug 9;14(1):18564. doi: 10.1038/s41598-024-64458-x.
6
A Survey on Hand Pose Estimation with Wearable Sensors and Computer-Vision-Based Methods.
Sensors (Basel). 2020 Feb 16;20(4):1074. doi: 10.3390/s20041074.
7
Towards automated video-based assessment of dystonia in dyskinetic cerebral palsy: A novel approach using markerless motion tracking and machine learning.
Front Robot AI. 2023 Mar 2;10:1108114. doi: 10.3389/frobt.2023.1108114. eCollection 2023.
8
DeepWild: Application of the pose estimation tool DeepLabCut for behaviour tracking in wild chimpanzees and bonobos.
J Anim Ecol. 2023 Aug;92(8):1560-1574. doi: 10.1111/1365-2656.13932. Epub 2023 May 10.
9
A self-supervised spatio-temporal attention network for video-based 3D infant pose estimation.
Med Image Anal. 2024 Aug;96:103208. doi: 10.1016/j.media.2024.103208. Epub 2024 May 18.
10
Synthesising 2D Video from 3D Motion Data for Machine Learning Applications.
Sensors (Basel). 2022 Aug 29;22(17):6522. doi: 10.3390/s22176522.

References cited in this article

1
State of the Art in Immersive Interactive Technologies for Surgery Simulation: A Review and Prospective.
Bioengineering (Basel). 2023 Nov 23;10(12):1346. doi: 10.3390/bioengineering10121346.
2
Real-Time Monocular Skeleton-Based Hand Gesture Recognition Using 3D-Jointsformer.
Sensors (Basel). 2023 Aug 10;23(16):7066. doi: 10.3390/s23167066.
3
Tracking and evaluating motion skills in laparoscopy with inertial sensors.
Surg Endosc. 2023 Jul;37(7):5274-5284. doi: 10.1007/s00464-023-09983-y. Epub 2023 Mar 28.
4
HMD-EgoPose: head-mounted display-based egocentric marker-less tool and hand pose estimation for augmented surgical guidance.
Int J Comput Assist Radiol Surg. 2022 Dec;17(12):2253-2262. doi: 10.1007/s11548-022-02688-y. Epub 2022 Jun 14.
5
Robust hand tracking for surgical telestration.
Int J Comput Assist Radiol Surg. 2022 Aug;17(8):1477-1486. doi: 10.1007/s11548-022-02637-9. Epub 2022 May 27.
6
Multi-animal pose estimation, identification and tracking with DeepLabCut.
Nat Methods. 2022 Apr;19(4):496-504. doi: 10.1038/s41592-022-01443-0. Epub 2022 Apr 12.
7
Hand Gesture Recognition Based on Computer Vision: A Review of Techniques.
J Imaging. 2020 Jul 23;6(8):73. doi: 10.3390/jimaging6080073.
8
Towards markerless surgical tool and hand pose estimation.
Int J Comput Assist Radiol Surg. 2021 May;16(5):799-808. doi: 10.1007/s11548-021-02369-2. Epub 2021 Apr 21.
9
A Novel Suture Training System for Open Surgery Replicating Procedures Performed by Experts Using Augmented Reality.
J Med Syst. 2021 Apr 7;45(5):60. doi: 10.1007/s10916-021-01735-6.
10
Analysis of Kinematic Differences in Hand Motion between Novice and Experienced Operators in IR: A Pilot Study.
J Vasc Interv Radiol. 2021 Feb;32(2):226-234. doi: 10.1016/j.jvir.2020.10.010. Epub 2020 Dec 16.