On the Visuomotor Behavior of Amputees and Able-Bodied People During Grasping.

Authors

Gregori Valentina, Cognolato Matteo, Saetta Gianluca, Atzori Manfredo, Gijsberts Arjan

Affiliations

Department of Computer, Control, and Management Engineering, University of Rome La Sapienza, Rome, Italy.

VANDAL Laboratory, Istituto Italiano di Tecnologia, Genoa, Italy.

Publication

Front Bioeng Biotechnol. 2019 Nov 15;7:316. doi: 10.3389/fbioe.2019.00316. eCollection 2019.

DOI: 10.3389/fbioe.2019.00316
PMID: 31799243
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6874164/
Abstract

Visual attention is often predictive for future actions in humans. In manipulation tasks, the eyes tend to fixate an object of interest even before the reach-to-grasp is initiated. Some recent studies have proposed to exploit this anticipatory gaze behavior to improve the control of dexterous upper limb prostheses. This requires a detailed understanding of visuomotor coordination to determine in which temporal window gaze may provide helpful information. In this paper, we verify and quantify the gaze and motor behavior of 14 transradial amputees who were asked to grasp and manipulate common household objects with their missing limb. For comparison, we also include data from 30 able-bodied subjects who executed the same protocol with their right arm. The dataset contains gaze, first person video, angular velocities of the head, and electromyography and accelerometry of the forearm. To analyze the large amount of video, we developed a procedure based on recent deep learning methods to automatically detect and segment all objects of interest. This allowed us to accurately determine the pixel distances between the gaze point, the target object, and the limb in each individual frame. Our analysis shows a clear coordination between the eyes and the limb in the reach-to-grasp phase, confirming that both intact and amputated subjects precede the grasp with their eyes by more than 500 ms. Furthermore, we note that the gaze behavior of amputees was remarkably similar to that of the able-bodied control group, despite their inability to physically manipulate the objects.
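
The abstract describes a per-frame analysis in which the gaze point is compared against segmentation masks of the target object and the limb. As a minimal sketch of that distance computation (not the authors' released code; the gaze coordinates and the binary mask are hypothetical inputs that would come from the eye tracker and an instance-segmentation model):

    # Minimal sketch, NOT the authors' pipeline. Assumes hypothetical inputs:
    # a gaze point in pixel coordinates (from the eye tracker) and a binary
    # segmentation mask of the target object (from a deep-learning segmenter).
    import numpy as np

    def gaze_to_mask_distance(gaze_xy, mask):
        """Minimum Euclidean pixel distance from the gaze point to a binary mask.

        gaze_xy: (x, y) gaze position in pixel coordinates.
        mask:    2D boolean array, True where the object was segmented.
        Returns inf if the object was not detected in this frame.
        """
        ys, xs = np.nonzero(mask)          # pixel coordinates of the object
        if xs.size == 0:                   # object not visible in this frame
            return float("inf")
        dx = xs - gaze_xy[0]
        dy = ys - gaze_xy[1]
        return float(np.sqrt(dx * dx + dy * dy).min())

    # Example: a 10x10 object patch in a 480x640 frame, gaze 10 px to its left.
    mask = np.zeros((480, 640), dtype=bool)
    mask[100:110, 200:210] = True
    print(gaze_to_mask_distance((190, 105), mask))  # -> 10.0

Thresholding this distance over the video would give a gaze-on-target onset per trial, which can then be compared against a movement onset (e.g., from the forearm accelerometry) to estimate the eye lead of more than 500 ms reported in the results.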

Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fa12/6874164/2d0ca34748b8/fbioe-07-00316-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fa12/6874164/3c61c5cf7beb/fbioe-07-00316-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fa12/6874164/ab7b92889e17/fbioe-07-00316-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fa12/6874164/2e12e2b27a77/fbioe-07-00316-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fa12/6874164/c285164ade0a/fbioe-07-00316-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fa12/6874164/cf6e8792c052/fbioe-07-00316-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fa12/6874164/b336364a38a4/fbioe-07-00316-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fa12/6874164/3065336645fc/fbioe-07-00316-g0008.jpg

Similar Articles

1. On the Visuomotor Behavior of Amputees and Able-Bodied People During Grasping.
   Front Bioeng Biotechnol. 2019 Nov 15;7:316. doi: 10.3389/fbioe.2019.00316. eCollection 2019.
2. Improving Robotic Hand Prosthesis Control With Eye Tracking and Computer Vision: A Multimodal Approach Based on the Visuomotor Behavior of Grasping.
   Front Artif Intell. 2022 Jan 25;4:744476. doi: 10.3389/frai.2021.744476. eCollection 2021.
3. Gaze, visual, myoelectric, and inertial data of grasps for intelligent prosthetics.
   Sci Data. 2020 Feb 10;7(1):43. doi: 10.1038/s41597-020-0380-3.
4. Gaze, behavioral, and clinical data for phantom limbs after hand amputation from 15 amputees and 29 controls.
   Sci Data. 2020 Feb 20;7(1):60. doi: 10.1038/s41597-020-0402-1.
5. A scoping review of eye tracking metrics used to assess visuomotor behaviours of upper limb prosthesis users.
   J Neuroeng Rehabil. 2023 Apr 24;20(1):49. doi: 10.1186/s12984-023-01180-1.
6. Gaze-grasp coordination in obstacle avoidance: differences between binocular and monocular viewing.
   Exp Brain Res. 2015 Dec;233(12):3489-505. doi: 10.1007/s00221-015-4421-7. Epub 2015 Aug 23.
7. Visuomotor behaviours when using a myoelectric prosthesis.
   J Neuroeng Rehabil. 2014 Apr 23;11:72. doi: 10.1186/1743-0003-11-72.
8. Gaze and Movement Assessment (GaMA): Inter-site validation of a visuomotor upper limb functional protocol.
   PLoS One. 2019 Dec 30;14(12):e0219333. doi: 10.1371/journal.pone.0219333. eCollection 2019.
9. Non-Invasive, Temporally Discrete Feedback of Object Contact and Release Improves Grasp Control of Closed-Loop Myoelectric Transradial Prostheses.
   IEEE Trans Neural Syst Rehabil Eng. 2016 Dec;24(12):1314-1322. doi: 10.1109/TNSRE.2015.2500586. Epub 2015 Nov 13.
10. Eye movements do not play an important role in the adaptation of hand tracking to a visuomotor rotation.
    J Neurophysiol. 2019 May 1;121(5):1967-1976. doi: 10.1152/jn.00814.2018. Epub 2019 Apr 3.

Cited By

1. Using eye tracking to assess learning of a multifunction prosthetic hand: an exploratory study from a rehabilitation perspective.
   J Neuroeng Rehabil. 2024 Aug 31;21(1):148. doi: 10.1186/s12984-024-01445-3.
2. A scoping review of eye tracking metrics used to assess visuomotor behaviours of upper limb prosthesis users.
   J Neuroeng Rehabil. 2023 Apr 24;20(1):49. doi: 10.1186/s12984-023-01180-1.
3. Context matters during pick-and-place in VR: Impact on search and transport phases.
   Front Psychol. 2022 Sep 8;13:881269. doi: 10.3389/fpsyg.2022.881269. eCollection 2022.
4. Improving Robotic Hand Prosthesis Control With Eye Tracking and Computer Vision: A Multimodal Approach Based on the Visuomotor Behavior of Grasping.
   Front Artif Intell. 2022 Jan 25;4:744476. doi: 10.3389/frai.2021.744476. eCollection 2021.
5. Gesture Recognition Using Surface Electromyography and Deep Learning for Prostheses Hand: State-of-the-Art, Challenges, and Future.
   Front Neurosci. 2021 Apr 26;15:621885. doi: 10.3389/fnins.2021.621885. eCollection 2021.

References

1. Quantitative Eye Gaze and Movement Differences in Visuomotor Adaptations to Varying Task Demands Among Upper-Extremity Prosthesis Users.
   JAMA Netw Open. 2019 Sep 4;2(9):e1911197. doi: 10.1001/jamanetworkopen.2019.11197.
2. Visual attention, EEG alpha power and T7-Fz connectivity are implicated in prosthetic hand control and can be optimized through gaze training.
   J Neuroeng Rehabil. 2019 Apr 27;16(1):52. doi: 10.1186/s12984-019-0524-x.
3. Gaze when reaching to grasp a glass.
   J Vis. 2018 Aug 1;18(8):16. doi: 10.1167/18.8.16.
4. Using synchronized eye and motion tracking to determine high-precision eye-movement patterns during object-interaction tasks.
   J Vis. 2018 Jun 1;18(6):18. doi: 10.1167/18.6.18.
5. The clinical relevance of advanced artificial feedback in the control of a multi-functional myoelectric prosthesis.
   J Neuroeng Rehabil. 2018 Mar 27;15(1):28. doi: 10.1186/s12984-018-0371-1.
6. Illusory movement perception improves motor control for prosthetic hands.
   Sci Transl Med. 2018 Mar 14;10(432). doi: 10.1126/scitranslmed.aao6990.
7. Examining the Spatiotemporal Disruption to Gaze When Using a Myoelectric Prosthetic Hand.
   J Mot Behav. 2018 Jul-Aug;50(4):416-425. doi: 10.1080/00222895.2017.1363703. Epub 2017 Sep 19.
8. The Reality of Myoelectric Prostheses: Understanding What Makes These Devices Difficult for Some Users to Control.
   Front Neurorobot. 2016 Aug 22;10:7. doi: 10.3389/fnbot.2016.00007. eCollection 2016.
9. It's in the eyes: Planning precise manual actions before execution.
   J Vis. 2016;16(1):18. doi: 10.1167/16.1.18.
10. Phantom hand and wrist movements in upper limb amputees are slow but naturally controlled movements.
    Neuroscience. 2016 Jan 15;312:48-57. doi: 10.1016/j.neuroscience.2015.11.007. Epub 2015 Nov 10.