Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping.

Authors

Downey John E, Weiss Jeffrey M, Muelling Katharina, Venkatraman Arun, Valois Jean-Sebastien, Hebert Martial, Bagnell J Andrew, Schwartz Andrew B, Collinger Jennifer L

Affiliations

Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA.

Center for the Neural Basis of Cognition, Pittsburgh, PA, USA.

Publication Information

J Neuroeng Rehabil. 2016 Mar 18;13:28. doi: 10.1186/s12984-016-0134-9.

Abstract

BACKGROUND

Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object.

METHODS

Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Using shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps.
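The blending described above can be sketched as a distance-dependent arbitration between the user's decoded velocity command and the autonomous grasp controller's command. The linear ramp, the 0.15 m assistance radius, and the function names below are illustrative assumptions, not the authors' actual arbitration policy:

```python
import numpy as np

def blend_commands(user_vel, auto_vel, dist_to_object, assist_radius=0.15):
    """Blend a BMI-decoded velocity with an autonomous grasp velocity.

    The arbitration weight alpha ramps from 0 (full user control) when the
    hand is far from the object to 1 (full autonomous control) at the grasp
    target. Both the linear ramp and the radius are hypothetical values.
    """
    alpha = np.clip(1.0 - dist_to_object / assist_radius, 0.0, 1.0)
    return (1.0 - alpha) * np.asarray(user_vel) + alpha * np.asarray(auto_vel)

# Far from the object, the user's command passes through unchanged;
# at the object, the autonomous grasp command dominates.
far = blend_commands([0.1, 0.0, 0.0], [0.0, 0.1, 0.0], dist_to_object=0.5)
near = blend_commands([0.1, 0.0, 0.0], [0.0, 0.1, 0.0], dist_to_object=0.0)
```

A scheme of this shape preserves the user's intent during transport while guaranteeing that the final approach converges on a stable grasp pose, which is consistent with the assistance-on-approach behavior the abstract describes.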

RESULTS

Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92 % of trials, demonstrating the potential for users to accurately execute their intention while using shared control.

CONCLUSIONS

Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users.

TRIAL REGISTRATION

NCT01364480 and NCT01894802.

Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bb6c/4797113/37f7d425c0e8/12984_2016_134_Fig1_HTML.jpg
