Meel Velliste, Sagi Perel, M. Chance Spalding, Andrew S. Whitford, Andrew B. Schwartz
Department of Neurobiology, School of Medicine, E1440 BST, Lothrop Street, University of Pittsburgh, Pittsburgh, Pennsylvania 15213, USA.
Nature. 2008 Jun 19;453(7198):1098-101. doi: 10.1038/nature06996. Epub 2008 May 28.
Arm movement is well represented in populations of neurons recorded from the motor cortex. Cortical activity patterns have been used in the new field of brain-machine interfaces to show how cursors on computer displays can be moved in two- and three-dimensional space. Although the ability to move a cursor can be useful in its own right, this technology could be applied to restore arm and hand function for amputees and paralysed persons. However, the use of cortical signals to control a multi-jointed prosthetic device for direct real-time interaction with the physical environment ('embodiment') has not been demonstrated. Here we describe a system that permits embodied prosthetic control; we show how monkeys (Macaca mulatta) use their motor cortical activity to control a mechanized arm replica in a self-feeding task. In addition to the three dimensions of movement, the subjects' cortical signals also proportionally controlled a gripper on the end of the arm. Owing to the physical interaction between the monkey, the robotic arm and objects in the workspace, this new task presented a higher level of difficulty than previous virtual (cursor-control) experiments. Apart from an example of simple one-dimensional control, previous experiments have lacked physical interaction even in cases where a robotic arm or hand was included in the control loop, because the subjects did not use it to interact with physical objects, an interaction that cannot be fully simulated. This demonstration of multi-degree-of-freedom embodied prosthetic control paves the way towards the development of dexterous prosthetic devices that could ultimately achieve arm and hand function at a near-natural level.
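The abstract does not specify the decoding algorithm, but this group's earlier cursor-control work used population-vector-style decoders, in which each neuron "votes" for its preferred movement direction in proportion to its rate modulation. The sketch below illustrates that general idea under simple assumptions (cosine tuning, simulated units, made-up rates and baselines); it is not the paper's actual implementation.

```python
import numpy as np

def population_vector_decode(rates, baselines, preferred_dirs):
    """Decode an intended movement vector from a neural population.

    Each neuron contributes its preferred-direction vector, weighted by
    how far its firing rate deviates from baseline (the classic
    population-vector idea). rates and baselines are (N,) arrays;
    preferred_dirs is an (N, D) array of unit vectors.
    """
    weights = rates - baselines               # per-neuron rate modulation
    return (weights @ preferred_dirs) / len(rates)

# Simulated cosine-tuned units (all parameters here are illustrative).
rng = np.random.default_rng(0)
N = 200
pd = rng.normal(size=(N, 3))
pd /= np.linalg.norm(pd, axis=1, keepdims=True)  # random unit preferred dirs
baseline = np.full(N, 20.0)                       # resting rate, spikes/s
target = np.array([1.0, 0.0, 0.0])                # intended movement direction
rates = baseline + 10.0 * (pd @ target)           # cosine tuning model

vec = population_vector_decode(rates, baseline, pd)
dirhat = vec / np.linalg.norm(vec)  # decoded direction approximates target
```

In the actual experiment a fourth, analogous output dimension proportionally drove the gripper aperture alongside the three translational dimensions.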