DeWolf Travis, Jaworski Pawel, Eliasmith Chris
Applied Brain Research, Waterloo, ON, Canada.
Centre for Theoretical Neuroscience, University of Waterloo, Waterloo, ON, Canada.
Front Neurorobot. 2020 Oct 9;14:568359. doi: 10.3389/fnbot.2020.568359. eCollection 2020.
In this paper we demonstrate how the Nengo neural modeling and simulation libraries enable users to quickly develop robotic perception and action neural networks for simulation on neuromorphic hardware using tools they are already familiar with, such as Keras and Python. We identify four primary challenges in building robust, embedded neurorobotic systems: (1) developing infrastructure for interfacing with the environment and sensors; (2) processing task-specific sensory signals; (3) generating robust, explainable control signals; and (4) compiling neural networks to run on target hardware. Nengo helps to address these challenges by: (1) providing the NengoInterfaces library, which defines a simple but powerful API for users to interact with simulations and hardware; (2) providing the NengoDL library, which lets users build Nengo models with the Keras and TensorFlow APIs; (3) implementing the Neural Engineering Framework, which provides white-box methods for implementing known functions and circuits; and (4) providing multiple backend libraries, such as NengoLoihi, that enable users to compile the same model to different hardware. To demonstrate two variations on this workflow, we present two examples that use Nengo to develop neural networks running on CPUs and GPUs as well as on Intel's neuromorphic chip, Loihi. The first example is an end-to-end spiking neural network implemented in Nengo that controls a rover simulated in Mujoco. The network integrates a deep convolutional network, which processes visual input from cameras mounted on the rover to track a target, with a control system that implements steering and drive functions in connection weights to guide the rover to the target. The second example uses Nengo as a smaller component of a system that has addressed some, but not all, of these challenges. Specifically, Nengo is used to augment a force-based operational space controller with neural adaptive control, improving performance during a reaching task with a real-world Kinova Jaco robotic arm. The code and implementation details are provided, with the intent of enabling other researchers to build and run their own neurorobotic systems.
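The following sketch (not drawn from the paper's released code) illustrates the backend-swapping workflow the abstract describes: a small Nengo network whose connection weights implement a known function via the Neural Engineering Framework is simulated with the reference CPU backend and then, unchanged, with the NengoLoihi backend. The network sizes, the toy input signal, and the gain function are illustrative assumptions.

# Minimal sketch of the Nengo workflow described in the abstract.
# Assumptions: toy 1-D signal, 200-neuron ensembles, and a simple gain
# function standing in for the paper's steering/drive control functions.
import numpy as np
import nengo

with nengo.Network(seed=0) as model:
    # Sensory input: a toy signal standing in for a processed sensor value.
    stim = nengo.Node(lambda t: np.sin(2 * np.pi * t))

    # An ensemble of spiking neurons representing that value.
    ens = nengo.Ensemble(n_neurons=200, dimensions=1)
    nengo.Connection(stim, ens)

    # NEF white-box method: solve for connection weights that compute a
    # known function directly, rather than learning it from data.
    control = nengo.Ensemble(n_neurons=200, dimensions=1)
    nengo.Connection(ens, control, function=lambda x: -2.0 * x)

    probe = nengo.Probe(control, synapse=0.01)

# Run on CPU with the reference backend...
with nengo.Simulator(model) as sim:
    sim.run(1.0)
    cpu_output = sim.data[probe]

# ...then compile the same model for Loihi (or its emulator) simply by
# swapping the simulator; requires the nengo-loihi package.
try:
    import nengo_loihi
    with nengo_loihi.Simulator(model) as sim:
        sim.run(1.0)
        loihi_output = sim.data[probe]
except ImportError:
    pass

In the paper's rover example, the same pattern is applied at a larger scale: the convolutional vision network is built through NengoDL's Keras interface, the steering and drive functions are solved into connection weights, and the combined model is compiled for Loihi by swapping the backend simulator.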