Prasanna Date, Shruti Kulkarni, Aaron Young, Catherine Schuman, Thomas Potok, Jeffrey Vetter
Oak Ridge National Laboratory, Oak Ridge, TN, 37830, USA.
University of Tennessee, Knoxville, TN, 37996, USA.
Sci Rep. 2023 Jul 6;13(1):10975. doi: 10.1038/s41598-023-35005-x.
Neuromorphic computers emulate the human brain while being extremely power efficient for computing tasks. In fact, they are poised to be critical for energy-efficient computing in the future. Neuromorphic computers are primarily used in spiking neural network-based machine learning applications. However, they are known to be Turing-complete and can, in theory, perform all general-purpose computation. One of the biggest bottlenecks in realizing general-purpose computation on neuromorphic computers today is the inability to encode data efficiently on these systems. To fully realize the potential of neuromorphic computers for energy-efficient general-purpose computing, efficient mechanisms must be devised for encoding numbers. Current encoding mechanisms (e.g., binning, rate-based encoding, and time-based encoding) have limited applicability and are not suited for general-purpose computation. In this paper, we present the virtual neuron abstraction as a mechanism for encoding and adding integers and rational numbers by using spiking neural network primitives. We evaluate the performance of the virtual neuron on physical and simulated neuromorphic hardware. We estimate that the virtual neuron could perform an addition operation using just 23 nJ of energy on average on a mixed-signal, memristor-based neuromorphic processor. We also demonstrate the utility of the virtual neuron by using it in some of the μ-recursive functions, which are the building blocks of general-purpose computation.
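The abstract does not give the virtual neuron's construction, but it does name rate-based encoding as one of the existing mechanisms. As a rough, hypothetical illustration of the general idea of spike-based arithmetic (not the paper's virtual neuron), the sketch below encodes two non-negative integers as spike counts and lets an idealized integrate-and-fire neuron perform addition; all function names and the model itself are assumptions for illustration only.

```python
# Hypothetical sketch of rate-based encoding and spike-count addition.
# NOT the paper's virtual neuron abstraction; an idealized model chosen
# to illustrate one of the encoding styles the abstract mentions.

def encode_rate(value, window):
    """Encode a non-negative integer as `value` spikes within `window` time steps."""
    return [1 if t < value else 0 for t in range(window)]

def add_with_if_neuron(train_a, train_b, threshold=1):
    """Idealized integrate-and-fire neuron: accumulates input spikes and
    emits one output spike each time its potential crosses the threshold.
    (Allows multiple output spikes per step, unlike most real hardware.)"""
    potential = 0
    out = []
    for sa, sb in zip(train_a, train_b):
        potential += sa + sb
        spikes = 0
        while potential >= threshold:
            potential -= threshold
            spikes += 1
        out.append(spikes)
    return out

def decode_rate(train):
    """Decode a spike train back to an integer by counting spikes."""
    return sum(train)

window = 16  # must be >= a + b for the count to be recoverable
a, b = 5, 7
result = decode_rate(add_with_if_neuron(encode_rate(a, window),
                                        encode_rate(b, window)))
print(result)  # 12
```

Note the drawback the abstract alludes to: rate-based encoding needs a time window at least as long as the sum itself, so its cost grows linearly with the magnitude of the operands rather than with the number of bits.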