Kiani Fatemeh, Yin Jun, Wang Zhongrui, Yang J Joshua, Xia Qiangfei
Department of Electrical and Computer Engineering, University of Massachusetts Amherst, Amherst, MA 01003, USA.
Sci Adv. 2021 Nov 26;7(48):eabj4801. doi: 10.1126/sciadv.abj4801. Epub 2021 Nov 24.
Memristive crossbar arrays promise substantial improvements in computing throughput and power efficiency through in-memory analog computing. Previous machine learning demonstrations with memristive arrays, however, relied on software or digital processors to implement some critical functionalities, leading to frequent analog/digital conversions and more complicated hardware that compromises the energy efficiency and computing parallelism. Here, we show that, by implementing the activation function of a neural network in analog hardware, analog signals can be transmitted to the next layer without unnecessary digital conversion, communication, and processing. We have designed and built compact rectified linear units, with which we constructed a two-layer perceptron using memristive crossbar arrays, and demonstrated a recognition accuracy of 93.63% on the Modified National Institute of Standards and Technology (MNIST) handwritten digits dataset. The fully hardware-implemented neural network reduces both data shuttling and conversion, and is capable of delivering much higher computing throughput and power efficiency.
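The computation the abstract describes can be illustrated in software: a memristive crossbar performs a matrix-vector multiplication in the analog domain (Ohm's law and Kirchhoff's current law), a rectified linear unit passes the result to the next layer without digitization, and a second crossbar produces the class outputs. The sketch below is a minimal, idealized NumPy model of that signal path; the layer sizes, random conductances, and differential-pair weight encoding are illustrative assumptions, not the devices or dimensions used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 64 analog inputs, 16 hidden ReLU units, 10 classes.
n_in, n_hid, n_out = 64, 16, 10

# Signed weights encoded as a difference of two device conductances (G+ - G-),
# a common differential-pair scheme for memristive crossbars (illustrative here).
g_pos1 = rng.uniform(0.0, 1.0, (n_hid, n_in))
g_neg1 = rng.uniform(0.0, 1.0, (n_hid, n_in))
g_pos2 = rng.uniform(0.0, 1.0, (n_out, n_hid))
g_neg2 = rng.uniform(0.0, 1.0, (n_out, n_hid))

def crossbar_mvm(g_pos, g_neg, v):
    """In-memory analog MVM: output currents I = (G+ - G-) @ V
    (Ohm's law per device, Kirchhoff's current law per column)."""
    return (g_pos - g_neg) @ v

def relu(i):
    """Rectified linear unit: pass positive currents, clamp negative ones to zero."""
    return np.maximum(i, 0.0)

# Forward pass of the two-layer perceptron. The hidden activations stay
# "analog" (no quantization step is modeled between layers), mirroring the
# all-analog signal path the paper demonstrates in hardware.
v_in = rng.uniform(0.0, 0.5, n_in)                      # input voltages
i_hidden = relu(crossbar_mvm(g_pos1, g_neg1, v_in))     # layer 1 + analog ReLU
i_out = crossbar_mvm(g_pos2, g_neg2, i_hidden)          # layer 2 (class currents)
pred = int(np.argmax(i_out))                            # readout
print(pred)
```

In the actual hardware, avoiding the analog-to-digital conversion between `i_hidden` and the second crossbar is precisely what removes the data shuttling and conversion overhead the abstract highlights; this model only mimics that by keeping the intermediate values as plain floating-point currents.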