IEEE Trans Neural Netw Learn Syst. 2021 Jul;32(7):2809-2824. doi: 10.1109/TNNLS.2020.3009047. Epub 2021 Jul 6.
Neural networks (NNs) are effective machine learning models that require significant hardware resources and energy consumption in their computing process. To implement NNs, stochastic computing (SC) has been proposed to achieve a tradeoff between hardware efficiency and computing performance. In an SC NN, hardware requirements and power consumption are significantly reduced by moderately sacrificing inference accuracy and computation speed. With recent developments in SC techniques, however, the performance of SC NNs has been substantially improved, making it comparable with that of conventional binary designs while utilizing less hardware. In this article, we begin with the design of a basic SC neuron and then survey different types of SC NNs, including multilayer perceptrons, deep belief networks, convolutional NNs, and recurrent NNs. Recent progress in SC designs that further improve the hardware efficiency and performance of NNs is subsequently discussed. The generality and versatility of SC NNs are illustrated for both the training and inference processes. Finally, the advantages and challenges of SC NNs are discussed with respect to their binary counterparts.
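As background for the SC operations underlying the neuron designs surveyed in the article, the sketch below illustrates the basic idea of stochastic computing in software. It assumes unipolar coding, where a value in [0, 1] is represented by the probability of a 1 in a random bitstream; under that assumption, a bitwise AND realizes multiplication and a multiplexer realizes scaled addition. This is only an illustrative model of the arithmetic, not the specific hardware neuron architecture described in the article.

```python
import numpy as np

rng = np.random.default_rng(0)


def to_unipolar_stream(x, length):
    """Encode a value x in [0, 1] as a unipolar stochastic bitstream:
    each bit is independently 1 with probability x."""
    return (rng.random(length) < x).astype(np.uint8)


def from_unipolar_stream(stream):
    """Decode a unipolar bitstream back to a value in [0, 1]
    as the fraction of 1s."""
    return stream.mean()


length = 4096
a, b = 0.6, 0.5
sa = to_unipolar_stream(a, length)
sb = to_unipolar_stream(b, length)

# Multiplication of two unipolar values is a bitwise AND of their streams,
# since P(a AND b) = P(a) * P(b) for independent bitstreams.
product = from_unipolar_stream(sa & sb)
print(f"SC product  ~ {product:.3f} (exact {a * b:.3f})")

# Scaled addition: a 2-to-1 multiplexer whose select stream has
# probability 0.5 outputs a stream encoding (a + b) / 2.
select = to_unipolar_stream(0.5, length)
scaled_sum = from_unipolar_stream(np.where(select == 1, sa, sb))
print(f"SC scaled sum ~ {scaled_sum:.3f} (exact {(a + b) / 2:.3f})")
```

The longer the bitstream, the lower the variance of the decoded result, which is the accuracy-versus-latency tradeoff that the abstract refers to.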