

Exploring the Connection Between Binary and Spiking Neural Networks.

Authors

Lu Sen, Sengupta Abhronil

Affiliations

School of Electrical Engineering and Computer Science, The Pennsylvania State University, University Park, PA, United States.

Publication information

Front Neurosci. 2020 Jun 24;14:535. doi: 10.3389/fnins.2020.00535. eCollection 2020.

Abstract

On-chip edge intelligence has necessitated the exploration of algorithmic techniques to reduce the compute requirements of current machine learning frameworks. This work aims to bridge the recent algorithmic progress in training Binary Neural Networks and Spiking Neural Networks, both of which are driven by the same motivation, and yet the synergies between the two have not been fully explored. We show that training Spiking Neural Networks in the extreme quantization regime results in near full precision accuracies on large-scale datasets like CIFAR-100 and ImageNet. An important implication of this work is that Binary Spiking Neural Networks can be enabled by "In-Memory" hardware accelerators catered for Binary Neural Networks without suffering any accuracy degradation due to binarization. We utilize standard training techniques for non-spiking networks to generate our spiking networks through a conversion process, and also perform an extensive empirical analysis and explore simple design-time and run-time optimization techniques for reducing inference latency of spiking networks (both for binary and full-precision models) by an order of magnitude over prior work. Our implementation source code and trained models are available at https://github.com/NeuroCompLab-psu/SNN-Conversion.
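To make the conversion idea in the abstract concrete, below is a minimal, self-contained sketch (not code from the authors' repository) of the standard ANN-to-SNN conversion recipe it builds on: full-precision weights are binarized with a sign function, and the firing rate of integrate-and-fire (IF) neurons over a window of timesteps approximates the ReLU activation of the equivalent non-spiking layer. The threshold normalization below mimics simple threshold balancing; all function names and parameters are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of binary-weight ANN-to-SNN conversion with rate coding.
# Assumed/illustrative names: binarize, if_layer_rate, v_th, T.
import numpy as np

def binarize(w):
    """Binarize full-precision weights to {-1, +1} (deterministic sign)."""
    return np.where(w >= 0, 1.0, -1.0)

def if_layer_rate(x_rates, w_bin, v_th=1.0, T=2000, seed=0):
    """Run an IF layer for T steps on rate-coded inputs; return firing rates."""
    rng = np.random.default_rng(seed)
    v = np.zeros(w_bin.shape[0])           # membrane potentials
    spike_count = np.zeros(w_bin.shape[0])
    for _ in range(T):
        # Input neurons spike with probability equal to their rate (Poisson-like coding).
        in_spikes = (rng.random(x_rates.shape) < x_rates).astype(float)
        v += w_bin @ in_spikes              # integrate binary-weighted input
        fired = v >= v_th
        spike_count += fired
        v[fired] -= v_th                    # "soft reset" by subtracting the threshold
    return spike_count / T

# Usage: output firing rates (scaled by v_th) roughly track ReLU(W_b x).
x = np.array([0.2, 0.8, 0.5])                       # input rates in [0, 1]
w = np.random.default_rng(1).normal(size=(4, 3))    # full-precision weights
w_b = binarize(w)
relu_out = np.maximum(w_b @ x, 0.0)
v_th = max(relu_out.max(), 1e-8)                    # crude threshold balancing
rates = if_layer_rate(x, w_b, v_th=v_th)
print("ANN ReLU output:  ", relu_out)
print("SNN rate estimate:", rates * v_th)
```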


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a69/7327094/d9ee0b2e92c8/fnins-14-00535-g0001.jpg
