Shi Yuhan, Nguyen Leon, Oh Sangheon, Liu Xin, Kuzum Duygu
Electrical and Computer Engineering Department, University of California, San Diego, San Diego, CA, United States.
Front Neurosci. 2019 Apr 26;13:405. doi: 10.3389/fnins.2019.00405. eCollection 2019.
Inspired by the computational efficiency of the biological brain, spiking neural networks (SNNs) emulate biological neural networks in their neural codes, dynamics, and circuitry. SNNs show great potential for implementing unsupervised learning with in-memory computing. Here, we report an algorithmic optimization that improves the energy efficiency of online learning with SNNs on emerging non-volatile memory (eNVM) devices. We develop a pruning method for SNNs that exploits the output firing characteristics of neurons. Unlike previous approaches in the literature, which prune already-trained networks, our method can be applied during network training, preventing unnecessary updates of network parameters. This algorithmic optimization complements the energy efficiency of eNVM technology, which offers a unique in-memory computing platform for parallelizing neural network operations. Our SNN maintains ~90% classification accuracy on the MNIST dataset with up to ~75% pruning, significantly reducing the number of weight updates. The SNN and pruning scheme developed in this work can pave the way toward eNVM-based neuro-inspired systems for energy-efficient online learning in low-power applications.
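To make the pruning idea concrete, the following is a minimal Python/NumPy sketch of activity-based pruning applied during unsupervised SNN training. It is an illustration under assumptions, not the paper's implementation: the STDP-like update rule, the firing-count pruning criterion, the function names train_step and prune, and the random spike trains standing in for LIF neuron dynamics and MNIST inputs are all hypothetical.

# Sketch: prune low-activity output neurons mid-training so that pruned
# neurons receive no further weight updates (the source of the energy
# savings on eNVM hardware). Details are assumptions, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT = 784, 100                 # e.g., MNIST pixels -> output neurons
weights = rng.uniform(0.0, 1.0, size=(N_IN, N_OUT))
active = np.ones(N_OUT, dtype=bool)    # mask: which output neurons still learn
spike_counts = np.zeros(N_OUT)         # accumulated output firing activity

def train_step(input_spikes, out_spikes, lr=0.01):
    """STDP-like potentiation restricted to unpruned neurons."""
    global weights
    upd = np.outer(input_spikes, out_spikes) * lr
    upd[:, ~active] = 0.0              # skip weight updates for pruned neurons
    weights = np.clip(weights + upd, 0.0, 1.0)
    spike_counts[:] += out_spikes      # track output firing characteristics

def prune(fraction=0.75):
    """Prune the least-active fraction of output neurons (assumed criterion)."""
    k = int(fraction * N_OUT)
    idx = np.argsort(spike_counts)[:k] # neurons with the lowest firing counts
    active[idx] = False

# Toy usage: a few training steps, then prune partway through training.
for _ in range(100):
    x = (rng.random(N_IN) < 0.05).astype(float)   # Poisson-like input spikes
    y = (rng.random(N_OUT) < 0.10).astype(float)  # stand-in for LIF outputs
    train_step(x, y)
prune(0.75)

After pruning, every subsequent train_step skips updates to the pruned columns of the weight matrix; on an eNVM crossbar this would translate into fewer device programming operations, which is where the reported energy savings come from.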