Gou Shuiping, Fu Jiahui, Sha Yu, Cao Zhen, Guo Zhang, Eshraghian Jason K, Li Ruimin, Jiao Licheng
Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, China.
Department of Electrical and Computer Engineering, University of California, Santa Cruz, Santa Cruz, CA, United States.
Front Neurosci. 2025 Mar 25;19:1545583. doi: 10.3389/fnins.2025.1545583. eCollection 2025.
Spiking neural networks (SNNs), which draw from biological neuron models, have the potential to improve the computational efficiency of artificial neural networks (ANNs) due to their event-driven nature and sparse data flow. SNNs rely on dynamical sparsity: neurons are trained to activate sparsely to minimize data communication. This is critical for hardware, given the bandwidth limitations between memory and processor. Because neurons are sparsely activated, weights are accessed less frequently and can potentially be pruned with less performance degradation in an SNN than in an equivalent ANN. Reducing the number of synaptic connections between neurons also relaxes memory demands for neuromorphic processors. In this paper, we propose a spatio-temporal pruning algorithm that dynamically adapts to reduce the temporal redundancy that often exists in SNNs when processing Dynamic Vision Sensor (DVS) datasets. Spatial pruning is executed based on both global parameter statistics and inter-layer parameter counts, and is shown to reduce model degradation under extreme sparsity. We provide an ablation study that isolates the various components of spatio-temporal pruning and find that our approach achieves excellent performance across all datasets, with especially strong results on datasets with time-varying features. We achieved a 0.69% accuracy improvement on the DVS128 Gesture dataset, despite the common expectation that pruning degrades performance. Notably, this gain comes with a 98.18% reduction in parameter count and a 50% reduction in temporal redundancy.
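The abstract's spatial pruning criterion combines global parameter statistics with inter-layer parameter counts. As a rough illustration of the global-statistics half only, the sketch below prunes by a single magnitude threshold computed over all layers at once; the function name, layer names, and target sparsity are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def global_magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude weights across ALL layers jointly.

    weights  : dict mapping layer name -> ndarray of synaptic weights
    sparsity : fraction of weights to remove, in [0, 1)
    Returns (pruned weight dict, boolean keep-mask dict).
    """
    # Global statistic: pool every weight magnitude from every layer.
    all_mags = np.concatenate([np.abs(w).ravel() for w in weights.values()])
    # Magnitude below which `sparsity` of all weights fall.
    threshold = np.quantile(all_mags, sparsity)
    pruned, masks = {}, {}
    for name, w in weights.items():
        mask = np.abs(w) > threshold   # keep only weights above the threshold
        pruned[name] = w * mask
        masks[name] = mask
    return pruned, masks

# Hypothetical two-layer SNN weight set for demonstration.
rng = np.random.default_rng(0)
layers = {"conv1": rng.normal(size=(16, 2, 3, 3)),
          "fc":    rng.normal(size=(10, 128))}
pruned, masks = global_magnitude_prune(layers, sparsity=0.9)
kept = sum(int(m.sum()) for m in masks.values())
total = sum(m.size for m in masks.values())
print(round(1 - kept / total, 2))  # achieved overall sparsity
```

Because the threshold is global rather than per-layer, layers with many small weights lose proportionally more connections, which is one motivation for the paper's additional inter-layer parameter-count term (not modeled here).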