


LIAF-Net: Leaky Integrate and Analog Fire Network for Lightweight and Efficient Spatiotemporal Information Processing.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2022 Nov;33(11):6249-6262. doi: 10.1109/TNNLS.2021.3073016. Epub 2022 Oct 27.

DOI: 10.1109/TNNLS.2021.3073016
PMID: 33979292
Abstract

Spiking neural networks (SNNs) based on the leaky integrate and fire (LIF) model have been applied to energy-efficient temporal and spatiotemporal processing tasks. Thanks to its bioplausible neuronal dynamics and simplicity, the LIF-SNN benefits from event-driven processing; however, it usually suffers reduced performance, likely because neurons in an LIF-SNN transmit information only via binary spikes. To address this issue, this work proposes a leaky integrate and analog fire (LIAF) neuron model in which analog values are transmitted among neurons, and builds on it a deep network termed LIAF-Net for efficient spatiotemporal processing. In the temporal domain, LIAF follows the traditional LIF dynamics to retain its temporal processing capability. In the spatial domain, LIAF integrates spatial information through convolutional or fully connected integration. As a spatiotemporal layer, LIAF can also be used jointly with traditional artificial neural network (ANN) layers. In addition, the resulting network can be trained directly with backpropagation through time (BPTT), avoiding the performance loss caused by ANN-to-SNN conversion. Experimental results indicate that LIAF-Net achieves performance comparable to the gated recurrent unit (GRU) and long short-term memory (LSTM) on the bAbI question answering (QA) tasks, and state-of-the-art performance on spatiotemporal dynamic vision sensor (DVS) datasets, including MNIST-DVS, CIFAR10-DVS, and DVS128 Gesture, with far fewer synaptic weights and much lower computational overhead than traditional networks built from LSTM, GRU, convolutional LSTM (ConvLSTM), or 3-D convolution (Conv3D) layers. Compared with the traditional LIF-SNN, LIAF-Net also shows dramatic accuracy gains across all these experiments. In conclusion, LIAF-Net provides a framework that combines the advantages of ANNs and SNNs for lightweight and efficient spatiotemporal information processing.
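The abstract's core idea (LIF-style leaky integration and threshold-triggered reset in time, but an analog rather than binary output) can be sketched for a single timestep as below. This is a minimal illustration, assuming a ReLU analog activation and a soft (subtract-threshold) reset; the function name `liaf_step` and the parameters `alpha` and `v_th` are hypothetical, and the paper's exact activation and reset rules may differ.

```python
import numpy as np

def liaf_step(u_prev, x, alpha=0.5, v_th=1.0):
    """One timestep of a LIAF-style neuron (illustrative sketch).

    u_prev : membrane potential carried over from the previous timestep
    x      : integrated input at this step (e.g. a conv or FC layer output)
    alpha  : leak decay factor of the LIF dynamics
    v_th   : firing threshold, used here only for the reset as in LIF
    """
    u = alpha * u_prev + x                 # leaky integration (LIF dynamics)
    spike = u >= v_th                      # threshold crossing, as in LIF
    out = np.maximum(u, 0.0)               # "analog fire": transmit a real value
    u_next = np.where(spike, u - v_th, u)  # soft reset where the threshold fired
    return out, u_next
```

Because `out` is real-valued rather than a 0/1 spike, the layer behaves like a recurrent ANN unit and can be trained with BPTT, while the leak and reset preserve LIF-like temporal dynamics.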


Similar Articles

1
LIAF-Net: Leaky Integrate and Analog Fire Network for Lightweight and Efficient Spatiotemporal Information Processing.
IEEE Trans Neural Netw Learn Syst. 2022 Nov;33(11):6249-6262. doi: 10.1109/TNNLS.2021.3073016. Epub 2022 Oct 27.
2
Optimizing Deeper Spiking Neural Networks for Dynamic Vision Sensing.
Neural Netw. 2021 Dec;144:686-698. doi: 10.1016/j.neunet.2021.09.022. Epub 2021 Oct 5.
3
High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron.
Front Neurosci. 2023 Mar 8;17:1141701. doi: 10.3389/fnins.2023.1141701. eCollection 2023.
4
An Event-Driven Classifier for Spiking Neural Networks Fed with Synthetic or Dynamic Vision Sensor Data.
Front Neurosci. 2017 Jun 28;11:350. doi: 10.3389/fnins.2017.00350. eCollection 2017.
5
Sharing leaky-integrate-and-fire neurons for memory-efficient spiking neural networks.
Front Neurosci. 2023 Jul 31;17:1230002. doi: 10.3389/fnins.2023.1230002. eCollection 2023.
6
Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks.
Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. eCollection 2018.
7
KLIF: An Optimized Spiking Neuron Unit for Tuning Surrogate Gradient Function.
Neural Comput. 2024 Nov 19;36(12):2636-2650. doi: 10.1162/neco_a_01712.
8
Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures.
Front Neurosci. 2020 Feb 28;14:119. doi: 10.3389/fnins.2020.00119. eCollection 2020.
9
Comparing SNNs and RNNs on neuromorphic vision datasets: Similarities and differences.
Neural Netw. 2020 Dec;132:108-120. doi: 10.1016/j.neunet.2020.08.001. Epub 2020 Aug 17.
10
Electrocardiography Classification with Leaky Integrate-and-Fire Neurons in an Artificial Neural Network-Inspired Spiking Neural Network Framework.
Sensors (Basel). 2024 May 26;24(11):3426. doi: 10.3390/s24113426.

Cited By

1
Biologically inspired hybrid model for Alzheimer's disease classification using structural MRI in the ADNI dataset.
Front Artif Intell. 2025 Jun 19;8:1590599. doi: 10.3389/frai.2025.1590599. eCollection 2025.
2
Fourier or Wavelet bases as counterpart self-attention in spikformer for efficient visual classification.
Front Neurosci. 2025 Jan 29;18:1516868. doi: 10.3389/fnins.2024.1516868. eCollection 2024.
3
Sg-snn: a self-organizing spiking neural network based on temporal information.
Cogn Neurodyn. 2025 Dec;19(1):14. doi: 10.1007/s11571-024-10199-6. Epub 2025 Jan 9.
4
Auto-Spikformer: Spikformer architecture search.
Front Neurosci. 2024 Jul 23;18:1372257. doi: 10.3389/fnins.2024.1372257. eCollection 2024.
5
Brain-inspired chaotic spiking backpropagation.
Natl Sci Rev. 2024 Jan 30;11(6):nwae037. doi: 10.1093/nsr/nwae037. eCollection 2024 Jun.
6
Exploiting noise as a resource for computation and learning in spiking neural networks.
Patterns (N Y). 2023 Sep 4;4(10):100831. doi: 10.1016/j.patter.2023.100831. eCollection 2023 Oct 13.
7
BIDL: a brain-inspired deep learning framework for spatiotemporal processing.
Front Neurosci. 2023 Jul 26;17:1213720. doi: 10.3389/fnins.2023.1213720. eCollection 2023.
8
Spiking neural network with working memory can integrate and rectify spatiotemporal features.
Front Neurosci. 2023 Jun 14;17:1167134. doi: 10.3389/fnins.2023.1167134. eCollection 2023.
9
STSC-SNN: Spatio-Temporal Synaptic Connection with temporal convolution and attention for spiking neural networks.
Front Neurosci. 2022 Dec 23;16:1079357. doi: 10.3389/fnins.2022.1079357. eCollection 2022.
10
ES-ImageNet: A Million Event-Stream Classification Dataset for Spiking Neural Networks.
Front Neurosci. 2021 Nov 25;15:726582. doi: 10.3389/fnins.2021.726582. eCollection 2021.