Limbacher Thomas, Ozdenizci Ozan, Legenstein Robert
IEEE Trans Neural Netw Learn Syst. 2025 Feb;36(2):2551-2562. doi: 10.1109/TNNLS.2023.3341446. Epub 2025 Feb 6.
Spiking neural networks (SNNs) are the basis for many energy-efficient neuromorphic hardware systems. While there has been substantial progress in SNN research, artificial SNNs still lack many capabilities of their biological counterparts. In biological neural systems, memory is a key component that enables the retention of information over a wide range of temporal scales, from hundreds of milliseconds up to years. While Hebbian plasticity is believed to play a pivotal role in biological memory, it has so far been analyzed mostly in the context of pattern completion and unsupervised learning in artificial neural networks and SNNs. Here, we propose that Hebbian plasticity is fundamental for computations in biological and artificial spiking neural systems. We introduce a novel memory-augmented SNN architecture that is enriched by Hebbian synaptic plasticity. We show that Hebbian enrichment renders SNNs surprisingly versatile in terms of both their computational and their learning capabilities. It improves their abilities for out-of-distribution generalization, one-shot learning, cross-modal generative association, language processing, and reward-based learning. This suggests that powerful cognitive neuromorphic systems can be built on this principle.
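For illustration, the following is a minimal sketch (in Python/NumPy, not the authors' implementation) of the kind of Hebbian key-value memory the abstract alludes to: an association is written by a one-shot Hebbian outer-product update and retrieved by matrix-vector association. The class name, learning rate, and decay parameter are hypothetical choices for this example.

import numpy as np

class HebbianMemory:
    """Toy Hebbian key-value memory (illustrative only)."""

    def __init__(self, key_dim, value_dim, eta=1.0, decay=0.0):
        self.W = np.zeros((value_dim, key_dim))  # association weights
        self.eta = eta      # Hebbian learning rate (hypothetical value)
        self.decay = decay  # optional forgetting factor (hypothetical)

    def write(self, key, value):
        # Hebbian outer-product update: co-active key and value units
        # strengthen the corresponding association weights.
        self.W = (1.0 - self.decay) * self.W + self.eta * np.outer(value, key)

    def read(self, key):
        # Retrieve the value associated with a (possibly noisy) key.
        return self.W @ key

rng = np.random.default_rng(0)
key = rng.standard_normal(64)
key /= np.linalg.norm(key)          # unit-norm key for clean recall
value = rng.standard_normal(32)

mem = HebbianMemory(key_dim=64, value_dim=32)
mem.write(key, value)               # a single Hebbian update stores the pair
print(np.allclose(mem.read(key), value))  # True

A single write followed by accurate recall is what makes such a module a plausible substrate for the one-shot learning ability the abstract reports: the association is stored by one Hebbian update rather than by many gradient steps.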