
Learning to Generate Sequences with Combination of Hebbian and Non-hebbian Plasticity in Recurrent Spiking Neural Networks.

Authors

Priyadarshini Panda, Kaushik Roy

Affiliation

Nanoelectronics Research Laboratory, Purdue University, School of Electrical and Computer Engineering, West Lafayette, IN, United States.

Publication

Front Neurosci. 2017 Dec 12;11:693. doi: 10.3389/fnins.2017.00693. eCollection 2017.

Abstract

Synaptic plasticity, the foundation for learning and memory formation in the human brain, manifests in various forms. Here, we combine standard spike-timing-correlation-based Hebbian plasticity with a non-Hebbian synaptic decay mechanism to train a recurrent spiking neural model to generate sequences. We show that including an adaptive decay of synaptic weights alongside standard STDP helps the model learn stable contextual dependencies between temporal sequences, while reducing the strong attractor states that emerge in recurrent models due to feedback loops. Furthermore, we show that the combined learning scheme substantially suppresses chaotic activity in the recurrent model, thereby enhancing its ability to generate sequences consistently even in the presence of perturbations.
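The combined rule described in the abstract — pair-based STDP plus a non-Hebbian, activity-dependent weight decay — can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: the function names, the trace-based STDP form, the decay expression, and all constants (`a_plus`, `a_minus`, `lam`) are assumptions chosen for clarity.

```python
def stdp_update(w, pre_trace, post_trace, pre_spike, post_spike,
                a_plus=0.01, a_minus=0.012):
    """Pair-based STDP (Hebbian part, illustrative constants).

    Potentiate when the postsynaptic neuron spikes shortly after the
    presynaptic one (captured by pre_trace); depress when the presynaptic
    neuron spikes after the postsynaptic one (captured by post_trace).
    """
    dw = a_plus * post_spike * pre_trace - a_minus * pre_spike * post_trace
    return w + dw


def adaptive_decay(w, activity, lam=0.001):
    """Non-Hebbian adaptive decay (hypothetical form).

    Weights relax toward zero at a rate scaled by how inactive the synapse
    has recently been, so rarely used connections fade faster and strong
    attractor states are weakened.
    """
    return w * (1.0 - lam * (1.0 - activity))


# Toy usage: one potentiating STDP event, then decay on an inactive synapse.
w = 0.5
w = stdp_update(w, pre_trace=1.0, post_trace=0.0, pre_spike=0, post_spike=1)
w = adaptive_decay(w, activity=0.0)
```

In this sketch the two mechanisms are applied in sequence each time step: STDP moves weights according to spike-timing correlations, while the decay term continuously pulls all weights down, which is the intended stabilizing counterweight to the positive feedback of recurrent Hebbian learning.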


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/944f/5733011/7d02dbaca6e0/fnins-11-00693-g0001.jpg
