


Constructing an Associative Memory System Using Spiking Neural Network.

Authors

He Hu, Shang Yingjie, Yang Xu, Di Yingze, Lin Jiajun, Zhu Yimeng, Zheng Wenhao, Zhao Jinfeng, Ji Mengyao, Dong Liya, Deng Ning, Lei Yunlin, Chai Zenghao

Affiliations

Institute of Microelectronics, Tsinghua University, Beijing, China.

School of Computer Science and Technology, Beijing Institute of Technology, Beijing, China.

Publication

Front Neurosci. 2019 Jul 3;13:650. doi: 10.3389/fnins.2019.00650. eCollection 2019.

DOI: 10.3389/fnins.2019.00650
PMID: 31333397
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6615473/
Abstract

The development of computer science has led to the blooming of artificial intelligence (AI), and neural networks are the core of AI research. Although mainstream neural networks have done well in image processing and speech recognition, they perform poorly in models aimed at understanding contextual information. In our opinion, the reason is that the essence of building a neural network through parameter training is to fit the data to statistical regularities. Since a neural network built this way does not possess memory ability, it cannot reflect causal relationships between data. Biological memory differs fundamentally from mainstream digital memory in its storage method: information stored in digital memory is converted to binary code and written to separate storage units. This physical isolation destroys the correlations within the information, so digital memory lacks the recall and association functions of biological memory, which can represent causality. In this paper, we present the results of our preliminary effort at constructing an associative memory system based on a spiking neural network. We break the network-building process into two phases: the Structure Formation Phase and the Parameter Training Phase. The Structure Formation Phase applies a learning method based on Hebb's rule to provoke neurons in the memory layer to grow new synapses to neighboring neurons in response to the specific input spike sequences fed to the network; the aim of this phase is to train the network to memorize those sequences. During the Parameter Training Phase, spike-timing-dependent plasticity (STDP) and reinforcement learning are employed to optimize synaptic weights so that the network can recall the memorized input spike sequences. The results show that our memory neural network could memorize different targets and recall the images it had memorized.
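The Structure Formation Phase described in the abstract — Hebbian growth of new synapses when a memory-layer neuron and a neighbor fire close together in time — can be sketched as follows. This is a minimal illustration under assumed details (a line-shaped memory layer, neighborhood width, and co-firing window are all invented for the example), not the authors' implementation:

```python
# Illustrative sketch of the Structure Formation Phase: a memory-layer
# neuron that fires shortly before a neighbour grows a synapse to it
# (Hebb's rule). All sizes and constants here are assumptions.
import numpy as np

N = 16                               # memory-layer neurons
WINDOW = 2                           # co-firing window (time steps)
adj = np.zeros((N, N), dtype=bool)   # adjacency matrix of grown synapses

def neighbours(i, width=4):
    """Neurons laid out on a line; neighbours lie within `width`."""
    lo, hi = max(0, i - width), min(N, i + width + 1)
    return [j for j in range(lo, hi) if j != i]

def structure_formation(spike_trains):
    """spike_trains: (T, N) boolean array of input-driven spikes.
    Grow a synapse i->j when j fires within WINDOW steps after i."""
    T = spike_trains.shape[0]
    for t in range(T):
        for i in np.flatnonzero(spike_trains[t]):
            for j in neighbours(i):
                # did neighbour j fire at time t..t+WINDOW?
                if spike_trains[t:min(T, t + WINDOW + 1), j].any():
                    adj[i, j] = True

# feed a specific spike sequence: a wave sweeping across the layer
T = N
trains = np.zeros((T, N), dtype=bool)
for t in range(T):
    trains[t, t] = True
structure_formation(trains)
print("synapses grown:", int(adj.sum()))
```

Feeding the same sequence repeatedly would leave the grown structure unchanged, which is what lets the structure itself encode the memorized pattern.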

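The Parameter Training Phase relies on STDP to tune the weights of the grown synapses. A minimal pair-based STDP update — potentiate when the presynaptic spike precedes the postsynaptic one, depress otherwise — can be sketched as below; the learning rates and time constant are illustrative, not taken from the paper:

```python
# Minimal pair-based STDP rule (illustrative constants, not the
# authors' values): potentiation for pre-before-post spike pairs,
# depression for post-before-pre pairs.
import math

A_PLUS, A_MINUS = 0.05, 0.055   # learning rates (LTP / LTD)
TAU = 20.0                       # STDP time constant (ms)

def stdp_dw(t_pre, t_post):
    """Weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:       # pre before post: potentiate
        return A_PLUS * math.exp(-dt / TAU)
    elif dt < 0:     # post before pre: depress
        return -A_MINUS * math.exp(dt / TAU)
    return 0.0

w = 0.5
for t_pre, t_post in [(10, 15), (30, 28), (50, 51)]:
    w = min(1.0, max(0.0, w + stdp_dw(t_pre, t_post)))  # clip to [0, 1]
print(f"final weight: {w:.3f}")
```

The abstract combines this local rule with reinforcement learning; a reward signal gating or scaling these weight changes would be one common way to do that.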

Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/9600305bc3f9/fnins-13-00650-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/78e95395ab9a/fnins-13-00650-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/8811f9fe8b28/fnins-13-00650-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/cfbbaf8f8550/fnins-13-00650-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/5faa8d3bf197/fnins-13-00650-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/0e2f00354c2c/fnins-13-00650-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/61e4babdc66f/fnins-13-00650-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/1d43c0e40616/fnins-13-00650-g0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/e4c1f6192a5e/fnins-13-00650-g0009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/34b8cb54d7fb/fnins-13-00650-g0010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/56270d512ce7/fnins-13-00650-g0011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/7b518143a050/fnins-13-00650-g0012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/5acd95445af8/fnins-13-00650-g0013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1306/6615473/e38169329137/fnins-13-00650-g0014.jpg

Similar articles

1
Constructing an Associative Memory System Using Spiking Neural Network.
Front Neurosci. 2019 Jul 3;13:650. doi: 10.3389/fnins.2019.00650. eCollection 2019.
2
Categorization and decision-making in a neurobiologically plausible spiking network using a STDP-like learning rule.
Neural Netw. 2013 Dec;48:109-24. doi: 10.1016/j.neunet.2013.07.012. Epub 2013 Aug 14.
3
A forecast-based STDP rule suitable for neuromorphic implementation.
Neural Netw. 2012 Aug;32:3-14. doi: 10.1016/j.neunet.2012.02.018. Epub 2012 Feb 14.
4
Spiking neural network model for memorizing sequences with forward and backward recall.
Biosystems. 2013 Jun;112(3):214-23. doi: 10.1016/j.biosystems.2013.03.018. Epub 2013 Apr 2.
5
Forced phase-locked states and information retrieval in a two-layer network of oscillatory neurons with directional connectivity.
Phys Rev E Stat Nonlin Soft Matter Phys. 2007 Sep;76(3 Pt 1):031912. doi: 10.1103/PhysRevE.76.031912. Epub 2007 Sep 12.
6
Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition.
Neural Netw. 2013 May;41:188-201. doi: 10.1016/j.neunet.2012.11.014. Epub 2012 Dec 20.
7
Sudoku associative memory.
Neural Netw. 2014 Sep;57:112-27. doi: 10.1016/j.neunet.2014.05.023. Epub 2014 Jun 10.
8
Memristors for Neuromorphic Circuits and Artificial Intelligence Applications.
Materials (Basel). 2020 Feb 20;13(4):938. doi: 10.3390/ma13040938.
9
MONETA: A Processing-In-Memory-Based Hardware Platform for the Hybrid Convolutional Spiking Neural Network With Online Learning.
Front Neurosci. 2022 Apr 11;16:775457. doi: 10.3389/fnins.2022.775457. eCollection 2022.
10
STDP provides the substrate for igniting synfire chains by spatiotemporal input patterns.
Neural Comput. 2008 Feb;20(2):415-35. doi: 10.1162/neco.2007.11-05-043.

Cited by

1
Biologically Inspired Spatial-Temporal Perceiving Strategies for Spiking Neural Network.
Biomimetics (Basel). 2025 Jan 14;10(1):48. doi: 10.3390/biomimetics10010048.
2
Spiking neural networks for physiological and speech signals: a review.
Biomed Eng Lett. 2024 Jun 25;14(5):943-954. doi: 10.1007/s13534-024-00404-0. eCollection 2024 Sep.
3
Circuit-based neuromodulation enhances delayed recall in amnestic mild cognitive impairment.
4
Deep learning.
J Neurol Neurosurg Psychiatry. 2024 Sep 17;95(10):902-911. doi: 10.1136/jnnp-2023-333152.
5
Anti-Disturbance of Scale-Free Spiking Neural Network against Impulse Noise.
Brain Sci. 2023 May 22;13(5):837. doi: 10.3390/brainsci13050837.
6
Pseudo-transistors for emerging neuromorphic electronics.
Sci Technol Adv Mater. 2023 Mar 20;24(1):2180286. doi: 10.1080/14686996.2023.2180286. eCollection 2023.
7
Entropic associative memory for manuscript symbols.
PLoS One. 2022 Aug 4;17(8):e0272386. doi: 10.1371/journal.pone.0272386. eCollection 2022.
8
Developing Intelligent Robots that Grasp Affordance.
Front Robot AI. 2022 Jul 5;9:951293. doi: 10.3389/frobt.2022.951293. eCollection 2022.
9
Spatio-Temporal Sequential Memory Model With Mini-Column Neural Network.
Front Neurosci. 2021 May 28;15:650430. doi: 10.3389/fnins.2021.650430. eCollection 2021.
10
An entropic associative memory.
Sci Rep. 2021 Mar 25;11(1):6948. doi: 10.1038/s41598-021-86270-7.
11
How Neuronal Noises Influence the Spiking Neural Networks's Cognitive Learning Process: A Preliminary Study.
Brain Sci. 2021 Jan 25;11(2):153. doi: 10.3390/brainsci11020153.

References

1
Implementing artificial neural networks through bionic construction.
PLoS One. 2019 Feb 22;14(2):e0212368. doi: 10.1371/journal.pone.0212368. eCollection 2019.
2
Mapping, Learning, Visualization, Classification, and Understanding of fMRI Data in the NeuCube Evolving Spatiotemporal Data Machine of Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2017 Apr;28(4):887-899. doi: 10.1109/TNNLS.2016.2612890. Epub 2016 Oct 6.
3
Deep learning.
Nature. 2015 May 28;521(7553):436-44. doi: 10.1038/nature14539.
4
Deep learning in neural networks: an overview.
Neural Netw. 2015 Jan;61:85-117. doi: 10.1016/j.neunet.2014.09.003. Epub 2014 Oct 13.
5
NeuCube: a spiking neural network architecture for mapping, learning and understanding of spatio-temporal brain data.
Neural Netw. 2014 Apr;52:62-76. doi: 10.1016/j.neunet.2014.01.006. Epub 2014 Jan 20.
6
Representation learning: a review and new perspectives.
IEEE Trans Pattern Anal Mach Intell. 2013 Aug;35(8):1798-828. doi: 10.1109/TPAMI.2013.50.
7
Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition.
Neural Netw. 2013 May;41:188-201. doi: 10.1016/j.neunet.2012.11.014. Epub 2012 Dec 20.
8
SWAT: a spiking neural network training algorithm for classification problems.
IEEE Trans Neural Netw. 2010 Nov;21(11):1817-30. doi: 10.1109/TNN.2010.2074212. Epub 2010 Sep 27.
9
Structural change and development in real and artificial neural networks.
Neural Netw. 1998 Jun;11(4):577-599. doi: 10.1016/s0893-6080(98)00033-1.
10
Long short-term memory.
Neural Comput. 1997 Nov 15;9(8):1735-80. doi: 10.1162/neco.1997.9.8.1735.