
Memory dynamics in attractor networks.

Author Information

Li Guoqi, Ramanathan Kiruthika, Ning Ning, Shi Luping, Wen Changyun

Affiliation Information

Centre for Brain Inspired Computing Research (CBICR), Department of Precision Instrument, Tsinghua University, Beijing 100084, China.

Department of Advanced Concepts and Nanotechnology (ACN), Data Storage Institute, A*STAR, 5 Engineer Drive 1, Singapore 117608.

Publication Information

Comput Intell Neurosci. 2015;2015:191745. doi: 10.1155/2015/191745. Epub 2015 Apr 19.

Abstract

Attractor networks, which can be represented by neurons and their synaptic connections, are widely believed to underlie biological memory systems and have been used extensively in recent years to model the storage and retrieval of memories. In this paper, we propose a new energy function that is nonnegative and attains zero only at the desired memory patterns. An attractor network is then designed based on the proposed energy function, and the desired memory patterns are shown to be stored as its stable equilibrium points. To retrieve a memory pattern, an initial stimulus is presented to the network, and the network state converges to one of the stable equilibrium points. Consequently, spurious points, that is, local maxima, saddle points, or other local minima corresponding to undesired memory patterns, are avoided. Simulation results show the effectiveness of the proposed method.
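The abstract describes the storage-and-retrieval mechanism only in words. As a rough illustration of the general idea, the sketch below uses an assumed toy energy E(x) = min_k ||x - p_k||^2 (our own choice for demonstration, not the energy function proposed in the paper): it is nonnegative, vanishes only at the stored patterns, and a state initialized at a noisy stimulus descends the energy until it settles at the nearest stored pattern.

```python
import numpy as np

# Illustrative sketch only -- this is NOT the energy function proposed in the
# paper (the abstract does not give it). We assume E(x) = min_k ||x - p_k||^2,
# which is nonnegative and equals zero exactly at the stored patterns p_k,
# and retrieve a memory by gradient descent from a noisy stimulus.

patterns = np.array([[1, -1,  1, -1,  1, -1,  1, -1],
                     [1,  1, -1, -1,  1,  1, -1, -1]], dtype=float)

def energy(x):
    """Nonnegative energy that vanishes only at a stored pattern."""
    return np.min(np.sum((patterns - x) ** 2, axis=1))

def retrieve(cue, lr=0.2, steps=200):
    """Follow the negative gradient of the assumed energy until the state settles."""
    x = np.asarray(cue, dtype=float).copy()
    for _ in range(steps):
        nearest = patterns[np.argmin(np.sum((patterns - x) ** 2, axis=1))]
        x += lr * (nearest - x)   # negative gradient of ||x - nearest||^2, up to a factor of 2
    return x

rng = np.random.default_rng(0)
noisy = patterns[0] + 0.6 * rng.standard_normal(8)   # corrupted stimulus
recalled = retrieve(noisy)
print(energy(noisy), energy(recalled))               # energy drops toward zero
```

Because this toy energy has no critical points other than the stored patterns themselves, the descent cannot be trapped in spurious attractors, which mirrors (under our assumed energy) the property the paper claims for its own construction.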


Figure: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/184c/4417571/fea0159344f1/CIN2015-191745.001.jpg
