
Stochastic consolidation of lifelong memory.

Affiliations

Center for Brain Science, Harvard University, Cambridge, USA.

Harvard Medical School, Boston, USA.

Publication Information

Sci Rep. 2022 Jul 30;12(1):13107. doi: 10.1038/s41598-022-16407-9.

Abstract

Humans have the remarkable ability to continually store new memories, while maintaining old memories for a lifetime. How the brain avoids catastrophic forgetting of memories due to interference between encoded memories is an open problem in computational neuroscience. Here we present a model for continual learning in a recurrent neural network combining Hebbian learning, synaptic decay and a novel memory consolidation mechanism: memories undergo stochastic rehearsals with rates proportional to the memory's basin of attraction, causing self-amplified consolidation. This mechanism gives rise to memory lifetimes that extend much longer than the synaptic decay time, and retrieval probability of memories that gracefully decays with their age. The number of retrievable memories is proportional to a power of the number of neurons. Perturbations to the circuit model cause temporally-graded retrograde and anterograde deficits, mimicking observed memory impairments following neurological trauma.
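
The mechanism the abstract describes lends itself to a compact simulation. Below is a minimal sketch, assuming a Hopfield-style binary attractor network as a stand-in for the paper's recurrent circuit; the parameter values (N, T, DECAY, FLIP, PROBES) and the Monte Carlo basin-size estimator basin_estimate are illustrative assumptions, not the authors' implementation. It combines the three ingredients named above: one-shot Hebbian encoding, multiplicative synaptic decay, and stochastic rehearsal at a rate proportional to each memory's estimated basin of attraction.

```python
# Illustrative sketch of stochastic consolidation in a Hopfield-style
# network. All parameter values and the basin estimator are assumptions
# made for this toy example, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)

N = 64         # number of neurons (illustrative)
T = 40         # encoding steps, one new memory per step (illustrative)
ETA = 1.0 / N  # Hebbian learning rate
DECAY = 0.98   # per-step multiplicative synaptic decay (illustrative)
FLIP = 0.15    # fraction of bits corrupted in each probe cue (illustrative)
PROBES = 8     # noisy cues sampled to estimate a basin (illustrative)

def recall(W, cue, steps=15):
    """Run the deterministic attractor dynamics from a cue."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0
    return s

def basin_estimate(W, pattern):
    """Fraction of noisy cues that converge back to the pattern --
    a crude Monte Carlo proxy for its basin of attraction."""
    hits = 0
    for _ in range(PROBES):
        cue = pattern * np.where(rng.random(N) < FLIP, -1.0, 1.0)
        if np.all(recall(W, cue) == pattern):
            hits += 1
    return hits / PROBES

W = np.zeros((N, N))
memories = []
for t in range(T):
    # 1. Encode a new random memory with a one-shot Hebbian update.
    xi = rng.choice([-1.0, 1.0], size=N)
    memories.append(xi)
    W += ETA * np.outer(xi, xi)
    np.fill_diagonal(W, 0.0)

    # 2. Synaptic decay: every trace fades unless re-consolidated.
    W *= DECAY

    # 3. Stochastic rehearsal: each memory is rehearsed with probability
    #    proportional to its current basin of attraction, so retrievable
    #    memories re-strengthen themselves (self-amplified consolidation)
    #    while weak ones fade with the synaptic decay.
    for m in memories:
        if rng.random() < basin_estimate(W, m):
            W += ETA * np.outer(m, m)
    np.fill_diagonal(W, 0.0)

retrievable = sum(np.all(recall(W, m) == m) for m in memories)
print(f"{retrievable}/{len(memories)} memories retrievable after {T} steps")
```

Because a rehearsal strengthens exactly the memories that are currently easy to retrieve, consolidation in this sketch is self-amplifying: retrieval probability falls off gradually with a memory's age instead of collapsing as soon as its synaptic trace decays, consistent with the graceful forgetting the abstract reports.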

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2dc9/9339009/00625774870a/41598_2022_16407_Fig1_HTML.jpg
