Bayesian reconstruction of memories stored in neural networks from their connectivity.

Affiliations

International School of Advanced Studies (SISSA), Trieste, Italy.

IdePHICS laboratory, Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland.

Publication information

PLoS Comput Biol. 2023 Jan 30;19(1):e1010813. doi: 10.1371/journal.pcbi.1010813. eCollection 2023 Jan.

Abstract

The advent of comprehensive synaptic wiring diagrams of large neural circuits has created the field of connectomics and given rise to a number of open research questions. One such question is whether it is possible to reconstruct the information stored in a recurrent network of neurons, given its synaptic connectivity matrix. Here, we address this question by determining when solving such an inference problem is theoretically possible in specific attractor network models and by providing a practical algorithm to do so. The algorithm builds on ideas from statistical physics to perform approximate Bayesian inference and is amenable to exact analysis. We study its performance on three different models, compare the algorithm to standard algorithms such as PCA, and explore the limitations of reconstructing stored patterns from synaptic connectivity.
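
The abstract frames the inference problem (recover the stored patterns given a synaptic connectivity matrix of an attractor network) and mentions PCA as a standard baseline for comparison. The sketch below is only an illustration of that setup, not the paper's approximate Bayesian inference algorithm: it assumes a classical Hopfield/Hebbian storage rule, J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ, which is an assumption not stated in this abstract, and it uses the leading eigenvectors of J as the PCA-style reconstruction.

```python
import numpy as np

# Illustrative sketch only: Hebbian (Hopfield-style) storage is an assumption here,
# not necessarily the storage rule used in the paper's attractor network models.
rng = np.random.default_rng(0)
N, P = 500, 3                                # neurons and stored patterns (arbitrary choices)
xi = rng.choice([-1.0, 1.0], size=(P, N))    # "true" binary patterns to be recovered

# Hebbian connectivity matrix J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-connections.
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# PCA-style baseline mentioned in the abstract: the top-P eigenvectors of J
# approximately span the subspace of the stored patterns.
eigvals, eigvecs = np.linalg.eigh(J)         # eigenvalues returned in ascending order
estimates = np.sign(eigvecs[:, -P:].T)       # binarise the P leading eigenvectors

# Overlap |<estimate, xi>| / N near 1 means a pattern was recovered (up to sign and
# permutation); with P > 1 the eigenvectors mix patterns, so overlaps are only partial.
overlaps = np.abs(estimates @ xi.T) / N
print(np.round(overlaps, 2))
```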
