Bayesian reconstruction of memories stored in neural networks from their connectivity.

Affiliations

International School of Advanced Studies (SISSA), Trieste, Italy.

IdePHICS laboratory, Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland.

Publication Information

PLoS Comput Biol. 2023 Jan 30;19(1):e1010813. doi: 10.1371/journal.pcbi.1010813. eCollection 2023 Jan.

Abstract

The advent of comprehensive synaptic wiring diagrams of large neural circuits has created the field of connectomics and given rise to a number of open research questions. One such question is whether it is possible to reconstruct the information stored in a recurrent network of neurons, given its synaptic connectivity matrix. Here, we address this question by determining when solving such an inference problem is theoretically possible in specific attractor network models and by providing a practical algorithm to do so. The algorithm builds on ideas from statistical physics to perform approximate Bayesian inference and is amenable to exact analysis. We study its performance on three different models, compare the algorithm to standard algorithms such as PCA, and explore the limitations of reconstructing stored patterns from synaptic connectivity.
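To make the inference problem concrete, here is a minimal sketch of the kind of setting the abstract describes. The network size, the number of stored patterns, and the Hebbian outer-product storage rule are assumptions not stated in the abstract, and the spectral recovery step is only the PCA-style baseline the authors compare against, not their Bayesian algorithm.

```python
# Minimal sketch (assumed setup, not the paper's Bayesian algorithm):
# store random binary patterns in a Hopfield-style network with a Hebbian
# outer-product rule, then recover the stored subspace from the
# connectivity matrix with a PCA-style spectral baseline.
import numpy as np

rng = np.random.default_rng(0)
N, P = 500, 3                          # assumed network size and pattern count

# Ground-truth patterns xi^mu with entries in {-1, +1}
xi = rng.choice([-1.0, 1.0], size=(P, N))

# Hebbian connectivity: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# PCA-style baseline: the leading eigenvectors of J span the subspace
# of the stored patterns.
eigvals, eigvecs = np.linalg.eigh(J)   # eigenvalues returned in ascending order
top = eigvecs[:, -P:]                  # columns = top-P eigenvectors

# Fraction of each pattern's norm captured by the recovered subspace;
# values near 1 mean the pattern directions are recoverable from J.
coeffs = top.T @ xi.T                  # (P, P) projection coefficients
captured = np.linalg.norm(coeffs, axis=0) / np.linalg.norm(xi, axis=1)
print("fraction of each pattern captured:", captured)
```

In this idealized Hebbian case the stored subspace can be read off exactly from the leading eigenvectors; the abstract's question is when and how well such reconstruction remains possible in the attractor-network models the paper studies.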

