A Hebbian/Anti-Hebbian Neural Network for Linear Subspace Learning: A Derivation from Multidimensional Scaling of Streaming Data.

Author Information

Pehlevan Cengiz, Hu Tao, Chklovskii Dmitri B

Affiliations

Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, and Simons Center for Analysis, Simons Foundation, New York, NY 10010, U.S.A.

Texas A&M University, College Station, TX 77843, U.S.A.

Publication Information

Neural Comput. 2015 Jul;27(7):1461-95. doi: 10.1162/NECO_a_00745. Epub 2015 May 14.

Abstract

Neural network models of early sensory processing typically reduce the dimensionality of streaming input data. Such networks learn the principal subspace, in the sense of principal component analysis, by adjusting synaptic weights according to activity-dependent learning rules. When derived from a principled cost function, these rules are nonlocal and hence biologically implausible. At the same time, biologically plausible local rules have been postulated rather than derived from a principled cost function. Here, to bridge this gap, we derive a biologically plausible network for subspace learning on streaming data by minimizing a principled cost function. In a departure from previous work, where cost was quantified by the representation, or reconstruction, error, we adopt a multidimensional scaling cost function for streaming data. The resulting algorithm relies only on biologically plausible Hebbian and anti-Hebbian local learning rules. In a stochastic setting, synaptic weights converge to a stationary state, which projects the input data onto the principal subspace. If the data are generated by a nonstationary distribution, the network can track the principal subspace. Thus, our result makes a step toward an algorithmic theory of neural computation.
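The network the abstract describes computes its outputs through recurrent dynamics and updates feedforward weights with a Hebbian rule and lateral weights with an anti-Hebbian rule, all using only locally available pre- and postsynaptic activity. A minimal sketch of such a network is below; the initialization, the closed-form fixed point of the recurrent dynamics, and the per-neuron learning-rate schedule are illustrative simplifications, not the paper's exact algorithm.

```python
import numpy as np

def hebbian_anti_hebbian_subspace(X, k):
    """Streaming subspace learning with Hebbian feedforward weights W and
    anti-Hebbian lateral weights M. A simplified sketch of a
    similarity-matching-style network, not the authors' exact update rules."""
    d = X.shape[1]
    rng = np.random.default_rng(0)
    W = rng.normal(scale=1.0 / np.sqrt(d), size=(k, d))  # feedforward weights
    M = np.zeros((k, k))           # lateral weights, zero diagonal
    Yhat = np.full(k, 1e-1)        # cumulative squared activity per neuron
    for x in X:
        # Output: fixed point of the recurrent dynamics y <- W x - M y,
        # i.e. (I + M) y = W x
        y = np.linalg.solve(np.eye(k) + M, W @ x)
        Yhat += y ** 2
        # Hebbian update: each synapse uses only its pre (x_j) and post (y_i)
        # activity, with an activity-dependent decay for normalization
        W += (np.outer(y, x) - (y ** 2)[:, None] * W) / Yhat[:, None]
        # Anti-Hebbian lateral update decorrelates the output channels
        M += (np.outer(y, y) - (y ** 2)[:, None] * M) / Yhat[:, None]
        np.fill_diagonal(M, 0.0)
    return W, M
```

After training on data whose variance is concentrated in a low-dimensional subspace, the row space of `W` approximates the principal subspace, and new inputs can be projected through the same fixed-point dynamics.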

