
Recurrent Information Optimization with Local, Metaplastic Synaptic Dynamics.

Author Information

Liu Sensen, Ching ShiNung

Affiliation

Department of Electrical and Systems Engineering, Washington University, St. Louis, MO 63130, U.S.A.

Publication Information

Neural Comput. 2017 Sep;29(9):2528-2552. doi: 10.1162/NECO_a_00993. Epub 2017 Jun 9.

Abstract

We consider the problem of optimizing information-theoretic quantities in recurrent networks via synaptic learning. In contrast to feedforward networks, the recurrence presents a key challenge insofar as an optimal learning rule must aggregate the joint distribution of the whole network. This challenge, in particular, makes a local policy (i.e., one that depends on only pairwise interactions) difficult. Here, we report a local metaplastic learning rule that performs approximate optimization by estimating whole-network statistics through the use of several slow, nested dynamical variables. These dynamics provide the rule with both anti-Hebbian and Hebbian components, thus allowing for decorrelating and correlating learning regimes that can occur when either is favorable for optimality. We demonstrate the performance of the synthesized rule in comparison to classical BCM dynamics and use the networks to conduct history-dependent tasks that highlight the advantages of recurrence. Finally, we show the consistency of the resultant learned networks with notions of criticality, including balanced ratios of excitation and inhibition.
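For orientation, the sketch below illustrates the classical BCM baseline mentioned in the abstract: a local rule whose Hebbian or anti-Hebbian character is gated by a slow, sliding per-unit threshold (the simplest metaplastic variable). This is not the authors' learning rule; the network size, nonlinearity, time constants, and input statistics are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch: classical BCM plasticity in a small recurrent rate network.
# Illustrates the baseline the abstract compares against, not the paper's rule.
# All parameter values and the input model are assumptions for illustration.

rng = np.random.default_rng(0)
N = 20                    # number of units (assumed)
dt = 1e-3                 # integration step (s)
tau_y = 0.02              # rate time constant (s)
tau_theta = 1.0           # slow sliding-threshold time constant (s), >> tau_y
eta = 1e-3                # learning rate

W = 0.1 * rng.standard_normal((N, N))   # recurrent weights
np.fill_diagonal(W, 0.0)                # no self-connections
y = np.zeros(N)           # firing rates
theta = np.ones(N)        # per-unit sliding thresholds (metaplastic variable)

for step in range(50_000):
    x = rng.random(N)                       # external drive (assumed i.i.d.)
    # leaky rate dynamics with recurrent feedback
    y += dt / tau_y * (-y + np.tanh(W @ y + x))
    # BCM update: Hebbian when y_i > theta_i, anti-Hebbian when y_i < theta_i
    W += eta * np.outer(y * (y - theta), y)
    np.fill_diagonal(W, 0.0)
    # slow threshold tracks the recent second moment of each unit's rate
    theta += dt / tau_theta * (y**2 - theta)
```

The single slow variable `theta` is what makes classical BCM metaplastic; the rule reported in the paper instead uses several nested slow variables so that a purely local update can approximate whole-network statistics.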
