

Lateral Connections Improve Generalizability of Learning in a Simple Neural Network.

Affiliation

Department of Pharmacology, University of Maryland, Baltimore, MD 21201, U.S.A.

Publication

Neural Comput. 2024 Mar 21;36(4):705-717. doi: 10.1162/neco_a_01640.

Abstract

To navigate the world around us, neural circuits rapidly adapt to their environment, learning generalizable strategies to decode information. When modeling these learning strategies, network models find the optimal solution to satisfy one task condition but fail when introduced to a novel task or even a different stimulus in the same space. In the experiments described in this letter, I investigate the role of lateral gap junctions in learning generalizable strategies to process information. Lateral gap junctions are formed by connexin proteins creating an open pore that allows for direct electrical signaling between two neurons. During neural development, the prevalence of gap junctions is high, and daughter cells that share similar tuning properties are more likely to be connected by these junctions. Gap junctions are highly plastic and are heavily pruned throughout development. I hypothesize that they mediate generalized learning by imprinting the weighting structure within a layer to avoid overfitting to one task condition. To test this hypothesis, I implemented a feedforward probabilistic neural network mimicking a cortical fast-spiking neuron circuit that is heavily involved in movement. Many of these cells are tuned to speeds, which I used as the input stimulus for the network to estimate. When training this network using a delta learning rule, both a laterally connected network and an unconnected network can estimate a single speed. However, when asking the network to estimate two or more speeds, alternated in training, an unconnected network either cannot learn speed or optimizes to a singular speed, while the laterally connected network learns the generalizable strategy and can estimate both speeds. These results suggest that lateral gap junctions between neurons enable generalized learning, which may help explain learning differences across the life span.
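The core mechanism the abstract describes — delta-rule learning in a layer of speed-tuned units, with lateral coupling diffusing weights across similarly tuned neighbors — can be sketched in a linear toy model. Everything concrete below (unit counts, Gaussian tuning curves, the coupling kernel width, the learning rate) is an assumption, and a deterministic linear readout will not reproduce the paper's probabilistic spiking-network results; the sketch only illustrates how a gap-junction-like coupling step can imprint a shared weight structure after each delta-rule update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: 20 input units with Gaussian speed tuning.
n_units = 20
preferred = np.linspace(0.0, 1.0, n_units)  # preferred speeds
sigma = 0.1                                 # tuning-curve width

def responses(speed):
    """Population response of the tuned units to a given speed."""
    return np.exp(-0.5 * ((speed - preferred) / sigma) ** 2)

# Lateral coupling kernel: units with similar tuning are more strongly
# coupled, standing in for gap junctions between similarly tuned cells.
coupling = np.exp(-0.5 * ((preferred[:, None] - preferred[None, :]) / 0.15) ** 2)
coupling /= coupling.sum(axis=1, keepdims=True)  # row-normalized averaging

def train(speeds, lateral=False, lr=0.05, epochs=2000):
    """Delta-rule training of a linear readout; target speeds alternate."""
    w = rng.normal(scale=0.01, size=n_units)
    for _ in range(epochs):
        s = speeds[rng.integers(len(speeds))]  # pick one of the target speeds
        r = responses(s)
        err = s - w @ r                        # delta-rule error term
        w += lr * err * r                      # delta-rule weight update
        if lateral:
            w = coupling @ w                   # diffuse weights across coupled units
    return w

def estimate(w, speed):
    """Decode the network's speed estimate from the trained readout."""
    return w @ responses(speed)
```

In this linear setting, two target patterns are far fewer constraints than 20 weights, so even the uncoupled readout can satisfy both; the failure mode reported in the letter arises in the probabilistic spiking circuit, not in this simplification. The coupling step is the point of interest: it projects the weights onto smooth profiles shared across similarly tuned units after every update.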

