Effective neuronal learning with ineffective Hebbian learning rules.

Author information

Chechik G, Meilijson I, Ruppin E

Affiliations

Center for Neural Computation, Hebrew University, Jerusalem, Israel, and School of Mathematical Sciences, Tel-Aviv University, Tel Aviv, 69978, Israel.

Publication information

Neural Comput. 2001 Apr;13(4):817-40. doi: 10.1162/089976601300014367.

Abstract

In this article we revisit the classical neuroscience paradigm of Hebbian learning. We find that it is difficult to achieve effective associative memory storage by Hebbian synaptic learning, since it requires network-level information at the synaptic level or sparse coding level. Effective learning can yet be achieved even with nonsparse patterns by a neuronal process that maintains a zero sum of the incoming synaptic efficacies. This weight correction improves the memory capacity of associative networks from an essentially bounded one to a memory capacity that scales linearly with network size. It also enables the effective storage of patterns with multiple levels of activity within a single network. Such neuronal weight correction can be successfully carried out by activity-dependent homeostasis of the neuron's synaptic efficacies, which was recently observed in cortical tissue. Thus, our findings suggest that associative learning by Hebbian synaptic learning should be accompanied by continuous remodeling of neuronally driven regulatory processes in the brain.
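
The weight correction described in the abstract can be illustrated with a minimal sketch. The following Python/NumPy example stores binary patterns with a plain Hebbian outer-product rule in an associative network and then applies a neuronal correction that shifts each neuron's incoming synaptic efficacies so they sum to zero. The network size, number of patterns, coding level, and variable names are illustrative assumptions, not values from the paper, and the simple subtractive step stands in for the activity-dependent homeostatic process the authors discuss.

```python
# Minimal sketch (not the authors' code): Hebbian storage in an associative
# network followed by a neuronal zero-sum correction of incoming weights.
# Network size, pattern count, and coding level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 200        # assumed network size
n_patterns = 40        # assumed number of stored memories
coding_level = 0.3     # assumed (nonsparse) fraction of active neurons

# Random binary memory patterns with the chosen coding level.
patterns = (rng.random((n_patterns, n_neurons)) < coding_level).astype(float)

# Plain Hebbian (outer-product) learning: W[i, j] accumulates the
# coactivation of presynaptic neuron j and postsynaptic neuron i.
W = patterns.T @ patterns
np.fill_diagonal(W, 0.0)

# Neuronal weight correction: for each postsynaptic neuron (row of W),
# subtract the mean of its incoming efficacies so the row sums to zero.
W_corrected = W - W.sum(axis=1, keepdims=True) / (n_neurons - 1)
np.fill_diagonal(W_corrected, 0.0)

# After correction, every neuron's incoming efficacies sum to ~0
# (up to floating-point error).
print(np.abs(W_corrected.sum(axis=1)).max())
```

The correction uses only quantities local to each neuron (the sum of its own incoming weights), which is the point of the abstract: no network-level information is needed at the synapse to obtain effective storage.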
