"Unlearning" increases the storage capacity of content addressable memories.

Author Information

Kleinfeld D, Pendergraft DB

Publication Information

Biophys J. 1987 Jan;51(1):47-53. doi: 10.1016/S0006-3495(87)83310-6.

Abstract

The storage and retrieval of information in networks of biological neurons can be modeled by certain types of content addressable memories (CAMs). We demonstrate numerically that the amount of information that can be stored in such CAMs is substantially increased by an unlearning algorithm. Mechanisms for the increase in capacity are identified and illustrated in terms of an energy function that describes the convergence properties of the network.
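
The ingredients the abstract mentions — a Hopfield-style CAM with Hebbian (outer-product) storage, relaxation to minima of an energy function, and an unlearning step that weakens whatever attractors the network falls into from random starting states — can be sketched numerically. The code below is a minimal illustration of that general scheme, not the authors' actual procedure: the network size, the unlearning increment `eps`, the number of unlearning trials, and the 10% probe noise are arbitrary assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns):
    """Store +/-1 patterns with the outer-product (Hebbian) rule."""
    n_patterns, n = patterns.shape
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)   # no self-connections
    return w

def relax(w, state, max_sweeps=50):
    """Asynchronous updates until a fixed point (a local energy minimum)."""
    state = state.copy()
    n = state.size
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):
            new = 1 if w[i] @ state >= 0 else -1
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:
            break
    return state

def energy(w, state):
    """Hopfield energy E = -1/2 s.W.s; asynchronous updates never increase it."""
    return -0.5 * state @ w @ state

def unlearn(w, n, trials=200, eps=0.01):
    """Unlearning: relax from random states and weaken the attractors found."""
    w = w.copy()
    for _ in range(trials):
        s = relax(w, rng.choice([-1, 1], size=n))
        w -= (eps / n) * np.outer(s, s)
        np.fill_diagonal(w, 0.0)
    return w

# Demo: recall quality before and after unlearning (illustrative parameters).
n, p = 100, 15                       # 15 patterns in a 100-neuron network
patterns = rng.choice([-1, 1], size=(p, n))
w0 = hebbian_weights(patterns)
w1 = unlearn(w0, n)

def mean_overlap(w):
    """Flip 10% of the bits of each stored pattern, relax, and measure recall."""
    overlaps = []
    for xi in patterns:
        probe = xi * np.where(rng.random(n) < 0.1, -1, 1)
        overlaps.append(abs(relax(w, probe) @ xi) / n)
    return float(np.mean(overlaps))

print("recall overlap before unlearning:", round(mean_overlap(w0), 3))
print("recall overlap after  unlearning:", round(mean_overlap(w1), 3))
print("E(pattern 0) before/after unlearning:",
      round(energy(w0, patterns[0]), 2), round(energy(w1, patterns[0]), 2))
```

The energy function referred to in the abstract is the quantity computed by `energy` above: asynchronous updates can only lower it, so stored patterns and spurious mixture states both sit at local minima, and unlearning can be read as reshaping this landscape so that spurious minima are weakened relative to the stored ones.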
