Hertz J, Prügel-Bennett A
Nordita, Copenhagen, Denmark.
Int J Neural Syst. 1996 Sep;7(4):445-50. doi: 10.1142/s0129065796000427.
We develop a model of cortical coding of stimuli by the sequences of activation patterns that they ignite in an initially random network. Hebbian learning then stabilizes these sequences, making them attractors of the dynamics. There is a competition between the capacity of the network and the stability of the sequences: for a small stability parameter epsilon (the strength of the mean stabilizing postsynaptic potential, PSP, in the neurons of a learned sequence), the capacity is proportional to 1/epsilon². For epsilon of the order of, or less than, the PSPs of the untrained network, the capacity exceeds that for sequences learned from tabula rasa.
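A minimal numerical sketch of the mechanism the abstract describes, not the authors' exact model or analysis: a random stimulus pattern ignites a sequence of activation patterns in an initially random network, an asymmetric Hebbian term of strength epsilon is then added to reinforce each transition, and the trained couplings are compared with the untrained ones when cued with a corrupted version of the first pattern. The network size, sequence length, value of epsilon, and noise level below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
N, L, eps = 400, 6, 0.5   # neurons, sequence length, stability parameter (illustrative values)

# Initially random (untrained) couplings; the resulting PSPs onto each neuron are O(1)
J_rand = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))

# Sequence of activation patterns ignited by a random stimulus under the untrained dynamics
xi = [rng.choice([-1.0, 1.0], size=N)]
for _ in range(L - 1):
    xi.append(np.sign(J_rand @ xi[-1]))
xi = np.array(xi)

# Asymmetric Hebbian term of strength eps reinforcing each transition xi[mu] -> xi[mu+1]
J_hebb = sum(np.outer(xi[mu + 1], xi[mu]) for mu in range(L - 1)) / N
J_trained = J_rand + eps * J_hebb

def overlaps(W, s0):
    # Iterate the deterministic dynamics s -> sign(W s) and record the overlap
    # with the intended pattern at each step of the sequence.
    s, m = s0.copy(), []
    for t in range(1, L):
        s = np.sign(W @ s)
        m.append(float(s @ xi[t]) / N)
    return m

# Cue both networks with a corrupted first pattern (10% of neurons flipped)
flip = rng.random(N) < 0.10
cue = np.where(flip, -xi[0], xi[0])

print("untrained:", ["%.2f" % m for m in overlaps(J_rand, cue)])
print("trained:  ", ["%.2f" % m for m in overlaps(J_trained, cue)])

In this toy setting the untrained network's overlaps with the ignited sequence decay step by step, while the Hebbian term pulls the corrupted cue back onto the learned sequence, illustrating the stabilization described above; the capacity versus stability trade-off itself requires the authors' analysis.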