Department of Mathematics, University of Nebraska-Lincoln, Lincoln, NE 68588, USA.
Neural Comput. 2013 Jul;25(7):1891-925. doi: 10.1162/NECO_a_00459.
Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction but also the representation of relationships between stimuli.
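To make the notion of a receptive field code concrete, the following is a minimal illustrative sketch (not taken from the paper): each neuron is assigned a hypothetical 1D receptive field interval, a codeword is the binary activity pattern a stimulus evokes, and the minimum Hamming distance of the resulting code bounds its error-correcting capability. The specific intervals and stimulus grid below are assumptions for illustration only.

```python
from itertools import combinations

# Hypothetical 1D receptive field (RF) intervals for 5 neurons
# covering a stimulus space [0, 10); chosen only for illustration.
rfs = [(0, 4), (2, 6), (4, 8), (6, 10), (1, 9)]

def codeword(stimulus):
    """Binary activity pattern: bit i is 1 iff the stimulus lies in neuron i's RF."""
    return tuple(1 if lo <= stimulus < hi else 0 for lo, hi in rfs)

# Sample the stimulus space on a grid to collect the distinct codewords
code = sorted({codeword(s / 2) for s in range(20)})

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

# The minimum pairwise distance governs how many bit flips can be corrected;
# a code with minimum distance d corrects up to floor((d - 1) / 2) errors.
d_min = min(hamming(u, v) for u, v in combinations(code, 2))
print(len(code), d_min)  # this toy RF code has minimum distance 1
```

Note that neighboring intervals overlap, so codewords of nearby stimuli differ in only one bit: the code reflects stimulus distances well but, with minimum distance 1, corrects no errors, which is the trade-off the abstract describes.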