Correcting for the sampling bias problem in spike train information measures.

Author Information

Panzeri Stefano, Senatore Riccardo, Montemurro Marcelo A, Petersen Rasmus S

Affiliations

University of Manchester, Faculty of Life Sciences, The Mill, PO Box 88, Manchester M60 1QD, UK.

Publication Information

J Neurophysiol. 2007 Sep;98(3):1064-72. doi: 10.1152/jn.00559.2007. Epub 2007 Jul 5.

Abstract

Information Theory enables the quantification of how much information a neuronal response carries about external stimuli and is hence a natural analytic framework for studying neural coding. The main difficulty in its practical application to spike train analysis is that estimates of neuronal information from experimental data are prone to a systematic error (called "bias"). This bias is an inevitable consequence of the limited number of stimulus-response samples that it is possible to record in a real experiment. In this paper, we first explain the origin and the implications of the bias problem in spike train analysis. We then review and evaluate some recent general-purpose methods to correct for sampling bias: the Panzeri-Treves, Quadratic Extrapolation, Best Universal Bound, Nemenman-Shafee-Bialek procedures, and a recently proposed shuffling bias reduction procedure. Finally, we make practical recommendations for the accurate computation of information from spike trains. Our main recommendation is to estimate information using the shuffling bias reduction procedure in combination with one of the other four general purpose bias reduction procedures mentioned in the preceding text. This provides information estimates with acceptable variance and which are unbiased even when the number of trials per stimulus is as small as the number of possible discrete neuronal responses.
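To make the bias problem and two of the named corrections concrete, the sketch below computes a plug-in (naive) estimate of the stimulus-response mutual information I(S;R) from discretized spike-train responses, and then applies a Panzeri-Treves style first-order analytical correction and a Quadratic Extrapolation. This is not the authors' published toolbox: the function names, the assumption of equiprobable stimuli with equal trial counts, and the toy Poisson data are illustrative, and the PT correction shown uses simple counting of occupied response bins rather than the full Bayesian estimate of relevant bins.

```python
"""Minimal sketch: plug-in mutual information for discrete spike-train
responses, with first-order PT-style and Quadratic Extrapolation
bias corrections. Illustrative only; not the authors' implementation."""
import numpy as np


def entropy_bits(labels):
    """Plug-in entropy (bits) of a 1-D array of discrete response labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))


def plugin_mutual_information(responses):
    """Plug-in I(S;R) in bits.

    responses: dict mapping stimulus id -> 1-D array of discrete response
    labels (e.g. binned spike counts), one entry per trial.  Stimuli are
    assumed equiprobable with equal trial counts.
    """
    stimuli = list(responses.keys())
    pooled = np.concatenate([responses[s] for s in stimuli])
    h_r = entropy_bits(pooled)                                   # H(R)
    h_r_given_s = np.mean([entropy_bits(responses[s]) for s in stimuli])
    return h_r - h_r_given_s                                     # I = H(R) - H(R|S)


def pt_corrected_information(responses):
    """Panzeri-Treves style first-order correction.

    The leading-order bias of the plug-in estimate is approximately
    [sum_s (R_s - 1) - (R - 1)] / (2 N ln 2), where R_s and R count the
    occupied response bins per stimulus and overall, and N is the total
    number of trials.  Subtracting it gives the corrected estimate.
    """
    stimuli = list(responses.keys())
    n_total = sum(len(responses[s]) for s in stimuli)
    r_s = [len(np.unique(responses[s])) for s in stimuli]
    r = len(np.unique(np.concatenate([responses[s] for s in stimuli])))
    bias = (sum(rs - 1 for rs in r_s) - (r - 1)) / (2.0 * n_total * np.log(2))
    return plugin_mutual_information(responses) - bias


def qe_corrected_information(responses, seed=None):
    """Quadratic Extrapolation: estimate I on the full data and on random
    halves and quarters, fit I(N) = I_inf + a/N + b/N^2 in 1/N, and
    report the extrapolated (infinite-N) constant term I_inf."""
    rng = np.random.default_rng(seed)
    inv_n, i_est = [], []
    for frac in (1.0, 0.5, 0.25):
        sub = {s: rng.permutation(r)[: max(1, int(len(r) * frac))]
               for s, r in responses.items()}
        inv_n.append(1.0 / sum(len(v) for v in sub.values()))
        i_est.append(plugin_mutual_information(sub))
    coeffs = np.polyfit(inv_n, i_est, deg=2)   # exact quadratic through 3 points
    return coeffs[-1]                          # value at 1/N -> 0


if __name__ == "__main__":
    # Toy example: two stimuli, Poisson spike counts with different rates.
    rng = np.random.default_rng(0)
    data = {0: rng.poisson(2.0, size=64), 1: rng.poisson(6.0, size=64)}
    print("plug-in :", plugin_mutual_information(data))
    print("PT      :", pt_corrected_information(data))
    print("QE      :", qe_corrected_information(data, seed=0))
```

The abstract's main recommendation, combining the shuffling bias reduction procedure with one of these estimators, would additionally subtract a shuffled-response noise-entropy term; that step is omitted from this sketch for brevity.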
