Liam Paninski
Gatsby Computational Neuroscience Unit, University College London, London, WC1N 3AR, UK.
Neural Comput. 2005 Jul;17(7):1480-507. doi: 10.1162/0899766053723032.
We discuss an idea for collecting data in a relatively efficient manner. Our point of view is Bayesian and information-theoretic: on any given trial, we want to adaptively choose the input in such a way that the mutual information between the (unknown) state of the system and the (stochastic) output is maximal, given any prior information (including data collected on any previous trials). We prove a theorem that quantifies the effectiveness of this strategy and give a few illustrative examples comparing the performance of this adaptive technique to that of the more usual nonadaptive experimental design. In particular, we calculate the asymptotic efficiency of the information-maximization strategy and demonstrate that this method is, in a well-defined sense, never less efficient than the nonadaptive strategy, and is generically more efficient. For example, we are able to explicitly calculate the asymptotic relative efficiency of the staircase method widely employed in psychophysics research and to demonstrate the dependence of this efficiency on the form of the psychometric function underlying the output responses.
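The adaptive strategy described in the abstract can be sketched in a toy psychophysics setting. The following is a minimal illustration, not the paper's implementation: it assumes a logistic psychometric function with an unknown threshold on a discrete hypothesis grid, and on each trial chooses the stimulus that maximizes the mutual information between the threshold and the binary response, then updates the posterior by Bayes' rule. All names (`psychometric`, `infomax_stimulus`, the grid sizes, the slope) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

thetas = np.linspace(-3.0, 3.0, 61)   # hypothesis grid for the unknown threshold
xs = np.linspace(-3.0, 3.0, 61)       # admissible stimulus levels
posterior = np.full(thetas.size, 1.0 / thetas.size)  # uniform prior

def psychometric(x, theta, slope=1.0):
    # P(y = 1 | stimulus x, threshold theta): logistic psychometric function
    return 1.0 / (1.0 + np.exp(-(x - theta) / slope))

def bin_ent(p):
    # Binary entropy in bits, clipped for numerical safety
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))

def infomax_stimulus(post):
    # I(theta; y | x) = H(y | x) - E_theta[ H(y | theta, x) ]
    infos = []
    for x in xs:
        p1 = psychometric(x, thetas)   # P(y = 1 | theta, x) for each theta
        marg = np.sum(post * p1)       # marginal P(y = 1 | x) under the posterior
        infos.append(bin_ent(marg) - np.sum(post * bin_ent(p1)))
    return xs[int(np.argmax(infos))], float(max(infos))

true_theta = 0.7  # ground truth used only to simulate responses
for _ in range(40):
    x, info = infomax_stimulus(posterior)
    y = rng.random() < psychometric(x, true_theta)  # simulated observer response
    lik = psychometric(x, thetas) if y else 1.0 - psychometric(x, thetas)
    posterior = posterior * lik
    posterior /= posterior.sum()

estimate = float(np.sum(posterior * thetas))
```

Compared with a fixed staircase rule, the stimulus here is recomputed each trial from the full posterior, which is the sense in which the abstract's strategy adapts to all previously collected data; the mutual-information objective is always nonnegative, so an uninformative stimulus is never preferred over an informative one.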