Computer Science Department, University at Albany, State University of New York, Albany, NY 12222, USA.
Neural Comput. 2011 Nov;23(11):2942-73. doi: 10.1162/NECO_a_00197. Epub 2011 Aug 18.
Efficient coding transforms that reduce or remove statistical dependencies in natural sensory signals are important for both biology and engineering. In recent years, divisive normalization (DN) has been advocated as a simple and effective nonlinear efficient coding transform. In this work, we first elaborate on the theoretical justification for DN as an efficient coding transform. Specifically, we use the multivariate t model to capture several important statistical properties of natural sensory signals and show that DN approximates the optimal transforms that eliminate statistical dependencies in that model. Second, we show that several forms of DN used in the literature are equivalent in their effects as efficient coding transforms. Third, we provide a quantitative evaluation of the overall dependency-reduction performance of DN for both multivariate t models and natural sensory signals. Finally, we find that when the input dimension is low, the DN transform actually increases statistical dependencies in the multivariate t model and in natural sensory signals. This implies that to be an effective efficient coding transform, DN has to pool over a sufficiently large number of inputs.
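For illustration, a minimal sketch of one common form of divisive normalization: each input is divided by a pooled measure of the squared activity of all inputs. This is a generic textbook form, not necessarily the exact variant analyzed in the paper; the semisaturation constant `sigma` and the uniform pooling weights are assumptions of this sketch.

```python
import numpy as np

def divisive_normalization(x, sigma=1.0, w=None):
    """Generic divisive normalization (illustrative sketch).

    y_i = x_i / sqrt(sigma^2 + sum_j w_j * x_j^2)

    x     : 1-D array of input responses (e.g., filter outputs)
    sigma : semisaturation constant stabilizing the divisor
    w     : pooling weights; uniform if not given
    """
    x = np.asarray(x, dtype=float)
    if w is None:
        w = np.ones_like(x)  # uniform pooling over all inputs (assumption)
    pooled = np.sqrt(sigma**2 + np.sum(w * x**2))
    return x / pooled
```

Note that the divisor pools over every input component; as the abstract argues, DN reduces dependencies only when this pool covers a sufficiently large number of inputs.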