Department of Electrical and Systems Engineering, Washington University in St. Louis, One Brookings Drive, Campus Box 1042, MO 63130, United States; Department of Neurobiology, Harvard Medical School, 220 Longwood Ave, Boston, MA 02115, United States.
Department of Electrical and Systems Engineering, Washington University in St. Louis, One Brookings Drive, Campus Box 1042, MO 63130, United States; Division of Biology and Biomedical Sciences, Washington University in St. Louis, One Brookings Drive, Campus Box 1042, MO 63130, United States.
Neural Netw. 2017 Oct;94:212-219. doi: 10.1016/j.neunet.2017.07.008. Epub 2017 Jul 22.
A long-standing and influential hypothesis in neural information processing is that early sensory networks adapt themselves to produce efficient codes of afferent inputs. Here, we show how a nonlinear recurrent network provides an optimal solution for the efficient coding of an afferent input and its history. We specifically consider the problem of producing lightweight codes, ones that minimize both ℓ1 and ℓ2 constraints on sparsity and energy, respectively. When embedded in a linear coding paradigm, this problem results in a non-smooth convex optimization problem. We employ a proximal gradient descent technique to develop the solution, showing that the optimal code is realized through a recurrent network endowed with a nonlinear soft-thresholding operator. The training of the network connection weights is readily achieved through gradient-based local learning. If such learning is assumed to occur on a slower time-scale than the (faster) recurrent dynamics, then the network as a whole converges to an optimal set of codes and weights via what is, in effect, an alternating minimization procedure. Our results show how the addition of thresholding nonlinearities to a recurrent network may enable the production of lightweight, history-sensitive encoding schemes.
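The coding step described above can be illustrated with a minimal sketch. The snippet below is not the paper's network; it is a standard proximal gradient (ISTA-style) iteration for the elastic-net objective 0.5‖x − Φa‖² + λ‖a‖₁ + (γ/2)‖a‖², where the ℓ1 term is handled by the soft-thresholding operator and the ℓ2 term by the gradient step. The dictionary `Phi`, the penalty weights `lam` and `gamma`, and the iteration count are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_encode(x, Phi, lam=0.1, gamma=0.01, n_iter=200):
    """Sparse, low-energy code for input x via proximal gradient descent:
    minimize 0.5*||x - Phi a||^2 + lam*||a||_1 + 0.5*gamma*||a||^2.
    The ell_2 (energy) term is smooth and enters the gradient; the ell_1
    (sparsity) term is handled by the soft-thresholding nonlinearity."""
    a = np.zeros(Phi.shape[1])
    # Step size from the Lipschitz constant of the smooth part.
    L = np.linalg.norm(Phi, 2) ** 2 + gamma
    eta = 1.0 / L
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ a - x) + gamma * a   # gradient of smooth terms
        a = soft_threshold(a - eta * grad, eta * lam)  # proximal (shrinkage) step
    return a
```

Each iteration has the form of one pass through a recurrent linear map followed by a pointwise soft-thresholding nonlinearity, which is the structural correspondence the abstract refers to; learning `Phi` on a slower time-scale (e.g., a gradient step on the reconstruction error between coding passes) would yield the alternating minimization the paper describes.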