Howard Hughes Medical Institute, Janelia Farm Research Campus, Ashburn, VA 20147, USA.
Neural Comput. 2012 Nov;24(11):2852-72. doi: 10.1162/NECO_a_00353. Epub 2012 Aug 24.
Computing sparse redundant representations is an important problem in both applied mathematics and neuroscience. In many applications, this problem must be solved in an energy-efficient way. Here, we propose a hybrid distributed algorithm (HDA), which solves this problem on a network of simple nodes communicating via low-bandwidth channels. HDA nodes perform both gradient-descent-like steps on analog internal variables and coordinate-descent-like steps via quantized external variables communicated to each other. Interestingly, its operation is equivalent to that of a network of integrate-and-fire neurons, suggesting that HDA may serve as a model of neural computation. We show that the numerical performance of HDA is on par with that of existing algorithms. In the asymptotic regime, the representation error of HDA decays with time, t, as 1/t. HDA is stable against time-varying noise; specifically, the representation error decays as 1/√t for gaussian white noise.
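The abstract's hybrid scheme can be illustrated with a small simulation. The sketch below is an assumption-laden toy version, not the paper's exact formulation: each node integrates an analog membrane variable under a feedforward drive, and when that variable crosses a threshold it emits a quantized "spike" that inhibits the other nodes through the dictionary's Gram matrix. The running spike counts, normalized by elapsed time, serve as the sparse representation, so transient early spikes contribute an error that shrinks as 1/t, consistent with the decay rate stated above. The dictionary `A`, threshold `lam`, spike quantum `delta`, and step size `dt` are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse-coding problem: x = A @ s_true with a 5-sparse s_true.
m, n = 32, 64
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)                # unit-norm dictionary columns
s_true = np.zeros(n)
s_true[rng.choice(n, 5, replace=False)] = rng.uniform(1.0, 2.0, 5)
x = A @ s_true

b = A.T @ x                                   # feedforward drive to each node
Gram = A.T @ A                                # recurrent couplings (diagonal = self-reset)

lam = 0.3       # firing threshold (plays the role of a sparsity penalty)
delta = 0.1     # quantum carried by each spike
dt = 0.01       # integration step
T = 20_000      # number of steps, i.e. total time T * dt

u = np.zeros(n)          # analog internal (membrane-like) variables
counts = np.zeros(n)     # quantized external variables: cumulative spike counts

for step in range(T):
    u += dt * b                               # gradient-like analog integration
    fired = u > lam                           # discrete, coordinate-descent-like event
    # Each spike broadcasts a quantum delta; the Gram diagonal resets the
    # firing node itself, off-diagonal terms inhibit correlated nodes.
    u -= Gram @ (fired * delta)
    counts += fired

# Time-averaged spike counts give the sparse representation estimate.
s_hat = delta * counts / (T * dt)
rel_err = np.linalg.norm(x - A @ s_hat) / np.linalg.norm(x)
```

At the (approximate) steady state, a node firing at rate r_i carries the coefficient estimate ŝ_i = δ·r_i, while nodes whose net drive is non-positive fall silent, which is what makes the representation sparse; the reconstruction error in this toy run settles to a few percent of the signal norm.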