Jacob A. Zavatone-Veth, Cengiz Pehlevan
Department of Physics and Center for Brain Science, Harvard University, Cambridge, MA 02138, U.S.A.
Center for Brain Science and John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138, U.S.A.
Neural Comput. 2022 Apr 15;34(5):1136-1142. doi: 10.1162/neco_a_01494.
In this short note, we make explicit the connection between work on the storage capacity problem in wide two-layer treelike neural networks and the rapidly growing body of literature on kernel limits of wide neural networks. Concretely, we observe that the "effective order parameter" studied in the statistical mechanics literature is exactly equivalent to the infinite-width neural network Gaussian process (NNGP) kernel. This correspondence connects the expressivity and trainability of wide two-layer neural networks.
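As a hedged illustration of the kernel the abstract refers to (this sketch is not from the note itself): for a two-layer network with hidden weights drawn i.i.d. from a standard Gaussian, the infinite-width NNGP kernel is K(x, y) = E_{w~N(0,I)}[φ(wᵀx) φ(wᵀy)]. For a ReLU activation φ this expectation has a known closed form, the order-1 arc-cosine kernel of Cho & Saul (2009), which a simple Monte Carlo estimate over random hidden weights should reproduce. The ReLU choice and all variable names below are illustrative assumptions.

```python
import numpy as np


def relu(z):
    """ReLU activation, assumed here purely for illustration."""
    return np.maximum(z, 0.0)


def nngp_relu_closed_form(x, y):
    """Closed-form NNGP kernel for ReLU: half the order-1 arc-cosine
    kernel of Cho & Saul (2009), i.e. E_{w~N(0,I)}[relu(w.x) relu(w.y)]."""
    nx, ny = np.linalg.norm(x), np.linalg.norm(y)
    cos_t = np.clip(x @ y / (nx * ny), -1.0, 1.0)
    theta = np.arccos(cos_t)
    return nx * ny * (np.sin(theta) + (np.pi - theta) * cos_t) / (2 * np.pi)


def nngp_relu_monte_carlo(x, y, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the same expectation: average the product of
    hidden-unit activations over random Gaussian weight vectors w ~ N(0, I)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_samples, x.shape[0]))  # each row is one w
    return float(np.mean(relu(W @ x) * relu(W @ y)))


# Example inputs (arbitrary illustrative vectors).
x = np.array([1.0, 0.5, -0.3])
y = np.array([0.2, -1.0, 0.7])

exact = nngp_relu_closed_form(x, y)
estimate = nngp_relu_monte_carlo(x, y)
```

At large hidden width the empirical average converges to the closed form, which is the sense in which the "effective order parameter" of the statistical mechanics analysis coincides with the NNGP kernel in the infinite-width limit.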