Kirdin Alexander, Sidorov Sergey, Zolotykh Nikolai
Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University, 603022 Nizhni Novgorod, Russia.
Institute for Computational Modelling, Russian Academy of Sciences, Siberian Branch, 660036 Krasnoyarsk, Russia.
Entropy (Basel). 2022 Nov 10;24(11):1635. doi: 10.3390/e24111635.
Rosenblatt's first theorem about the omnipotence of shallow networks states that elementary perceptrons can solve any classification problem if there are no discrepancies in the training set. Minsky and Papert considered elementary perceptrons with restrictions on the neural inputs: a bounded number of connections or a relatively small diameter of the receptive field for each neuron in the hidden layer. They proved that under these constraints, an elementary perceptron cannot solve some problems, such as the connectivity of input images or the parity of pixels in them. In this note, we demonstrated Rosenblatt's first theorem at work, showed how an elementary perceptron can solve a version of the travel maze problem, and analysed the complexity of that solution. We also constructed a deep network algorithm for the same problem; it is much more efficient. The shallow network uses an exponentially large number of neurons in the hidden layer (Rosenblatt's A-elements), whereas for the deep network, second-order polynomial complexity is sufficient. We demonstrated that, for the same complex problem, the deep network can be much smaller, and we revealed a heuristic behind this effect.
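The construction behind Rosenblatt's first theorem can be sketched in a few lines. The following is a minimal illustration (not the authors' code, and not their travel maze problem): an elementary perceptron with a fixed hidden layer of threshold A-elements and a single trainable R-element. Allocating one A-element per input pattern makes any consistent labelling linearly separable in the hidden feature space, which is exactly why the hidden layer grows exponentially with input size; the example uses 3-bit parity, one of the problems Minsky and Papert showed to be hard under locality constraints.

```python
# Sketch of an elementary perceptron: fixed hidden A-elements (threshold
# units) followed by one trainable threshold output (R-element).
# One A-element per binary pattern -- the "omnipotent" but exponentially
# large construction behind Rosenblatt's first theorem.

from itertools import product

def make_a_elements(n_bits):
    """One A-element per binary pattern: unit i fires iff input == pattern i."""
    patterns = list(product([0, 1], repeat=n_bits))
    def features(x):
        return [1 if tuple(x) == p else 0 for p in patterns]
    return features

def train_r_element(samples, features, epochs=50, lr=1.0):
    """Perceptron learning rule applied on top of the fixed A-element layer."""
    n = len(features(samples[0][0]))
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            f = features(x)
            pred = 1 if sum(wi * fi for wi, fi in zip(w, f)) + b > 0 else 0
            err = y - pred
            if err:
                w = [wi + lr * err * fi for wi, fi in zip(w, f)]
                b += lr * err
    return lambda x: 1 if sum(
        wi * fi for wi, fi in zip(w, features(x))) + b > 0 else 0

# 3-bit parity: hard for perceptrons with local receptive fields, but
# trivially separable once every pattern has its own A-element (2**3 = 8).
data = [(x, sum(x) % 2) for x in product([0, 1], repeat=3)]
clf = train_r_element(data, make_a_elements(3))
assert all(clf(x) == y for x, y in data)
```

The hidden layer here has 2^n units for n input bits, mirroring the abstract's point: the shallow solution exists for any discrepancy-free training set, but its A-element count is exponential, whereas a deep architecture can solve the same task with polynomially many units.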