Electrical Engineering, College of Engineering, Qatar University, Qatar.
Department of Computing Sciences, Tampere University, Finland.
Neural Netw. 2021 Aug;140:294-308. doi: 10.1016/j.neunet.2021.02.028. Epub 2021 Mar 17.
Operational Neural Networks (ONNs) have recently been proposed to address the well-known limitations and drawbacks of conventional Convolutional Neural Networks (CNNs), such as the network homogeneity that follows from using only the linear neuron model. ONNs are heterogeneous networks with a generalized neuron model. However, the operator search method in ONNs is not only computationally demanding, but the network heterogeneity it achieves is also limited, since the same set of operators is used for all neurons in each layer. Moreover, the performance of ONNs depends directly on the operator set library used, which introduces a risk of performance degradation, especially when the optimal operator set required for a particular task is missing from the library. To address these issues and achieve the utmost level of heterogeneity, boosting network diversity along with computational efficiency, in this study we propose Self-organized ONNs (Self-ONNs) with generative neurons that can adapt (optimize) the nodal operator of each connection during the training process. Moreover, this ability removes the need for a fixed operator set library and for the prior operator search within the library to find the best possible set of operators. We further formulate the training method to back-propagate the error through the operational layers of Self-ONNs. Experimental results over four challenging problems demonstrate the superior learning capability and computational efficiency of Self-ONNs over conventional ONNs and CNNs.
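To make the generative-neuron idea concrete, the sketch below models a single 1-D generative neuron whose nodal operator is approximated by a truncated power series: each kernel tap contributes a learnable polynomial of the input, so the effective operator is shaped by training rather than selected from a fixed operator library. This is a minimal illustrative sketch, not the authors' implementation; the function name and the plain NumPy formulation are assumptions for exposition.

```python
import numpy as np

def generative_neuron_1d(x, weights, bias=0.0):
    """Illustrative forward pass of a generative neuron (sketch).

    The nodal operator is approximated by a truncated power series:
    each connection contributes sum_q w_q * x**q, so the learned
    coefficients w_q shape the operator per kernel tap instead of
    picking one operator from a fixed library.

    x       : 1-D input signal, shape (n,)
    weights : array of shape (Q, k) -- one k-tap kernel per power q
    bias    : scalar bias
    """
    Q, k = weights.shape
    n_out = len(x) - k + 1
    y = np.full(n_out, bias, dtype=float)
    for q in range(1, Q + 1):            # powers x, x**2, ..., x**Q
        xq = x ** q
        for i in range(n_out):           # valid (no-padding) correlation
            y[i] += np.dot(weights[q - 1], xq[i:i + k])
    return y

x = np.array([1.0, 2.0, 3.0, 4.0])
w_linear = np.array([[0.5, 0.5, 0.0]])            # Q = 1: plain convolution
w_poly   = np.array([[0.5, 0.5, 0.0],
                     [0.1, 0.0, 0.0]])            # Q = 2: adds an x**2 term
print(generative_neuron_1d(x, w_linear))  # -> [1.5 2.5]
print(generative_neuron_1d(x, w_poly))    # -> [1.6 2.9]
```

With `Q = 1` the neuron reduces exactly to an ordinary convolutional neuron, which illustrates why CNNs (and ONNs with a library restricted to the linear operator) are special cases of this formulation.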