Eleonora Grassucci, Aston Zhang, Danilo Comminiello
IEEE Trans Neural Netw Learn Syst. 2024 Jun;35(6):8293-8305. doi: 10.1109/TNNLS.2022.3226772. Epub 2024 Jun 3.
Hypercomplex neural networks have proven to reduce the overall number of parameters while ensuring valuable performance by leveraging the properties of Clifford algebras. Recently, hypercomplex linear layers have been further improved through efficient parameterized Kronecker products. In this article, we define the parameterization of hypercomplex convolutional layers and introduce the family of parameterized hypercomplex neural networks (PHNNs), which are lightweight and efficient large-scale models. Our method grasps the convolution rules and the filter organization directly from data, without requiring a rigidly predefined domain structure to follow. PHNNs are flexible to operate in any user-defined or tuned domain, from 1-D to n-D, regardless of whether the algebra rules are preset. Such malleability allows processing multidimensional inputs in their natural domain without appending further dimensions, as is done, instead, in quaternion neural networks (QNNs) for 3-D inputs such as color images. As a result, the proposed family of PHNNs operates with 1/n the free parameters of its analog in the real domain. We demonstrate the versatility of this approach across multiple application domains by performing experiments on various image and audio datasets, in which our method outperforms real- and quaternion-valued counterparts. Full code is available at: https://github.com/eleGAN23/HyperNets.
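The abstract describes building convolution kernels as sums of Kronecker products, with the algebra multiplication rules learned from data rather than fixed in advance. Below is a minimal PyTorch sketch of that idea, not the authors' implementation (see the linked HyperNets repository for that): the class name PHConv2d, the initialization scale, and the omission of bias, stride, and padding handling are all simplifying assumptions made here for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as Fn


class PHConv2d(nn.Module):
    """Sketch of a parameterized hypercomplex convolution (PHC) layer.

    The kernel is assembled as W = sum_{i=1..n} kron(A_i, F_i), where the
    n x n matrices A_i learn the algebra multiplication rules from data and
    the F_i are ordinary convolution filter blocks. With in_channels and
    out_channels divisible by n, the filter parameters number roughly 1/n
    of those in a real-valued Conv2d (the n^3 entries of A are negligible).
    """

    def __init__(self, n, in_channels, out_channels, kernel_size):
        super().__init__()
        assert in_channels % n == 0 and out_channels % n == 0
        self.n = n
        # A[i] is an n x n matrix of learnable algebra rules (assumed init).
        self.A = nn.Parameter(torch.randn(n, n, n))
        # filters[i] is a block of shape (out/n, in/n, k, k) (assumed init).
        self.filters = nn.Parameter(
            0.02 * torch.randn(n, out_channels // n, in_channels // n,
                               kernel_size, kernel_size)
        )

    def forward(self, x):
        n = self.n
        _, o, c, k, l = self.filters.shape
        # Block (p, q) of the full kernel is sum_i A[i, p, q] * filters[i],
        # i.e. the sum of Kronecker products over the channel dimensions.
        W = torch.einsum('ipq,iockl->poqckl', self.A, self.filters)
        W = W.reshape(n * o, n * c, k, l)
        return Fn.conv2d(x, W)
```

As a quick check of the shapes under these assumptions, with n = 4 a 4-channel input (e.g., an RGB image padded to four channels, as in QNNs) maps to 16 output channels while the filter bank holds a quarter of the parameters of the equivalent real Conv2d:

```python
x = torch.randn(8, 4, 32, 32)
layer = PHConv2d(n=4, in_channels=4, out_channels=16, kernel_size=3)
print(layer(x).shape)  # torch.Size([8, 16, 30, 30])
```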