Kashif Muhammad, Shafique Muhammad
eBrain Lab, Division of Engineering, New York University Abu Dhabi, PO Box 129188, Abu Dhabi, United Arab Emirates.
Center for Quantum and Topological Systems, NYUAD Research Institute, New York University, Abu Dhabi, United Arab Emirates.
Sci Rep. 2025 Jul 1;15(1):21764. doi: 10.1038/s41598-025-06035-4.
In this paper, we explore methods to enhance the performance of Quanvolutional Neural Networks (QuNNs), a frequently used variant of Quantum Convolutional Neural Networks, by introducing trainable quanvolutional layers and by addressing the challenges associated with training multi-layered, or deep, QuNNs. Traditional QuNNs mostly rely on static (non-trainable) quanvolutional layers, which limits their feature-extraction capabilities. Our approach enables the training of these layers, significantly improving the scalability and learning potential of QuNNs. However, deep multi-layered QuNNs face difficulties in gradient-based optimization due to limited gradient flow across the layers of the network. To overcome this, we propose Residual Quanvolutional Neural Networks (ResQuNNs), which exploit residual learning by adding skip connections between quanvolutional layers. These residual blocks enhance gradient flow throughout the network, facilitating effective training of deep QuNNs and thus enabling deep learning in QuNNs. Moreover, we provide empirical evidence on the optimal placement of these residual blocks, demonstrating how strategic configurations improve gradient flow and lead to more efficient training. Our findings represent a significant advancement in quantum deep learning, opening new possibilities for both theoretical exploration and practical quantum computing applications.
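The core architectural idea of the abstract, a skip connection placed around a quanvolutional layer so that gradients can bypass it, can be sketched classically. In the sketch below, `toy_quanv` is a hypothetical stand-in for the paper's parameterized quantum filter (simulated here with a bounded classical nonlinearity); the function names, the 2x2 patch size, and the 1x1 residual filter are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def toy_quanv(patch, weights):
    # Stand-in for a trainable quanvolutional filter. In the paper this
    # would be a parameterized quantum circuit whose measurement yields a
    # scalar feature; here we simulate that with tanh (assumption).
    return np.tanh(weights @ patch.flatten())

def quanv_layer(image, weights, patch=2):
    # Slide a patch x patch window (stride = patch) over the input and
    # apply the filter to each window, producing a downsampled feature map.
    h, w = image.shape
    out = np.empty((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            out[i // patch, j // patch] = toy_quanv(
                image[i:i + patch, j:j + patch], weights
            )
    return out

def residual_quanv_block(feature_map, weights):
    # ResQuNN-style skip connection: the block output is the layer output
    # plus its unmodified input, which preserves a direct gradient path
    # around the quanvolutional layer during backpropagation.
    return quanv_layer(feature_map, weights, patch=1) + feature_map

# Illustrative usage on random data.
rng = np.random.default_rng(0)
img = rng.standard_normal((4, 4))
w1 = rng.standard_normal(4)          # weights for a 2x2 filter
x = quanv_layer(img, w1, patch=2)    # 4x4 input -> 2x2 feature map
w2 = rng.standard_normal(1)          # weights for a 1x1 residual filter
y = residual_quanv_block(x, w2)      # shape preserved: 2x2
```

The skip connection requires the layer inside the block to preserve the feature-map shape (here via a 1x1 filter with stride 1), mirroring how classical ResNets match dimensions across residual branches.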