Cho Gyungmin, Kim Dohun
Department of Physics and Astronomy, and Institute of Applied Physics, Seoul National University, Seoul, 08826, South Korea.
Nat Commun. 2024 Aug 30;15(1):7552. doi: 10.1038/s41467-024-51932-3.
Advancements in the implementation of quantum hardware have enabled the acquisition of data that are intractable to emulate with classical computers. Integrating classical machine learning (ML) algorithms with these data holds the potential to unveil patterns that would otherwise remain hidden. Although this hybrid approach extends the class of efficiently solvable problems beyond what classical computers alone can address, it has so far been realized only for restricted problems because of the noise prevalent in current quantum computers. Here, we extend the applicability of the hybrid approach to problems of interest in many-body physics, such as predicting the properties of the ground state of a given Hamiltonian and classifying quantum phases. By performing experiments with various error-reducing procedures on superconducting quantum hardware with 127 qubits, we acquired refined data from the quantum computer, which enabled us to demonstrate the successful implementation of theoretically proposed classical ML algorithms for systems with up to 44 qubits. Our results verify the scalability and effectiveness of classical ML algorithms for processing quantum experimental data.
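To make the hybrid pipeline concrete, the sketch below illustrates its general shape only, not the authors' actual implementation: measurement-derived features (a stand-in for classical-shadow-style data from a quantum processor) are fed to a classical learner, here a kernel ridge regression, that predicts a ground-state property as a function of a Hamiltonian parameter. Every function name, dataset, and hyperparameter in the sketch is a synthetic placeholder chosen for illustration.

```python
import numpy as np

# Hypothetical illustration of the hybrid quantum-data + classical-ML pipeline:
# (1) a quantum device produces randomized-measurement ("classical shadow")
#     snapshots of ground states at several Hamiltonian parameters x,
# (2) a classical learner is trained on shadow-derived features to predict a
#     ground-state property f(x) at unseen parameters.
# All data below are synthetic placeholders; the real experiments used a
# 127-qubit superconducting processor and systems of up to 44 qubits.

rng = np.random.default_rng(0)
n_train, n_test, n_features = 60, 20, 32

def synthetic_shadow_features(x):
    # Stand-in for shadow-estimated local expectation values at parameter x.
    base = np.sin(np.outer(x, np.arange(1, n_features + 1)))
    noise = 0.05 * rng.standard_normal((len(x), n_features))  # shot-noise proxy
    return base + noise

def target_property(x):
    # Stand-in for the predicted ground-state property (e.g., a correlator).
    return np.tanh(3.0 * (x - 0.5))

x_train = rng.uniform(0.0, 1.0, n_train)
x_test = np.linspace(0.0, 1.0, n_test)
X_train = synthetic_shadow_features(x_train)
X_test = synthetic_shadow_features(x_test)
y_train = target_property(x_train)

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian (RBF) kernel between two sets of feature vectors.
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

# Kernel ridge regression: solve (K + lam*I) alpha = y, predict with k(x*, X) alpha.
lam = 1e-3
K = rbf_kernel(X_train, X_train)
alpha = np.linalg.solve(K + lam * np.eye(n_train), y_train)
y_pred = rbf_kernel(X_test, X_train) @ alpha

print("mean absolute prediction error:", np.abs(y_pred - target_property(x_test)).mean())
```

A kernel-based learner is a natural stand-in here because the rigorous sample-complexity guarantees for this class of prediction tasks were originally derived for kernel methods; the features and model used in the actual experiments may differ.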