Dirac-equation signal processing: Physics boosts topological machine learning.

Author information

Runyue Wang, Yu Tian, Pietro Liò, Ginestra Bianconi

Affiliations

Centre for Complex Systems, School of Mathematical Sciences, Queen Mary University of London, London E1 4NS, United Kingdom.

Nordita, KTH Royal Institute of Technology and Stockholm University, Stockholm SE-106 91, Sweden.

Publication information

PNAS Nexus. 2025 May 2;4(5):pgaf139. doi: 10.1093/pnasnexus/pgaf139. eCollection 2025 May.

Abstract

Topological signals are variables or features associated with both the nodes and the edges of a network. Recently, in the context of topological machine learning, great attention has been devoted to the processing of such topological signals. Most previous topological signal processing algorithms treat node and edge signals separately and work under the hypothesis that the true signal is smooth and/or well approximated by a harmonic eigenvector of the higher-order Laplacian, a hypothesis that may be violated in practice. Here, we propose Dirac-equation signal processing, a framework for efficiently reconstructing true signals on nodes and edges by processing them jointly, even when they are neither smooth nor harmonic. The proposed physics-inspired algorithm is based on the spectral properties of the topological Dirac operator and leverages the mathematical structure of the topological Dirac equation to boost the performance of the signal processing algorithm. We discuss how the relativistic dispersion relation obeyed by the topological Dirac equation can be used to assess the quality of the signal reconstruction. Finally, we demonstrate the improved performance of the algorithm with respect to previous algorithms. Specifically, we show that Dirac-equation signal processing remains efficient when the true signal is a nontrivial linear combination of more than one eigenstate of the Dirac equation, as generally occurs for real signals.
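The central object named in the abstract, the topological Dirac operator, has a compact concrete form. The sketch below, in Python/NumPy, is an illustration rather than the authors' published algorithm: it assumes the standard construction D = [[0, B1], [B1^T, 0]] from the node-edge incidence matrix B1, so that D squared contains the graph Laplacian B1 B1^T and the lower Hodge Laplacian B1^T B1 on its diagonal blocks, giving the paired +/- eigenvalues behind the dispersion relation E^2 = mu. The toy graph, the noise level, the number of retained modes k, and the way the reference energy E0 is picked are all illustrative assumptions.

# Minimal sketch (not the paper's exact algorithm): build the topological
# Dirac operator of a small graph and denoise a joint node+edge signal by
# projecting it onto the Dirac eigenmodes with energies nearest an assumed E0.
import numpy as np

# Toy graph: a triangle plus one pendant node; edges as ordered pairs (i, j).
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n_nodes, n_edges = 4, len(edges)

# B1: node-to-edge incidence (boundary) matrix, -1 at the tail, +1 at the head.
B1 = np.zeros((n_nodes, n_edges))
for e, (i, j) in enumerate(edges):
    B1[i, e], B1[j, e] = -1.0, 1.0

# Topological Dirac operator: off-diagonal blocks couple node and edge signals.
# D @ D is block-diagonal with the graph Laplacian B1 @ B1.T and the lower
# Hodge Laplacian B1.T @ B1, so eigenvalues come in +/- pairs with E^2 = mu.
D = np.block([[np.zeros((n_nodes, n_nodes)), B1],
              [B1.T, np.zeros((n_edges, n_edges))]])
E, V = np.linalg.eigh(D)

# Joint topological signal: for this toy example the true signal is assumed to
# be a single Dirac eigenstate, observed with additive Gaussian noise.
rng = np.random.default_rng(0)
s_true = V[:, -1]
s_noisy = s_true + 0.3 * rng.standard_normal(n_nodes + n_edges)

# Band projection: keep the k eigenmodes whose energies are nearest to an
# estimated E0 (here taken as the largest eigenvalue; in practice E0 would
# have to be inferred from the data).
E0, k = E[-1], 1
keep = np.argsort(np.abs(E - E0))[:k]
s_hat = V[:, keep] @ (V[:, keep].T @ s_noisy)

print("relative error:", np.linalg.norm(s_hat - s_true) / np.linalg.norm(s_true))

Because the projection acts on the concatenated node-edge vector, the node and edge components are reconstructed jointly rather than filtered by separate Laplacians, which is the structural point the abstract emphasizes. The paper's algorithm goes further, using the relativistic dispersion relation to assess the quality of the reconstruction, which this sketch does not attempt.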
