Carlson Moses Büth, Kishor Acharya, Massimiliano Zanin
Institute for Cross-Disciplinary Physics and Complex Systems (IFISC), CSIC-UIB, 07122, Palma de Mallorca, Spain.
Sci Rep. 2025 Aug 11;15(1):29323. doi: 10.1038/s41598-025-14053-5.
Information theory, i.e. the mathematical analysis of information and of its processing, has become a tenet of modern science; yet, its use in real-world studies is usually hindered by its computational complexity, the lack of coherent software frameworks, and, as a consequence, low reproducibility. We here introduce infomeasure, an open-source Python package designed to provide robust tools for calculating a wide variety of information-theoretic measures, including entropies, mutual information, transfer entropy and divergences. It is designed for both discrete and continuous variables; implements state-of-the-art estimation techniques; and allows the calculation of local measure values, p-values and t-scores. By unifying these approaches under one consistent framework, infomeasure aims to mitigate common pitfalls, ensure reproducibility, and simplify the practical implementation of information-theoretic analyses. In this contribution, we describe the motivation and features of infomeasure and its validation against known analytical solutions, and we exemplify its utility in a case study involving the analysis of human brain time series.
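To make the discrete measures named above concrete, the following sketch implements plug-in (maximum-likelihood) estimators for Shannon entropy and mutual information in plain NumPy. This is an illustration of the underlying mathematics, not the infomeasure package's actual API; function names and the toy data are ours.

```python
import numpy as np

def shannon_entropy(x, base=2):
    """Plug-in Shannon entropy of a discrete sample, in bits by default."""
    _, counts = np.unique(np.asarray(x), return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)) / np.log(base))

def mutual_information(x, y, base=2):
    """Plug-in I(X;Y) = H(X) + H(Y) - H(X,Y) for discrete samples."""
    xy = np.stack([np.asarray(x), np.asarray(y)], axis=1)
    # Joint entropy from the empirical distribution over unique (x, y) pairs
    _, counts = np.unique(xy, axis=0, return_counts=True)
    p = counts / counts.sum()
    h_xy = float(-np.sum(p * np.log(p)) / np.log(base))
    return shannon_entropy(x, base) + shannon_entropy(y, base) - h_xy

x = np.array([0, 0, 1, 1])
print(shannon_entropy(x))            # 1.0 bit for a fair binary variable
print(mutual_information(x, x))      # 1.0 bit when the two variables coincide
```

Plug-in estimators like these are known to be biased for small samples, which is precisely why a package offering bias-corrected and continuous-variable estimators under one interface is useful.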