Ferraz Mariana Sacrini Ayres, Kihara Alexandre Hiroaki
Centro de Matemática, Computação e Cognição (CMCC), Universidade Federal do ABC (UFABC), São Bernardo do Campo, São Paulo 09606-045, Brazil.
Phys Rev E. 2019 Jun;99(6-1):062115. doi: 10.1103/PhysRevE.99.062115.
Shannon's concept of information is related to predictability. In a binary series, the information content depends on the frequency of 0's and 1's, that is, on how often each symbol is expected to occur. However, information entropy does not account for biases in randomness arising from autocorrelation. In fact, a binary time series can carry both short- and long-term memories related to the sequential distribution of 0's and 1's. Although the Hurst exponent measures the range of autocorrelation, there has been no mathematical connection between information entropy and the autocorrelation present in the series. To fill this important gap, we combined numerical simulations and an analytical approach to determine how information entropy changes with the frequency of 0's and 1's and with the Hurst exponent. Indeed, we were able to determine how predictability depends on both parameters. Our findings should be useful in the many fields where binary time series are applied, ranging from neuroscience to econophysics.
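The contrast at the heart of the abstract can be sketched numerically. The following minimal Python illustration is our own sketch, not the authors' code; the function names and the choice of a rescaled-range (R/S) estimator are assumptions made for illustration. It shows that the binary Shannon entropy H(p) = -p log2(p) - (1-p) log2(1-p) depends only on the frequency p of 1's, while a Hurst exponent estimated by R/S analysis distinguishes a memoryless series from a long-memory one even when both have p ≈ 0.5 and therefore nearly identical entropy.

import numpy as np

def shannon_entropy(x):
    # Binary Shannon entropy: depends only on the fraction p of 1's.
    p = float(np.mean(x))
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)

def hurst_rescaled_range(x, min_window=8):
    # Classic R/S estimate: slope of log(R/S) versus log(window size).
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs = [], []
    size = min_window
    while size <= n // 2:
        ratios = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviation from the mean
            s = chunk.std()
            if s > 0.0:
                ratios.append((dev.max() - dev.min()) / s)  # range / std dev
        if ratios:
            sizes.append(size)
            rs.append(np.mean(ratios))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(0)
iid = rng.integers(0, 2, 4096)                     # memoryless coin flips
walk = np.cumsum(rng.standard_normal(4096))
persistent = (walk > np.median(walk)).astype(int)  # p = 0.5, but long-memory ordering

for name, series in [("iid", iid), ("persistent", persistent)]:
    print(f"{name}: entropy = {shannon_entropy(series):.3f}, "
          f"Hurst (R/S) = {hurst_rescaled_range(series):.2f}")

Both series have entropy near 1 bit because p ≈ 0.5 in each, yet the i.i.d. series should yield an R/S slope near 0.5, while the median-thresholded random walk, whose ordering carries long-range memory, yields a noticeably larger estimate despite the same symbol frequencies.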