Hurst entropy: A method to determine predictability in a binary series based on a fractal-related process.

Author Information

Ferraz Mariana Sacrini Ayres, Kihara Alexandre Hiroaki

Affiliation

Centro de Matemática, Computação e Cognição (CMCC), Universidade Federal do ABC (UFABC), São Bernardo do Campo, São Paulo 09606-045, Brasil.

Publication Information

Phys Rev E. 2019 Jun;99(6-1):062115. doi: 10.1103/PhysRevE.99.062115.

Abstract

Shannon's concept of information is related to predictability. In a binary series, the information content depends on the frequency of 0's and 1's, that is, on how often each symbol is expected to occur. However, information entropy does not account for the bias in randomness introduced by autocorrelation. In fact, a binary time series can carry both short- and long-term memory related to the sequential distribution of 0's and 1's. Although the Hurst exponent measures the range of this autocorrelation, a mathematical connection between the information entropy of a series and its autocorrelation has been lacking. To fill this gap, we combined numerical simulations with an analytical approach to determine how information entropy changes with the frequency of 0's and 1's and with the Hurst exponent, and thereby how predictability depends on both parameters. Our findings are useful in any field in which binary time series are applied, from neuroscience to econophysics.
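
As a minimal illustration of the two quantities the abstract relates (not the authors' method), the Python sketch below computes the Shannon entropy of a binary sequence from its frequency of 1's and estimates the Hurst exponent with a basic rescaled-range (R/S) analysis. The function names shannon_entropy and hurst_rs, the choice of window sizes, and the use of NumPy are assumptions made for this example.

```python
import numpy as np


def shannon_entropy(bits):
    """Shannon entropy (bits per symbol) of a binary sequence, from the frequency of 1's."""
    p = float(np.mean(bits))
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)


def hurst_rs(x, min_window=8, n_scales=20):
    """Rough Hurst-exponent estimate via rescaled-range (R/S) analysis (illustrative only)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    windows = np.unique(
        np.logspace(np.log10(min_window), np.log10(n // 2), n_scales).astype(int)
    )
    scales, rs_means = [], []
    for w in windows:
        rs = []
        for start in range(0, n - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())  # cumulative deviation from the segment mean
            r = dev.max() - dev.min()          # range of the cumulative deviation
            s = seg.std()                      # standard deviation of the segment
            if s > 0:
                rs.append(r / s)
        if rs:
            scales.append(w)
            rs_means.append(np.mean(rs))
    # The slope of log(R/S) versus log(window size) approximates the Hurst exponent.
    slope, _ = np.polyfit(np.log(scales), np.log(rs_means), 1)
    return slope


if __name__ == "__main__":
    # Uncorrelated binary series with p = 0.5: entropy ~ 1 bit/symbol, Hurst ~ 0.5.
    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, size=4096)
    print(f"entropy = {shannon_entropy(bits):.3f} bits/symbol")
    print(f"Hurst   = {hurst_rs(bits):.3f}")
```

For an uncorrelated sequence with equal frequencies of 0's and 1's, this should give an entropy near 1 bit per symbol and a Hurst exponent near 0.5; a Hurst exponent above 0.5 would indicate long-range persistence that the entropy of the symbol frequencies alone does not reveal, which is the gap the paper addresses.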
