Azami Hamed, Escudero Javier
School of Engineering, Institute for Digital Communications, The University of Edinburgh, Edinburgh EH9 3FB, UK.
Entropy (Basel). 2018 Feb 22;20(2):138. doi: 10.3390/e20020138.
The evaluation of complexity in univariate signals has attracted considerable attention in recent years. This is often done within the framework of Multiscale Entropy, which entails two basic steps: coarse-graining, to consider multiple temporal scales, and evaluation of irregularity at each of those scales with entropy estimators. Recent developments in the field have proposed modifications to this approach to facilitate the analysis of short time series. However, the role of downsampling in the classical coarse-graining process and its relationship to alternative filtering techniques have not been systematically explored yet. Here, we assess the impact of coarse-graining in multiscale entropy estimations based on both Sample Entropy and Dispersion Entropy. We compare the classical moving-average approach with low-pass Butterworth filtering, both with and without downsampling, and with empirical mode decomposition in Intrinsic Multiscale Entropy, on selected synthetic data and two real physiological datasets. The results show that, when the sampling frequency is low or high, downsampling respectively decreases or increases the entropy values. Our results suggest that, when dealing with long signals and relatively low levels of noise, the refined composite method makes little difference to the quality of the entropy estimation while incurring considerable additional computational cost. We also find that downsampling within the coarse-graining procedure may not be required to quantify the complexity of signals, especially short ones. Overall, we expect these results to contribute to the ongoing discussion about the development of stable, fast and noise-robust multiscale entropy techniques suited to either short or long recordings.
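
The two scale-extraction strategies compared in the abstract can be summarised in a few lines of code. The following is a minimal sketch, not the authors' implementation: it contrasts classical coarse-graining (a moving average over non-overlapping windows, i.e. averaging followed by downsampling by the scale factor) with zero-phase low-pass Butterworth filtering, with or without the downsampling step. It assumes NumPy and SciPy; the function names, cutoff choice and filter order are illustrative. An entropy estimator such as Sample Entropy or Dispersion Entropy would then be applied to each scaled series.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def coarse_grain(x, scale):
        # Classical coarse-graining: average non-overlapping windows of length
        # `scale` (a moving average followed by downsampling by `scale`).
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def butterworth_scale(x, scale, fs=1.0, order=6, downsample=False):
        # Alternative scaling: zero-phase low-pass Butterworth filter with an
        # illustrative cutoff of fs/(2*scale), optionally followed by
        # downsampling (keeping every `scale`-th sample).
        if scale == 1:
            return np.asarray(x, dtype=float)
        cutoff = 0.5 * fs / scale
        b, a = butter(order, cutoff, btype='low', fs=fs)
        y = filtfilt(b, a, x)            # forward-backward (zero-phase) filtering
        return y[::scale] if downsample else y

    # Usage on a white-noise test signal; an entropy estimator (e.g. SampEn or
    # DispEn) would be computed on each of the scaled series below.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(2000)
    for scale in (2, 5, 10):
        cg = coarse_grain(x, scale)              # moving average + downsampling
        bw = butterworth_scale(x, scale)         # filtering without downsampling
        print(scale, len(cg), len(bw))

Note that the two versions differ in the length of the scaled series: coarse-graining shortens the signal by the scale factor, whereas filtering without downsampling preserves its length, which matters for short recordings.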