Gibson, Jerry D.
Department of Electrical and Computer Engineering, University of California, Santa Barbara, CA 93106-9560, USA.
Entropy (Basel). 2020 May 30;22(6):608. doi: 10.3390/e22060608.
In many applications, intelligent agents need to identify any structure or apparent randomness in an environment and respond appropriately. We use relative entropy to separate and quantify the presence of both linear and nonlinear redundancy in a sequence, and we introduce the new quantities of total mutual information gain and incremental mutual information gain. We illustrate how these new quantities can be used to analyze and characterize the structure and apparent randomness of purely autoregressive sequences and of speech signals with long- and short-term linear redundancies. The mutual information gain is shown to be an important new tool for capturing and quantifying learning in sequence modeling and analysis.
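As a rough illustration only, not the paper's exact formulation: for a Gaussian autoregressive sequence, the conditional entropy given the past k samples is determined by the order-k linear prediction error variance, so an incremental information gain can be sketched as the order-by-order drop in that conditional entropy. The function names and the AR(1) test signal below are my own illustrative choices.

```python
import numpy as np

def autocorr(x, maxlag):
    # Biased sample autocorrelation r[0..maxlag] of a zero-mean-adjusted signal.
    n = len(x)
    x = x - x.mean()
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(maxlag + 1)])

def prediction_error_variances(r, order):
    # Levinson-Durbin recursion: returns sigma2[0..order], the error variance
    # of the best linear predictor of each order (sigma2[0] is the variance).
    sigma2 = [r[0]]
    a = np.zeros(order + 1)
    for k in range(1, order + 1):
        acc = r[k] - sum(a[j] * r[k - j] for j in range(1, k))
        refl = acc / sigma2[-1]
        a_new = a.copy()
        a_new[k] = refl
        for j in range(1, k):
            a_new[j] = a[j] - refl * a[k - j]
        a = a_new
        sigma2.append(sigma2[-1] * (1.0 - refl ** 2))
    return np.array(sigma2)

# Synthetic AR(1) sequence x[t] = 0.9 x[t-1] + w[t]: all redundancy is linear
# and concentrated at lag 1, so only the first gain should be large.
rng = np.random.default_rng(0)
n, a1 = 100_000, 0.9
x = np.zeros(n)
for t in range(1, n):
    x[t] = a1 * x[t - 1] + rng.standard_normal()

r = autocorr(x, 5)
sigma2 = prediction_error_variances(r, 5)
# Incremental gain (nats): reduction in Gaussian conditional entropy
# 0.5*log(2*pi*e*sigma2) when the predictor order grows by one.
inc_gain = 0.5 * np.log(sigma2[:-1] / sigma2[1:])
```

For this AR(1) example the first increment is large (roughly 0.5 log of the ratio of signal variance to innovation variance) and the later increments are near zero, matching the intuition that one past sample captures all the linear redundancy.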