Image Processing Laboratory, Parc Cientific, University of Valencia, Valencia, Spain.
Neural Netw. 2022 Feb;146:85-97. doi: 10.1016/j.neunet.2021.11.016. Epub 2021 Nov 17.
Shannon's entropy, or an extension of it, can be used to quantify information transmission between or among variables. Mutual information is the pairwise quantity that captures nonlinear relationships between variables and is more robust than linear correlation methods. Beyond mutual information, two generalizations are defined for multivariate distributions: interaction information (co-information) and total correlation (multi-mutual information). Compared with mutual information, interaction information and total correlation are underutilized and poorly studied in applied neuroscience research, and they have not been used explicitly to quantify information flow between brain regions. This article aims to clarify the distinctions among mutual information, interaction information, and total correlation in a neuroscience context. Additionally, we propose a novel method for determining the interaction information among three variables from total correlation and conditional mutual information, and we show how to apply it properly in practical situations. We supply both simulation experiments and real neural data to estimate functional connectivity in the brain with these three higher-order information-theoretic approaches. We find that interaction information and total correlation are both robust in capturing redundant information among multivariate variables, and that they can capture both well-known and yet-to-be-discovered functional brain connections.
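For orientation, the sketch below illustrates the standard entropy-based definitions the abstract refers to and the identity that recovers interaction information from total correlation and conditional mutual information. It is a minimal plug-in (histogram) estimator written for this summary, not the authors' estimator: the function names, the fixed binning, and the toy signals are illustrative assumptions, and the sign convention shown is the co-information one (II = I(X;Y) - I(X;Y|Z)).

```python
import numpy as np

def entropy(*vars_, bins=8):
    """Plug-in (histogram) joint entropy in bits for 1-D signals."""
    joint, _ = np.histogramdd(np.column_stack(vars_), bins=bins)
    p = joint / joint.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=8):
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return entropy(x, bins=bins) + entropy(y, bins=bins) - entropy(x, y, bins=bins)

def total_correlation(x, y, z, bins=8):
    # TC(X,Y,Z) = H(X) + H(Y) + H(Z) - H(X,Y,Z)
    return (entropy(x, bins=bins) + entropy(y, bins=bins) + entropy(z, bins=bins)
            - entropy(x, y, z, bins=bins))

def interaction_information(x, y, z, bins=8):
    # Co-information convention: II(X;Y;Z) = I(X;Y) - I(X;Y|Z), where the
    # conditional mutual information is obtained from total correlation via
    # I(X;Y|Z) = TC(X,Y,Z) - I(X;Z) - I(Y;Z).
    cond_mi_xy_given_z = (total_correlation(x, y, z, bins=bins)
                          - mutual_information(x, z, bins=bins)
                          - mutual_information(y, z, bins=bins))
    return mutual_information(x, y, bins=bins) - cond_mi_xy_given_z

# Toy example: three noisy copies of one latent signal (stand-ins for
# regional time series with shared, redundant information).
rng = np.random.default_rng(0)
s = rng.standard_normal(5000)
x = s + 0.5 * rng.standard_normal(5000)
y = s + 0.5 * rng.standard_normal(5000)
z = s + 0.5 * rng.standard_normal(5000)
print(mutual_information(x, y), total_correlation(x, y, z), interaction_information(x, y, z))
```

Under this convention a positive interaction information indicates redundancy (Z accounts for part of what X and Y share), which is the regime the simulated example above produces.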