Daniel Chicharro, Giuseppe Pica, Stefano Panzeri
Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA.
Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto (TN) 38068, Italy.
Entropy (Basel). 2018 Mar 5;20(3):169. doi: 10.3390/e20030169.
Understanding how different information sources together transmit information is crucial in many domains. For example, understanding the neural code requires characterizing how different neurons contribute unique, redundant, or synergistic pieces of information about sensory or behavioral variables. Williams and Beer (2010) proposed a partial information decomposition (PID) that separates the mutual information that a set of sources contains about a set of targets into nonnegative terms interpretable as these pieces. Quantifying redundancy requires assigning an identity to different pieces of information, so as to assess when information is common across sources. Harder et al. (2013) proposed an identity axiom that imposes conditions necessary to quantify qualitatively common information. However, Bertschinger et al. (2012) showed that, in a counterexample with deterministic target-source dependencies, the identity axiom is incompatible with ensuring PID nonnegativity. Here, we systematically study the consequences of information-identity criteria that assign identity based on associations between target and source variables resulting from deterministic dependencies. We show how these criteria relate to the identity axiom and to previously proposed redundancy measures, and we characterize how they lead to negative PID terms. This constitutes a further step toward explicitly addressing the role of information identity in the quantification of redundancy. The implications for studying neural coding are discussed.
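To make the decomposition concrete, the following is a minimal sketch (not the authors' code) of the Williams–Beer PID for a target S and two sources A1, A2: redundancy is computed as I_min, the expected minimum specific information, and the unique and synergistic terms follow from the PID lattice identities Unq(Ai) = I(S;Ai) − Red and Syn = I(S;A1,A2) − I(S;A1) − I(S;A2) + Red. Function and variable names here are illustrative choices; the joint distribution is represented as a dict mapping (s, a1, a2) tuples to probabilities.

```python
import math
from collections import defaultdict

def marginal(p, idx):
    """Marginalize the joint p[(s, a1, a2)] onto the coordinates in idx."""
    m = defaultdict(float)
    for k, v in p.items():
        m[tuple(k[i] for i in idx)] += v
    return m

def mutual_info(p, xi, yi):
    """I(X;Y) in bits, where X/Y are the coordinate tuples xi/yi of the joint p."""
    pxy = marginal(p, xi + yi)
    px, py = marginal(p, xi), marginal(p, yi)
    return sum(v * math.log2(v / (px[k[:len(xi)]] * py[k[len(xi):]]))
               for k, v in pxy.items() if v > 0)

def i_min(p):
    """Williams-Beer redundancy I_min(S; A1, A2); coordinate 0 is the target S."""
    ps = marginal(p, (0,))
    red = 0.0
    for s, p_s in ps.items():
        specific = []
        for ai in (1, 2):
            psa = marginal(p, (0, ai))
            pa = marginal(p, (ai,))
            # Specific information I(S=s; A_i) = sum_a p(a|s) log2[p(s|a)/p(s)]
            si = sum((psa[(s[0], a[0])] / p_s)
                     * math.log2(psa[(s[0], a[0])] / (p_s * pa[a]))
                     for a in pa if psa.get((s[0], a[0]), 0) > 0)
            specific.append(si)
        red += p_s * min(specific)  # worst source at each target state
    return red

def pid(p):
    """Redundant, unique, and synergistic terms of the two-source PID."""
    red = i_min(p)
    i1 = mutual_info(p, (1,), (0,))
    i2 = mutual_info(p, (2,), (0,))
    i12 = mutual_info(p, (1, 2), (0,))
    return {"red": red, "unq1": i1 - red, "unq2": i2 - red,
            "syn": i12 - i1 - i2 + red}

# XOR target with uniform binary sources: the textbook purely synergistic case.
p_xor = {(a1 ^ a2, a1, a2): 0.25 for a1 in (0, 1) for a2 in (0, 1)}
```

For the XOR example, each source alone carries zero information about S, so redundancy and both unique terms vanish and the full bit of I(S; A1, A2) appears as synergy, illustrating why nonnegativity of these terms matters in the interpretation discussed above.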