

Partial information decomposition as a unified approach to the specification of neural goal functions.

Authors

Wibral Michael, Priesemann Viola, Kay Jim W, Lizier Joseph T, Phillips William A

Affiliations

MEG Unit, Brain Imaging Center, Goethe University, Heinrich Hoffmann Straße 10, 60528 Frankfurt am Main, Germany.

Department of Non-linear Dynamics, Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany.

Publication

Brain Cogn. 2017 Mar;112:25-38. doi: 10.1016/j.bandc.2015.09.004. Epub 2015 Oct 21.

DOI: 10.1016/j.bandc.2015.09.004
PMID: 26475739
Abstract

In many neural systems anatomical motifs are present repeatedly, but despite their structural similarity they can serve very different tasks. A prime example for such a motif is the canonical microcircuit of six-layered neo-cortex, which is repeated across cortical areas, and is involved in a number of different tasks (e.g. sensory, cognitive, or motor tasks). This observation has spawned interest in finding a common underlying principle, a 'goal function', of information processing implemented in this structure. By definition such a goal function, if universal, cannot be cast in processing-domain specific language (e.g. 'edge filtering', 'working memory'). Thus, to formulate such a principle, we have to use a domain-independent framework. Information theory offers such a framework. However, while the classical framework of information theory focuses on the relation between one input and one output (Shannon's mutual information), we argue that neural information processing crucially depends on the combination of multiple inputs to create the output of a processor. To account for this, we use a very recent extension of Shannon Information theory, called partial information decomposition (PID). PID allows to quantify the information that several inputs provide individually (unique information), redundantly (shared information) or only jointly (synergistic information) about the output. First, we review the framework of PID. Then we apply it to reevaluate and analyze several earlier proposals of information theoretic neural goal functions (predictive coding, infomax and coherent infomax, efficient coding). We find that PID allows to compare these goal functions in a common framework, and also provides a versatile approach to design new goal functions from first principles. Building on this, we design and analyze a novel goal function, called 'coding with synergy', which builds on combining external input and prior knowledge in a synergistic manner. We suggest that this novel goal function may be highly useful in neural information processing.
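The abstract's three-way split of multi-input information can be illustrated numerically. The sketch below is a minimal worked example, not the paper's own implementation: it assumes the original Williams-Beer redundancy measure I_min (one of several candidate PID measures) and decomposes the one bit that two uniform binary inputs carry about their XOR into unique, shared, and synergistic parts. All function names here are illustrative.

```python
import math
from collections import Counter

# XOR example: Y = X1 ^ X2 with uniform binary inputs.
# Each input alone carries no information about Y; together they carry 1 bit.
samples = [(x1, x2, x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]
x1s = [s[0] for s in samples]
x2s = [s[1] for s in samples]
ys = [s[2] for s in samples]

def probs(values):
    """Empirical probability mass function over the given observations."""
    n = len(values)
    return {v: c / n for v, c in Counter(values).items()}

def mutual_info(xs, ys):
    """Shannon mutual information I(X;Y) in bits from paired observations."""
    pxy = probs(list(zip(xs, ys)))
    px, py = probs(xs), probs(ys)
    return sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())

def specific_info(xs, ys, yval):
    """Specific information that X provides about the outcome Y = yval."""
    pxy = probs(list(zip(xs, ys)))
    px, py = probs(xs), probs(ys)
    total = 0.0
    for x in px:
        p_joint = pxy.get((x, yval), 0.0)
        if p_joint > 0.0:
            p_x_given_y = p_joint / py[yval]
            total += p_x_given_y * math.log2(p_x_given_y / px[x])
    return total

# Williams-Beer redundancy I_min: expected minimum specific information.
py = probs(ys)
shared = sum(py[v] * min(specific_info(x1s, ys, v), specific_info(x2s, ys, v))
             for v in py)
unique1 = mutual_info(x1s, ys) - shared          # unique to X1
unique2 = mutual_info(x2s, ys) - shared          # unique to X2
joint = mutual_info(list(zip(x1s, x2s)), ys)     # I(X1,X2 ; Y)
synergy = joint - unique1 - unique2 - shared

print(f"shared={shared:.3f} unique={unique1:.3f},{unique2:.3f} synergy={synergy:.3f}")
```

For XOR the decomposition puts the entire joint bit into the synergy term, with zero unique and shared information: exactly the case the abstract highlights, where the output depends on the combination of inputs rather than on any input alone.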


Similar Articles

1
Partial information decomposition as a unified approach to the specification of neural goal functions.
Brain Cogn. 2017 Mar;112:25-38. doi: 10.1016/j.bandc.2015.09.004. Epub 2015 Oct 21.
2
Coherent Infomax as a computational goal for neural systems.
Bull Math Biol. 2011 Feb;73(2):344-72. doi: 10.1007/s11538-010-9564-x. Epub 2010 Sep 4.
3
Implications of Information Theory for Computational Modeling of Schizophrenia.
Comput Psychiatr. 2017 Oct 1;1:82-101. doi: 10.1162/CPSY_a_00004. eCollection 2017 Oct.
4
Information processing in dendrites II. Information theoretic complexity.
Neural Netw. 2001 Oct;14(8):1005-22. doi: 10.1016/s0893-6080(01)00085-5.
5
Engineering Aspects of Olfaction
6
Combinatorial neural codes from a mathematical coding theory perspective.
Neural Comput. 2013 Jul;25(7):1891-925. doi: 10.1162/NECO_a_00459.
7
Performance of a Computational Model of the Mammalian Olfactory System
8
Information-theoretic analyses of neural data to minimize the effect of researchers' assumptions in predictive coding studies.
PLoS Comput Biol. 2023 Nov 17;19(11):e1011567. doi: 10.1371/journal.pcbi.1011567. eCollection 2023 Nov.
9
A review of predictive coding algorithms.
Brain Cogn. 2017 Mar;112:92-97. doi: 10.1016/j.bandc.2015.11.003. Epub 2016 Jan 19.
10
Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems.
Phys Rev E Stat Nonlin Soft Matter Phys. 2015 May;91(5):052802. doi: 10.1103/PhysRevE.91.052802. Epub 2015 May 8.

Cited By

1
Characterising high-order interdependence via entropic conjugation.
Commun Phys. 2025;8(1):347. doi: 10.1038/s42005-025-02250-7. Epub 2025 Aug 23.
2
Higher-order and distributed synergistic functional interactions encode information gain in goal-directed learning.
Nat Commun. 2025 Aug 5;16(1):7179. doi: 10.1038/s41467-025-62507-1.
3
Interdependence patterns of multifrequency oscillations predict visuomotor behavior.
Netw Neurosci. 2025 May 8;9(2):712-742. doi: 10.1162/netn_a_00440. eCollection 2025.
4
Synergistic small worlds that drive technological sophistication.
PNAS Nexus. 2025 Mar 26;4(4):pgaf102. doi: 10.1093/pnasnexus/pgaf102. eCollection 2025 Apr.
5
MINT: A toolbox for the analysis of multivariate neural information coding and transmission.
PLoS Comput Biol. 2025 Apr 15;21(4):e1012934. doi: 10.1371/journal.pcbi.1012934. eCollection 2025 Apr.
6
Broadcast Channel Cooperative Gain: An Operational Interpretation of Partial Information Decomposition.
Entropy (Basel). 2025 Mar 15;27(3):310. doi: 10.3390/e27030310.
7
A general framework for interpretable neural learning based on local information-theoretic goal functions.
Proc Natl Acad Sci U S A. 2025 Mar 11;122(10):e2408125122. doi: 10.1073/pnas.2408125122. Epub 2025 Mar 5.
8
Dissipation Alters Modes of Information Encoding in Small Quantum Reservoirs near Criticality.
Entropy (Basel). 2025 Jan 18;27(1):88. doi: 10.3390/e27010088.
9
Partial Information Decomposition: Redundancy as Information Bottleneck.
Entropy (Basel). 2024 Jun 26;26(7):546. doi: 10.3390/e26070546.
10
A Partial Information Decomposition for Multivariate Gaussian Systems Based on Information Geometry.
Entropy (Basel). 2024 Jun 25;26(7):542. doi: 10.3390/e26070542.