Lorenz Gabriel Matías, Engel Nicola Marie, Celotto Marco, Koçillari Loren, Curreli Sebastiano, Fellin Tommaso, Panzeri Stefano
Institute for Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), Hamburg, Germany.
Optical Approaches to Brain Function Laboratory, Istituto Italiano di Tecnologia, Genova, Italy.
PLoS Comput Biol. 2025 Apr 15;21(4):e1012934. doi: 10.1371/journal.pcbi.1012934. eCollection 2025 Apr.
Information theory has deeply influenced the conceptualization of brain information processing and is a mainstream framework for analyzing how neural networks in the brain process information to generate behavior. Information-theoretic tools were initially conceived and used to study how information about sensory variables is encoded by the activity of small neural populations. However, recent multivariate information-theoretic advances have made it possible to address how information is exchanged across areas and used to inform behavior. Moreover, the integration of these tools with dimensionality-reduction techniques has enabled the study of information encoding and communication by the activity of large neural populations or many brain areas, as recorded by multichannel activity measurements in functional imaging and electrophysiology. Here, we provide a Multivariate Information in Neuroscience Toolbox (MINT) that combines these new methods with statistical tools for robust estimation from limited-size empirical datasets. We demonstrate the capabilities of MINT by applying it to both simulated and real neural data recorded with electrophysiology or calcium imaging, but all MINT functions are equally applicable to other brain-activity measurement modalities. We highlight the synergistic opportunities that combining its methods affords for reverse engineering specific information processing and flow between neural populations or areas, and for discovering how information processing functions emerge from interactions between neurons or areas. MINT works on Linux, Windows, and macOS operating systems, is written in MATLAB (requires MATLAB version 2018b or newer), and depends on 4 native MATLAB toolboxes. One of the available methods for computing information redundancy additionally requires the installation and compilation of C files (which we also provide in pre-compiled form).
MINT is freely available at https://github.com/panzerilab/MINT with DOI doi.org/10.5281/zenodo.13998526 and operates under a GNU GPLv3 license.
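MINT's own MATLAB API is not reproduced here. As a generic, toolbox-independent illustration of the core quantity underlying such analyses, the following Python sketch estimates the mutual information between a discrete stimulus and a discretized neural response with the plug-in estimator, and uses shuffling of stimulus labels as a simple correction for the limited-sampling bias that motivates the statistical tools mentioned above. All variable names and the simulated data are hypothetical; this is not MINT code.

```python
import numpy as np

def mutual_information(s, r):
    """Plug-in estimate of I(S;R) in bits for discrete integer arrays s, r."""
    s = np.asarray(s)
    r = np.asarray(r)
    # Joint histogram of stimulus and response values.
    joint = np.zeros((s.max() + 1, r.max() + 1))
    for si, ri in zip(s, r):
        joint[si, ri] += 1
    p = joint / joint.sum()                    # joint probabilities
    ps = p.sum(axis=1, keepdims=True)          # stimulus marginal
    pr = p.sum(axis=0, keepdims=True)          # response marginal
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (ps @ pr)[nz])).sum())

rng = np.random.default_rng(0)
stim = rng.integers(0, 2, 1000)                          # binary stimulus
noise = (rng.random(1000) < 0.2).astype(int)             # 20% bit flips
resp = (stim + noise) % 2                                # noisy binary response

info = mutual_information(stim, resp)
# Shuffling stimulus labels destroys the stimulus-response association;
# the residual information estimates the limited-sampling bias.
null = np.mean([mutual_information(rng.permutation(stim), resp)
                for _ in range(20)])
corrected = info - null
```

For this noisy binary channel the corrected estimate approaches the analytical value 1 - H(0.2) of roughly 0.28 bits as the number of trials grows, while the shuffled null shrinks toward zero.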