Individual differences among deep neural network models.

Author Affiliations

MRC Cognition and Brain Sciences Unit, University of Cambridge, 15 Chaucer Road, Cambridge, CB2 7EF, UK.

Zuckerman Institute, Columbia University, 3227 Broadway, New York, NY, 10027, USA.

Publication Information

Nat Commun. 2020 Nov 12;11(1):5725. doi: 10.1038/s41467-020-19632-w.

Abstract

Deep neural networks (DNNs) excel at visual recognition tasks and are increasingly used as a modeling framework for neural computations in the primate brain. Just like individual brains, each DNN has a unique connectivity and representational profile. Here, we investigate individual differences among DNN instances that arise from varying only the random initialization of the network weights. Using tools typically employed in systems neuroscience, we show that this minimal change in initial conditions prior to training leads to substantial differences in intermediate and higher-level network representations despite similar network-level classification performance. We locate the origins of the effects in an under-constrained alignment of category exemplars, rather than misaligned category centroids. These results call into question the common practice of using single networks to derive insights into neural information processing and rather suggest that computational neuroscientists working with DNNs may need to base their inferences on groups of multiple network instances.
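The comparison summarized in the abstract (architecturally identical networks trained on the same data, differing only in the random seed used to initialize their weights, then compared with representational-similarity tools from systems neuroscience) can be illustrated roughly as follows. This is a minimal sketch, not the authors' published code: the toy architecture, synthetic data, probe stimuli, and hyperparameters are illustrative assumptions.

```python
# Sketch: train two CNNs that differ only in their weight-initialization seed,
# then compare their penultimate-layer representations via representational
# dissimilarity matrices (RDMs). Architecture and data are toy placeholders.
import torch
import torch.nn as nn
from scipy.stats import spearmanr


def make_net(seed: int) -> nn.Module:
    """Build the same small CNN, varying only the weight-initialization seed."""
    torch.manual_seed(seed)
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(4), nn.Flatten(),
        nn.Linear(16 * 4 * 4, 64), nn.ReLU(),   # "penultimate" layer
        nn.Linear(64, 10),                      # classifier head
    )


def train(net: nn.Module, x: torch.Tensor, y: torch.Tensor, steps: int = 200) -> None:
    """Brief full-batch training; both instances see identical data."""
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(net(x), y).backward()
        opt.step()


def rdm(acts: torch.Tensor) -> torch.Tensor:
    """RDM: 1 - Pearson correlation between the activation patterns of stimuli."""
    return 1.0 - torch.corrcoef(acts)


# Synthetic stand-in for an image dataset (illustrative only).
torch.manual_seed(0)
x_train = torch.randn(256, 3, 16, 16)
y_train = torch.randint(0, 10, (256,))
probe = torch.randn(50, 3, 16, 16)          # fixed probe stimuli shown to both nets

nets = [make_net(seed) for seed in (1, 2)]  # identical except for initialization
for net in nets:
    train(net, x_train, y_train)

# Penultimate-layer activations for the probe set (all layers except the head).
with torch.no_grad():
    rdms = [rdm(net[:-1](probe)) for net in nets]

# Correlate the two instances' representational geometries (upper triangle only).
iu = torch.triu_indices(50, 50, offset=1)
rho, _ = spearmanr(rdms[0][iu[0], iu[1]].numpy(), rdms[1][iu[0], iu[1]].numpy())
print(f"Between-instance RDM correlation (Spearman rho): {rho:.3f}")
```

The RDMs are correlated over their upper triangles only, following the usual convention in representational similarity analysis, so the trivial zero diagonal does not inflate the measured agreement between instances.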

Fig. 1 (article figure): https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f6b/7665054/902702a2d452/41467_2020_19632_Fig1_HTML.jpg
