

A tale of two lexica: Investigating computational pressures on word representation with neural networks.

Authors

Avcu Enes, Hwang Michael, Brown Kevin Scott, Gow David W

Affiliations

Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston, MA, United States.

Harvard College, Boston, MA, United States.

Publication

Front Artif Intell. 2023 Mar 27;6:1062230. doi: 10.3389/frai.2023.1062230. eCollection 2023.

DOI: 10.3389/frai.2023.1062230
PMID: 37051161
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10083378/
Abstract

INTRODUCTION

The notion of a single localized store of word representations has become increasingly less plausible as evidence has accumulated for the widely distributed neural representation of wordform grounded in motor, perceptual, and conceptual processes. Here, we attempt to combine machine learning methods and neurobiological frameworks to propose a computational model of brain systems potentially responsible for wordform representation. We tested the hypothesis that the functional specialization of word representation in the brain is driven partly by computational optimization. This hypothesis directly addresses the unique problem of mapping sound and articulation vs. mapping sound and meaning.

RESULTS

We found that artificial neural networks trained on the mapping between sound and articulation performed poorly in recognizing the mapping between sound and meaning and vice versa. Moreover, a network trained on both tasks simultaneously could not discover the features required for efficient mapping between sound and higher-level cognitive states compared to the other two models. Furthermore, these networks developed internal representations reflecting specialized task-optimized functions without explicit training.

DISCUSSION

Together, these findings demonstrate that different task-directed representations lead to more focused responses and better performance of a machine or algorithm and, hypothetically, the brain. Thus, we imply that the functional specialization of word representation mirrors a computational optimization strategy given the nature of the tasks that the human brain faces.
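The cross-task dissociation reported above can be illustrated with a small, self-contained sketch. This is not the authors' actual model: the synthetic data, network sizes, and training settings below are invented stand-ins. One network learns a structured sound-to-articulation mapping, another learns an arbitrary sound-to-meaning mapping, and a linear probe then reads out "meaning" from each network's hidden features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two mappings: "articulation" targets are a
# smooth function of the sound input, while "meaning" targets are an arbitrary
# (random) pairing, reflecting the largely arbitrary wordform-meaning relation.
n, d_in, d_art, n_sem = 200, 16, 8, 10
X = rng.standard_normal((n, d_in))                       # synthetic "sound" features
Y_art = np.tanh(X @ rng.standard_normal((d_in, d_art)))  # structured mapping
Y_sem = np.eye(n_sem)[rng.integers(0, n_sem, n)]         # arbitrary mapping

def train_mlp(X, Y, hidden=32, epochs=500, lr=0.05, seed=1):
    """Full-batch gradient descent on a one-hidden-layer tanh MLP, squared loss.
    Returns (initial loss, final loss, final hidden representations)."""
    r = np.random.default_rng(seed)
    W1 = 0.1 * r.standard_normal((X.shape[1], hidden))
    W2 = 0.1 * r.standard_normal((hidden, Y.shape[1]))
    loss0 = loss = None
    for _ in range(epochs):
        H = np.tanh(X @ W1)
        err = H @ W2 - Y
        loss = (err ** 2).mean()
        if loss0 is None:
            loss0 = loss
        dW2 = H.T @ err / len(X)                          # gradient w.r.t. output weights
        dW1 = X.T @ ((err @ W2.T) * (1 - H ** 2)) / len(X)  # backprop through tanh
        W1 -= lr * dW1
        W2 -= lr * dW2
    return loss0, loss, np.tanh(X @ W1)

def linear_readout_loss(H, Y):
    """Best linear readout from fixed hidden features H to targets Y."""
    W, *_ = np.linalg.lstsq(H, Y, rcond=None)
    return ((H @ W - Y) ** 2).mean()

l0_art, l_art, H_art = train_mlp(X, Y_art)
l0_sem, l_sem, H_sem = train_mlp(X, Y_sem)

# Cross-task probe: read out "meaning" from features trained on articulation,
# and compare against features trained on the meaning task itself.
cross = linear_readout_loss(H_art, Y_sem)
own = linear_readout_loss(H_sem, Y_sem)
print(f"articulation loss: {l0_art:.3f} -> {l_art:.3f}")
print(f"meaning loss:      {l0_sem:.3f} -> {l_sem:.3f}")
print(f"meaning readout from articulation features: {cross:.3f} vs. own features: {own:.3f}")
```

On synthetic data of this kind, the meaning readout typically degrades when taken from articulation-trained features, loosely mirroring the specialization result; the sketch is illustrative only.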

Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/eb86/10083378/83804c2e8ca4/frai-06-1062230-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/eb86/10083378/bfc0fffa7f10/frai-06-1062230-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/eb86/10083378/3d3f82f24050/frai-06-1062230-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/eb86/10083378/f44ae41789f6/frai-06-1062230-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/eb86/10083378/36078ab4c0fe/frai-06-1062230-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/eb86/10083378/696420b2b47f/frai-06-1062230-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/eb86/10083378/26f136738273/frai-06-1062230-g0007.jpg

Similar Articles

1. A tale of two lexica: Investigating computational pressures on word representation with neural networks.
Front Artif Intell. 2023 Mar 27;6:1062230. doi: 10.3389/frai.2023.1062230. eCollection 2023.
2. Brain-like functional specialization emerges spontaneously in deep neural networks.
Sci Adv. 2022 Mar 18;8(11):eabl8913. doi: 10.1126/sciadv.abl8913. Epub 2022 Mar 16.
3. Deep Artificial Neural Networks Reveal a Distributed Cortical Network Encoding Propositional Sentence-Level Meaning.
J Neurosci. 2021 May 5;41(18):4100-4119. doi: 10.1523/JNEUROSCI.1152-20.2021. Epub 2021 Mar 22.
4. Increased functional connectivity in the ventral and dorsal streams during retrieval of novel words in professional musicians.
Hum Brain Mapp. 2018 Feb;39(2):722-734. doi: 10.1002/hbm.23877. Epub 2017 Nov 3.
5. Neural representation of phonological wordform in bilateral posterior temporal cortex.
bioRxiv. 2023 Jul 21:2023.07.19.549751. doi: 10.1101/2023.07.19.549751.
6. Heteromodal Cortical Areas Encode Sensory-Motor Features of Word Meaning.
J Neurosci. 2016 Sep 21;36(38):9763-9. doi: 10.1523/JNEUROSCI.4095-15.2016.
7. Neural dynamics underlying the acquisition of distinct auditory category structures.
Neuroimage. 2021 Dec 1;244:118565. doi: 10.1016/j.neuroimage.2021.118565. Epub 2021 Sep 17.
8. Neural representation of phonological wordform in temporal cortex.
Psychon Bull Rev. 2024 Dec;31(6):2659-2671. doi: 10.3758/s13423-024-02511-6. Epub 2024 Apr 30.
9. Recognizing clinical entities in hospital discharge summaries using Structural Support Vector Machines with word representation features.
BMC Med Inform Decis Mak. 2013;13 Suppl 1(Suppl 1):S1. doi: 10.1186/1472-6947-13-S1-S1. Epub 2013 Apr 5.
10. Neural learning rules for generating flexible predictions and computing the successor representation.
Elife. 2023 Mar 16;12:e80680. doi: 10.7554/eLife.80680.

References Cited in This Article

1. CNNs reveal the computational implausibility of the expertise hypothesis.
iScience. 2023 Jan 14;26(2):105976. doi: 10.1016/j.isci.2023.105976. eCollection 2023 Feb 17.
2. Using artificial neural networks to ask 'why' questions of minds and brains.
Trends Neurosci. 2023 Mar;46(3):240-254. doi: 10.1016/j.tins.2022.12.008. Epub 2023 Jan 17.
3. A neural population selective for song in human auditory cortex.
Curr Biol. 2022 Mar 28;32(6):1454-1455. doi: 10.1016/j.cub.2022.03.016.
4. Brain-like functional specialization emerges spontaneously in deep neural networks.
Sci Adv. 2022 Mar 18;8(11):eabl8913. doi: 10.1126/sciadv.abl8913. Epub 2022 Mar 16.
5. A neural population selective for song in human auditory cortex.
Curr Biol. 2022 Apr 11;32(7):1470-1484.e12. doi: 10.1016/j.cub.2022.01.069. Epub 2022 Feb 22.
6. Brains and algorithms partially converge in natural language processing.
Commun Biol. 2022 Feb 16;5(1):134. doi: 10.1038/s42003-022-03036-1.
7. Speech Computations of the Human Superior Temporal Gyrus.
Annu Rev Psychol. 2022 Jan 4;73:79-102. doi: 10.1146/annurev-psych-022321-035256. Epub 2021 Oct 21.
8. Word meaning in minds and machines.
Psychol Rev. 2023 Mar;130(2):401-431. doi: 10.1037/rev0000297. Epub 2021 Jul 22.
9. Controversial stimuli: Pitting neural networks against each other as models of human cognition.
Proc Natl Acad Sci U S A. 2020 Nov 24;117(47):29330-29337. doi: 10.1073/pnas.1912334117.
10. If deep learning is the answer, what is the question?
Nat Rev Neurosci. 2021 Jan;22(1):55-67. doi: 10.1038/s41583-020-00395-8. Epub 2020 Nov 16.