Incremental Bayesian Category Learning From Natural Language.

Author Information

Frermann Lea, Lapata Mirella

Affiliations

Institute for Language, Cognition and Computation, School of Informatics, University of Edinburgh.

Publication Information

Cogn Sci. 2016 Aug;40(6):1333-81. doi: 10.1111/cogs.12304. Epub 2015 Nov 2.

DOI: 10.1111/cogs.12304
PMID: 26534863
Abstract

Models of category learning have been extensively studied in cognitive science and primarily tested on perceptual abstractions or artificial stimuli. In this paper, we focus on categories acquired from natural language stimuli, that is, words (e.g., chair is a member of the furniture category). We present a Bayesian model that, unlike previous work, learns both categories and their features in a single process. We model category induction as two interrelated subproblems: (a) the acquisition of features that discriminate among categories, and (b) the grouping of concepts into categories based on those features. Our model learns categories incrementally using particle filters, a sequential Monte Carlo method commonly used for approximate probabilistic inference that sequentially integrates newly observed data and can be viewed as a plausible mechanism for human learning. Experimental results show that our incremental learner obtains meaningful categories which yield a closer fit to behavioral data compared to related models while at the same time acquiring features which characterize the learned categories. (An earlier version of this work was published in Frermann and Lapata.)
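The abstract describes the learning procedure only at a high level. The following is a minimal, illustrative Python sketch of the general idea, not the authors' implementation: it assumes concepts arrive one at a time as bags of context-word features, that each particle maintains one candidate partition of the concepts seen so far under a Chinese restaurant process prior, that each category emits features from a Dirichlet-multinomial distribution, and that particles are reweighted and resampled as new observations are integrated. All names, hyperparameters, and the toy feature representation are assumptions made for illustration.

```python
# Minimal sketch of incremental Bayesian category learning with a particle filter.
# NOT the authors' code: the CRP prior and Dirichlet-multinomial features are assumptions.
import copy
import math
import random
from collections import Counter

ALPHA = 1.0   # CRP concentration: willingness to open a new category (assumed)
BETA = 0.1    # Dirichlet smoothing over the feature vocabulary (assumed)
VOCAB = 50    # size of the feature vocabulary in the toy example (assumed)

class Particle:
    """One hypothesis: a partition of the concepts seen so far into categories."""
    def __init__(self):
        self.assignments = []       # category id assigned to each observed concept
        self.cat_sizes = Counter()  # number of concepts per category
        self.cat_feats = {}         # category id -> Counter of feature counts
        self.log_weight = 0.0       # accumulated log importance weight

    def _log_pred(self, cat, feats):
        """Log Dirichlet-multinomial predictive probability of `feats` under `cat`."""
        counts = Counter(self.cat_feats.get(cat, Counter()))
        total = sum(counts.values())
        logp = 0.0
        for f, n in feats.items():
            for _ in range(n):
                logp += math.log((counts[f] + BETA) / (total + BETA * VOCAB))
                counts[f] += 1
                total += 1
        return logp

    def observe(self, feats):
        """Assign a new concept to an old or new category and update the weight."""
        cats = list(self.cat_sizes) + [max(self.cat_sizes, default=-1) + 1]
        n_seen = len(self.assignments)
        log_scores = []
        for c in cats:
            prior = self.cat_sizes[c] if c in self.cat_sizes else ALPHA  # CRP prior
            log_scores.append(math.log(prior / (n_seen + ALPHA)) + self._log_pred(c, feats))
        m = max(log_scores)
        probs = [math.exp(s - m) for s in log_scores]
        self.log_weight += m + math.log(sum(probs))  # marginal likelihood of the concept
        chosen = random.choices(cats, weights=probs)[0]
        self.assignments.append(chosen)
        self.cat_sizes[chosen] += 1
        self.cat_feats.setdefault(chosen, Counter()).update(feats)

def particle_filter(stream, n_particles=20, seed=0):
    """Process concepts one at a time; resample particles when weights degenerate."""
    random.seed(seed)
    particles = [Particle() for _ in range(n_particles)]
    for feats in stream:
        for p in particles:
            p.observe(feats)
        m = max(p.log_weight for p in particles)
        weights = [math.exp(p.log_weight - m) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        ess = 1.0 / sum(w * w for w in weights)       # effective sample size
        if ess < n_particles / 2:                     # resample on degeneracy
            particles = [copy.deepcopy(random.choices(particles, weights=weights)[0])
                         for _ in range(n_particles)]
            for p in particles:
                p.log_weight = 0.0
    return particles

if __name__ == "__main__":
    # Toy stream of 20 concepts: the first 10 draw features from ids 0-24, the
    # last 10 from ids 25-49, so roughly two categories should emerge.
    stream = ([Counter(random.choices(range(25), k=10)) for _ in range(10)]
              + [Counter(random.choices(range(25, 50), k=10)) for _ in range(10)])
    best = max(particle_filter(stream), key=lambda p: p.log_weight)
    print(best.assignments)  # e.g. ten 0s followed by ten 1s, up to label permutation
```

Resampling when the effective sample size drops keeps inference strictly incremental in this sketch: each concept is processed once and never revisited, which is the property the abstract highlights as a plausible mechanism for human learning.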

Similar Articles

1. Incremental Bayesian Category Learning From Natural Language. Cogn Sci. 2016 Aug;40(6):1333-81. doi: 10.1111/cogs.12304. Epub 2015 Nov 2.
2. Learning overhypotheses with hierarchical Bayesian models. Dev Sci. 2007 May;10(3):307-21. doi: 10.1111/j.1467-7687.2007.00585.x.
3. Adding sentence types to a model of syntactic category acquisition. Top Cogn Sci. 2013 Jul;5(3):495-521. doi: 10.1111/tops.12030. Epub 2013 Jun 7.
4. A role for the developing lexicon in phonetic category acquisition. Psychol Rev. 2013 Oct;120(4):751-78. doi: 10.1037/a0034245.
5. Learning abstract visual concepts via probabilistic program induction in a Language of Thought. Cognition. 2017 Nov;168:320-334. doi: 10.1016/j.cognition.2017.07.005. Epub 2017 Aug 1.
6. Bootstrapping language acquisition. Cognition. 2017 Jul;164:116-143. doi: 10.1016/j.cognition.2017.02.009. Epub 2017 Apr 13.
7. Contrast effects in typicality judgements: a hierarchical Bayesian approach. Q J Exp Psychol (Hove). 2012;65(9):1721-39. doi: 10.1080/17470218.2012.662237. Epub 2012 Apr 27.
8. Why Higher Working Memory Capacity May Help You Learn: Sampling, Search, and Degrees of Approximation. Cogn Sci. 2019 Dec;43(12):e12805. doi: 10.1111/cogs.12805.
9. A probabilistic model of cross-categorization. Cognition. 2011 Jul;120(1):1-25. doi: 10.1016/j.cognition.2011.02.010. Epub 2011 Mar 4.
10. Uncovering contrast categories in categorization with a probabilistic threshold model. J Exp Psychol Learn Mem Cogn. 2011 Nov;37(6):1515-31. doi: 10.1037/a0024431. Epub 2011 Jul 18.

Cited By

1. No frills: Simple regularities in language can go a long way in the development of word knowledge. Dev Sci. 2023 Jul;26(4):e13373. doi: 10.1111/desc.13373. Epub 2023 Feb 5.
2. Exposure to co-occurrence regularities in language drives semantic integration of new words. J Exp Psychol Learn Mem Cogn. 2022 Jul;48(7):1064-1081. doi: 10.1037/xlm0001122. Epub 2022 Apr 7.
3. The Emergence of Richly Organized Semantic Knowledge from Simple Statistics: A Synthetic Review. Dev Rev. 2021 Jun;60. doi: 10.1016/j.dr.2021.100949. Epub 2021 Mar 3.
4. Statistical regularities shape semantic organization throughout development. Cognition. 2020 May;198:104190. doi: 10.1016/j.cognition.2020.104190. Epub 2020 Feb 1.