

Using context to build semantics.

Author Information

Kwantes, Peter J.

Affiliation

Defence R&D Canada, Toronto, Ontario, Canada.

Publication Information

Psychon Bull Rev. 2005 Aug;12(4):703-10. doi: 10.3758/bf03196761.

Abstract

Latent semantic analysis (LSA) is a model of knowledge representation for words. It works by applying dimension reduction to local co-occurrence data from a large collection of documents after performing singular value decomposition on it. When the reduction is applied, the system forms condensed representations for the words that incorporate higher order associations. The higher order associations are primarily responsible for any semantic similarity between words in LSA. In this article, a memory model is described that creates semantic representations for words that are similar in form to those created by LSA. However, instead of applying dimension reduction, the model builds the representations by using a retrieval mechanism from a well-known account of episodic memory.
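For readers unfamiliar with the mechanics summarized above, the sketch below illustrates the general LSA-style pipeline: a word-by-document co-occurrence matrix is decomposed with singular value decomposition, only the k largest singular values are retained, and the resulting condensed word vectors are compared with cosine similarity. The toy corpus, the choice of k, and the use of NumPy are illustrative assumptions, not details from the article; Kwantes's own model instead builds its representations through an episodic-memory retrieval mechanism rather than dimension reduction.

```python
# Minimal sketch of an LSA-style pipeline (illustrative only; the toy corpus
# and the number of retained dimensions k are assumptions, not from the paper).
import numpy as np

# Toy word-by-document co-occurrence counts: rows = words, columns = documents.
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are pets",
]
vocab = sorted({w for d in docs for w in d.split()})
counts = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

# Singular value decomposition of the co-occurrence matrix.
U, s, Vt = np.linalg.svd(counts, full_matrices=False)

# Dimension reduction: keep only the k largest singular values, yielding
# condensed word representations that capture higher order associations.
k = 2
word_vectors = U[:, :k] * s[:k]

def cosine(a, b):
    """Cosine similarity between two word vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# "cat" and "dog" never co-occur in the same document, yet their reduced
# vectors can still be similar because they appear in similar contexts.
i, j = vocab.index("cat"), vocab.index("dog")
print(cosine(word_vectors[i], word_vectors[j]))
```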

