Representation Learning of Logic Words by an RNN: From Word Sequences to Robot Actions.

Authors

Yamada Tatsuro, Murata Shingo, Arie Hiroaki, Ogata Tetsuya

Affiliations

Department of Intermedia Art and Science, Waseda University, Tokyo, Japan.

Department of Modern Mechanical Engineering, Waseda University, Tokyo, Japan.

Publication

Front Neurorobot. 2017 Dec 22;11:70. doi: 10.3389/fnbot.2017.00070. eCollection 2017.

DOI: 10.3389/fnbot.2017.00070
PMID: 29311891
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5744442/
Abstract

An important characteristic of human language is compositionality. We can efficiently express a wide variety of real-world situations, events, and behaviors by compositionally constructing the meaning of a complex expression from a finite number of elements. Previous studies have analyzed how machine-learning models, particularly neural networks, can learn from experience to represent compositional relationships between language and robot actions with the aim of understanding the symbol grounding structure and achieving intelligent communicative agents. Such studies have mainly dealt with the words (nouns, adjectives, and verbs) that directly refer to real-world matters. In addition to these words, the current study deals with logic words, such as "not," "and," and "or" simultaneously. These words are not directly referring to the real world, but are logical operators that contribute to the construction of meaning in sentences. In human-robot communication, these words may be used often. The current study builds a recurrent neural network model with long short-term memory units and trains it to learn to translate sentences including logic words into robot actions. We investigate what kind of compositional representations, which mediate sentences and robot actions, emerge as the network's internal states via the learning process. Analysis after learning shows that referential words are merged with visual information and the robot's own current state, and the logical words are represented by the model in accordance with their functions as logical operators. Words such as "true," "false," and "not" work as non-linear transformations to encode orthogonal phrases into the same area in a memory cell state space. The word "and," which required a robot to lift up both its hands, worked as if it was a universal quantifier. The word "or," which required action generation that looked apparently random, was represented as an unstable space of the network's dynamical system.
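
Below is a minimal sketch of the kind of model the abstract describes: an LSTM encoder reads the word sequence, and its final hidden and memory-cell states condition a decoder that unfolds joint commands from visual features and the robot's current posture. This is not the authors' implementation; it assumes PyTorch, and the class name, layer sizes, and next-step regression loss are illustrative assumptions rather than details taken from the paper.

import torch
import torch.nn as nn

class LogicWordsToActions(nn.Module):
    """Illustrative sketch: word sequence -> LSTM encoder -> initial state of an
    LSTM decoder that predicts the robot's next joint angles from vision and
    its current joint state. All dimensions below are arbitrary assumptions."""

    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64,
                 vision_dim=10, joint_dim=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Decoder input at each time step: current joint angles + visual features.
        self.decoder = nn.LSTM(joint_dim + vision_dim, hidden_dim, batch_first=True)
        self.to_action = nn.Linear(hidden_dim, joint_dim)

    def forward(self, word_ids, vision, joints):
        # word_ids: (B, T_words); vision, joints: (B, T_steps, dim)
        _, (h, c) = self.encoder(self.embed(word_ids))   # sentence -> final (h, c)
        out, _ = self.decoder(torch.cat([joints, vision], dim=-1), (h, c))
        return self.to_action(out)                       # predicted next joint angles

# Usage with random data: regress next-step joint targets as the teacher signal.
model = LogicWordsToActions(vocab_size=20)
words = torch.randint(0, 20, (4, 6))     # e.g. tokenized "raise left hand and right hand"
vision = torch.randn(4, 30, 10)          # per-step visual features
joints = torch.randn(4, 30, 8)           # per-step current joint angles
targets = torch.randn(4, 30, 8)          # next-step joint angles
loss = nn.functional.mse_loss(model(words, vision, joints), targets)
loss.backward()

Seeding the decoder with the sentence's final (h, c) is the point of the sketch: under this assumption, words such as "not" or "and" can only influence behavior by reshaping that memory-cell state, which loosely mirrors the memory-cell state space the paper analyzes after training.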

Figures

Figures 1–15 (fnbot-11-00070-g0001 through g0015) are available in the full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5744442/

Similar Articles

1. Representation Learning of Logic Words by an RNN: From Word Sequences to Robot Actions.
Front Neurorobot. 2017 Dec 22;11:70. doi: 10.3389/fnbot.2017.00070. eCollection 2017.
2. The logic in language: How all quantifiers are alike, but each quantifier is different.
Cogn Psychol. 2016 Jun;87:29-52. doi: 10.1016/j.cogpsych.2016.04.002. Epub 2016 May 28.
3. A Cognitive Neural Architecture Able to Learn and Communicate through Natural Language.
PLoS One. 2015 Nov 11;10(11):e0140866. doi: 10.1371/journal.pone.0140866. eCollection 2015.
4. Learning Actions From Natural Language Instructions Using an ON-World Embodied Cognitive Architecture.
Front Neurorobot. 2021 May 13;15:626380. doi: 10.3389/fnbot.2021.626380. eCollection 2021.
5. Dynamical Integration of Language and Behavior in a Recurrent Neural Network for Human-Robot Interaction.
Front Neurorobot. 2016 Jul 15;10:5. doi: 10.3389/fnbot.2016.00005. eCollection 2016.
6. Grounding Action Words in the Sensorimotor Interaction with the World: Experiments with a Simulated iCub Humanoid Robot.
Front Neurorobot. 2010 May 31;4. doi: 10.3389/fnbot.2010.00007. eCollection 2010.
7. Cross-Situational Learning with Bayesian Generative Models for Multimodal Category and Word Learning in Robots.
Front Neurorobot. 2017 Dec 19;11:66. doi: 10.3389/fnbot.2017.00066. eCollection 2017.
8. Learning the Meanings of Function Words From Grounded Language Using a Visual Question Answering Model.
Cogn Sci. 2024 May;48(5):e13448. doi: 10.1111/cogs.13448.
9. Bootstrapping language acquisition.
Cognition. 2017 Jul;164:116-143. doi: 10.1016/j.cognition.2017.02.009. Epub 2017 Apr 13.
10. Exploring the acquisition and production of grammatical constructions through human-robot interaction with echo state networks.
Front Neurorobot. 2014 May 6;8:16. doi: 10.3389/fnbot.2014.00016. eCollection 2014.

Cited By

1. Learning Actions From Natural Language Instructions Using an ON-World Embodied Cognitive Architecture.
Front Neurorobot. 2021 May 13;15:626380. doi: 10.3389/fnbot.2021.626380. eCollection 2021.
2. Crossmodal Language Grounding in an Embodied Neurocognitive Model.
Front Neurorobot. 2020 Oct 14;14:52. doi: 10.3389/fnbot.2020.00052. eCollection 2020.

References

1. Dynamical Integration of Language and Behavior in a Recurrent Neural Network for Human-Robot Interaction.
Front Neurorobot. 2016 Jul 15;10:5. doi: 10.3389/fnbot.2016.00005. eCollection 2016.
2. Exploring the acquisition and production of grammatical constructions through human-robot interaction with echo state networks.
Front Neurorobot. 2014 May 6;8:16. doi: 10.3389/fnbot.2014.00016. eCollection 2014.
3. Toward a self-organizing pre-symbolic neural model representing sensorimotor primitives.
Front Behav Neurosci. 2014 Feb 4;8:22. doi: 10.3389/fnbeh.2014.00022. eCollection 2014.
4. The grounding of higher order concepts in action and language: a cognitive robotics model.
Neural Netw. 2012 Aug;32:165-73. doi: 10.1016/j.neunet.2012.02.012. Epub 2012 Feb 14.
5. A neurodynamic account of spontaneous behaviour.
PLoS Comput Biol. 2011 Oct;7(10):e1002221. doi: 10.1371/journal.pcbi.1002221. Epub 2011 Oct 20.
6. Emergence of hierarchical structure mirroring linguistic composition in a recurrent neural network.
Neural Netw. 2011 May;24(4):311-20. doi: 10.1016/j.neunet.2010.12.006. Epub 2011 Jan 12.
7. Natural language from artificial life.
Artif Life. 2002;8(2):185-215. doi: 10.1162/106454602320184248.
8. Long short-term memory.
Neural Comput. 1997 Nov 15;9(8):1735-80. doi: 10.1162/neco.1997.9.8.1735.