Department of Plant Systematics, Ecology and Theoretical Biology, Eötvös Loránd University of Sciences, Budapest, Hungary.
J Exp Psychol Learn Mem Cogn. 2012 May;38(3):776-82. doi: 10.1037/a0026986. Epub 2012 Jan 23.
Center-embedded recursion (CER) in natural language is exemplified by sentences such as "The malt that the rat ate lay in the house." Parsing center-embedded structures has attracted particular attention because it may be one of the cognitive capacities that distinguish humans from all other animals. The ability to parse CER is usually tested by means of artificial grammar learning (AGL) tasks, in which participants must infer the rule from a set of artificial sentences. One surprising result of previous AGL experiments is that learning CER is not as easy as had been thought. We hypothesized that because artificial sentences lack semantic content, semantics could help humans learn the syntax of center-embedded sentences. To test this, we composed sentences from 4 vocabularies that differed in degree of semantic content along 3 factors (familiarity of words, meaning of words, and semantic relationship between words). According to our results, these factors had no effect individually, but combined they made learning significantly faster. This suggests that different mechanisms are at work when CER is parsed in natural and in artificial languages. This finding calls into question the suitability of AGL tasks with artificial vocabularies for studying the learning and processing of linguistic CER.
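The nested-dependency structure described above can be sketched in code. This is a minimal illustration, not the paper's actual stimulus-generation procedure: the vocabularies below are hypothetical, and the grammar is the standard A₁A₂…AₙBₙ…B₂B₁ scheme used in typical AGL studies of CER, where each opening word must be closed by its matching word in reverse (nested) order.

```python
import random

# Hypothetical paired vocabularies: noun i must be matched by verb i.
# (Illustrative only; not the stimuli used in the study.)
NOUNS = ["rat", "malt", "cat"]
VERBS = ["ate", "lay", "chased"]

def center_embedded(depth):
    """Build an A1 A2 ... An Bn ... B2 B1 string: each noun opens a
    dependency that its paired verb closes in reverse (nested) order."""
    idx = [random.randrange(len(NOUNS)) for _ in range(depth)]
    nouns = [NOUNS[i] for i in idx]           # opening elements, in order
    verbs = [VERBS[i] for i in reversed(idx)]  # closing elements, mirrored
    return " ".join(nouns + verbs)

# A depth-2 string, e.g. "malt rat ate lay": the innermost pair
# (rat, ate) is fully contained within the outer pair (malt, lay).
print(center_embedded(2))
```

The mirrored pairing is what makes these strings hard to learn from form alone: a learner must track which opening element each closing element belongs to, across intervening material.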