
Scaffolding learning: From specific to generic with large language models.

Affiliations

Lynbrook High School, San Jose, CA, United States of America.

Airbnb, San Francisco, CA, United States of America.

Publication Information

PLoS One. 2024 Sep 20;19(9):e0310409. doi: 10.1371/journal.pone.0310409. eCollection 2024.

Abstract

Large language models such as ChatGPT have been shown to excel at solving complex math problems. However, they cannot solve basic arithmetic problems such as 758*639 = 484,362. This makes us ponder whether LLMs have been trained to solve math and science problems in the right way. When a student learns math at school, she or he starts with arithmetic, then moves on to word problems, polynomials, and calculus. Each skill she or he acquires is used in the next stage to solve more advanced problems. In this paper we propose Scaffolding Learning for LLMs, which imitates how a student learns a subject in a step-by-step manner. For example, we first train an LLM to perform highly specific operations such as multiplication and division, and then apply such "skills" to a more generic task such as solving word problems. This is related to Curriculum Training, which trains a model on tasks in a specific order, such as training on easy tasks first and then gradually increasing the difficulty. Our proposed approach goes from specific tasks to generic ones, which can be considered a special case of Curriculum Training. Our empirical studies show that when an LLM has "mastered" a specific skill, only a small amount of training is required to teach it to apply that skill to a more generic application.
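To make the two-stage "specific to generic" idea concrete, below is a minimal sketch of such a training pipeline. It assumes a Hugging Face causal-LM fine-tuning setup; the base model (gpt2), the synthetic arithmetic and word-problem data, and the hyperparameters are illustrative assumptions, not the configuration used in the paper.

```python
# Sketch of Scaffolding Learning as two sequential fine-tuning stages:
# Stage 1 trains a "specific" skill (multiplication), Stage 2 applies it
# to a more "generic" task (word problems) with far fewer examples.
# All data, model, and hyperparameter choices here are placeholders.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

MODEL_NAME = "gpt2"  # placeholder base model

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)


def make_dataset(texts):
    """Tokenize plain-text examples for causal-LM fine-tuning."""
    ds = Dataset.from_dict({"text": texts})
    return ds.map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
        batched=True, remove_columns=["text"],
    )


def finetune(train_ds, epochs, output_dir):
    """One training stage; later stages reuse the weights of earlier ones."""
    args = TrainingArguments(output_dir=output_dir,
                             num_train_epochs=epochs,
                             per_device_train_batch_size=8,
                             logging_steps=50)
    collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)
    Trainer(model=model, args=args, train_dataset=train_ds,
            data_collator=collator).train()


# Stage 1: "master" the specific skill with many arithmetic examples.
arithmetic = [f"{a} * {b} = {a * b}" for a in range(100, 150) for b in range(200, 250)]
finetune(make_dataset(arithmetic), epochs=3, output_dir="stage1_arithmetic")

# Stage 2: a small amount of generic training that reuses the skill.
word_problems = [
    "Q: A crate holds 112 apples. How many apples are in 215 crates? "
    f"A: 112 * 215 = {112 * 215}",
]
finetune(make_dataset(word_problems), epochs=1, output_dir="stage2_word_problems")
```

The point of the sketch is the ordering: the second stage starts from the weights produced by the first, so the generic task only needs a small number of additional examples, mirroring the paper's claim.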

Fig 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/06ca/11414939/9493e8f0da9c/pone.0310409.g001.jpg
