Division of Science and Mathematics Education, Michigan State University, East Lansing, MI 48824, USA.
CBE Life Sci Educ. 2011 Summer;10(2):149-55. doi: 10.1187/cbe.11-03-0019.
Concept inventories, consisting of multiple-choice questions built around common student misconceptions, are designed to reveal student thinking. However, students often have complex, heterogeneous ideas about scientific concepts. Constructed-response assessments, in which students must create their own answer, may better reveal students' thinking, but they are time- and resource-intensive to evaluate. This report describes the initial meeting of a National Science Foundation-funded cross-institutional collaboration of interdisciplinary science, technology, engineering, and mathematics (STEM) education researchers interested in exploring the use of automated text analysis to evaluate constructed-response assessments. Participants at the meeting shared existing work on lexical analysis and concept inventories, participated in technology demonstrations and workshops, and discussed research goals. We are seeking interested collaborators to join our research community.