Department of Ecology and Evolutionary Biology, Cornell University, Ithaca, NY, United States of America.
Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY, United States of America.
PLoS One. 2022 Aug 30;17(8):e0273337. doi: 10.1371/journal.pone.0273337. eCollection 2022.
Critical thinking is the process by which people decide what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, and recommendations for how to assess it as part of instruction are mixed. Here we evaluate the efficacy of assessment questions for probing students' critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments: the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and the Physics Lab Inventory of Critical Thinking (PLIC). These instruments present experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand which features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when evaluating one study at a time versus comparing and contrasting two, and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when comparing two studies than when evaluating each study individually. Moreover, compare-and-contrast questions alone are sufficient to elicit critical thinking: students provide similar answers regardless of whether the individual evaluation questions are included.
This research offers new insight into the types of assessment questions that elicit critical thinking at the introductory undergraduate level. Specifically, we recommend that instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.