Des Marchais J E, Vu N V
Université de Sherbrooke Faculty of Medicine, Quebec, Canada.
Acad Med. 1996 Mar;71(3):274-83. doi: 10.1097/00001888-199603000-00021.
Students' learning was used as an outcome measure in the first phases of the major curriculum reform started in 1987 by the Université de Sherbrooke Faculty of Medicine, which shifted from a traditional to a student-centered, problem-based learning (PBL) and community-oriented program. The system for evaluating preclinical students' learning is intended to reinforce the integration of basic and clinical sciences.
To discover whether the evaluation system was fulfilling its intended goals, the authors used data from the classes of 1991-1993 to assess the reliability and validity of three evaluation instruments. The three instruments were (1) written examinations composed of multiple-choice questions (MCQs), short-answer questions (SAQs), and problem-analysis questions (PAQs); (2) PBL tutor rating forms that evaluate students' reasoning skills, communication and group-interaction skills, and autonomy and humanism; and (3) clinical skills evaluations, including objective structured clinical examinations (OSCEs). The weights allocated to the instruments reflected how the faculty valued each evaluation dimension in each of the three phases of the preclinical curriculum.
Reliability indexes improved throughout the system's implementation. The written examinations proved to have content validity with respect to the PBL learning objectives. As evaluated by students, the PAQs were found to be at a taxonomic level that assessed the ability to analyze information in a third of cases during the first year of the PBL curriculum's implementation and in 17% during the second year. Variations and correlations of students' mean performances across instructional units and between the evaluation instruments led to the development of a longitudinal student performance profile to be used before yearly promotion decisions are proposed. The profile was introduced in the fifth year of PBL implementation.
The system allows students to learn higher-taxonomic-level content and fulfills the institution's social responsibility of judging program outcomes and promoting qualified students. However, evaluation by PBL tutors remains psychometrically questionable, and the measurement of students' reasoning and problem-analysis abilities is still an unfinished evaluation task.