Hall Joshua D, O'Connell Anna B, Cook Jeanette G
Office of Graduate Education, University of North Carolina School of Medicine, Chapel Hill, NC, United States of America.
Department of Biochemistry and Biophysics, University of North Carolina School of Medicine, Chapel Hill, NC, United States of America.
PLoS One. 2017 Jan 11;12(1):e0169121. doi: 10.1371/journal.pone.0169121. eCollection 2017.
Many US biomedical PhD programs receive more applications for admission than they can accept each year, necessitating a selective admissions process. Typical selection criteria include standardized test scores, undergraduate grade point average, letters of recommendation, a resume and/or personal statement highlighting relevant research or professional experience, and feedback from interviews with training faculty. Admissions decisions are often founded on assumptions that these application components correlate with research success in graduate school, but these assumptions have not been rigorously tested. We sought to determine if any application components were predictive of student productivity as measured by first-author student publications and time to degree completion. We collected productivity metrics for graduate students who entered the umbrella first-year biomedical PhD program at the University of North Carolina at Chapel Hill from 2008-2010 and analyzed components of their admissions applications. We found no correlations of test scores, grades, amount of previous research experience, or faculty interview ratings with high or low productivity among those applicants who were admitted and chose to matriculate at UNC. In contrast, ratings from recommendation letter writers were significantly stronger for students who published multiple first-author papers in graduate school than for those who published no first-author papers during the same timeframe. We conclude that the most commonly used standardized test (the general GRE) is a particularly ineffective predictive tool, but that qualitative assessments by previous mentors are more likely to identify students who will succeed in biomedical graduate research. Based on these results, we conclude that admissions committees should avoid over-reliance on any single component of the application and de-emphasize metrics that are minimally predictive of student productivity.
We recommend continual tracking of desired training outcomes combined with retrospective analysis of admissions practices to guide both application requirements and holistic application review.