Linkov Faina, Lovalekar Mita, LaPorte Ronald
Department of Epidemiology, Graduate School of Public Health, University of Pittsburgh, Pittsburgh, PA 15261, USA.
Croat Med J. 2007 Apr;48(2):249-55.
Aim: To examine the feasibility of using peer review for the quality control of online materials.
Methods: We analyzed inter-rater agreement on the quality of online epidemiology lectures from the Global Health Network Supercourse lecture library. Agreement among reviewers was assessed using kappa statistics and intraclass correlations. Seven expert reviewers examined and rated a random sample of 100 Supercourse lectures, and their reviews were compared with those of lay Supercourse reviewers.
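A minimal sketch of the kind of agreement analysis described above. The rating matrix is synthetic, and both the averaged pairwise Cohen's kappa and the ICC(1,1) form are assumptions chosen for illustration; the abstract does not specify which kappa variant or ICC model the authors used, nor does it publish the raw ratings.

```python
# Sketch of an inter-rater agreement analysis: 7 reviewers rating
# 100 lectures on a 1-5 scale. All data here are hypothetical.
import numpy as np
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(7, 100))  # rows = reviewers

# One common multi-rater approach (an assumption, not the paper's
# stated method): average Cohen's kappa over all reviewer pairs.
pairwise = [cohen_kappa_score(ratings[i], ratings[j])
            for i, j in combinations(range(7), 2)]
print(f"mean pairwise kappa: {np.mean(pairwise):.2f}")

# One-way intraclass correlation, ICC(1,1), from ANOVA mean squares:
# lectures are the targets, reviewers the raters.
k, n = ratings.shape                     # raters, targets
target_means = ratings.mean(axis=0)
grand_mean = ratings.mean()
ms_between = k * ((target_means - grand_mean) ** 2).sum() / (n - 1)
ms_within = ((ratings - target_means) ** 2).sum() / (n * (k - 1))
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1,1): {icc:.2f}")
```

With ratings this close to independent, both statistics land near zero; values below 0.4, as reported in the study, are conventionally read as poor-to-fair agreement.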
Results: Both expert and non-expert reviewers rated the lectures very highly, with a mean overall score of 4 out of 5. The kappa (κ) statistic and intraclass correlations indicated that inter-rater agreement among both experts and non-experts was surprisingly low (below 0.4).
Conclusion: To our knowledge, this is the first demonstration of poor inter-rater agreement for Internet lectures. Future studies should evaluate alternatives to the peer review system, especially for online materials.