Department of Sociology, University of Munich (LMU), Munich, Germany.
PLoS One. 2023 Aug 7;18(8):e0289380. doi: 10.1371/journal.pone.0289380. eCollection 2023.
Transparency and peer control are cornerstones of good scientific practice and entail the replication and reproduction of findings. The feasibility of replications, however, hinges on the premise that original researchers make their data and research code publicly available. This applies in particular to large-N observational studies, where analysis code is complex and may involve several ambiguous analytical decisions. To investigate which specific factors influence researchers' code sharing behavior upon request, we emailed code requests to 1,206 authors who published research articles based on data from the European Social Survey between 2015 and 2020. In this preregistered multifactorial field experiment, we randomly varied three aspects of our code request's wording in a 2×4×2 factorial design: the overall framing of our request (enhancement of social science research, response to the replication crisis), the appeal given for why researchers should share their code (FAIR principles, academic altruism, prospect of citation, no information), and the perceived effort associated with code sharing (no code cleaning required, no information). Overall, 37.5% of successfully contacted authors supplied their analysis code. Of our experimental treatments, only framing affected researchers' code sharing behavior, though in the opposite direction to what we expected: scientists who received the negative wording alluding to the replication crisis were more likely to share their research code. Taken together, our results highlight that the availability of research code will hardly be enhanced by small-scale individual interventions but instead requires large-scale institutional norms.
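For readers unfamiliar with multifactorial designs, the sketch below illustrates how contacted authors could be randomly assigned to the 2×4×2 = 16 treatment cells described above. This is a minimal Python illustration of simple randomization, not the authors' actual assignment procedure; the identifiers, seed, and use of unblocked random assignment are assumptions for the example (field experiments often use blocked or balanced assignment instead).

```python
import itertools
import random

# The three experimental factors described in the abstract (2 x 4 x 2 = 16 cells).
FRAMINGS = ["enhancement of social science research", "response to replication crisis"]
APPEALS = ["FAIR principles", "academic altruism", "prospect of citation", "no information"]
EFFORT_CUES = ["no code cleaning required", "no information"]

# All 16 treatment combinations of the factorial design.
CELLS = list(itertools.product(FRAMINGS, APPEALS, EFFORT_CUES))


def assign_conditions(author_ids, seed=42):
    """Randomly assign each contacted author to one cell of the 2x4x2 design.

    Simple (unblocked) randomization; a hypothetical stand-in for whatever
    assignment scheme the study actually used.
    """
    rng = random.Random(seed)
    return {author: rng.choice(CELLS) for author in author_ids}


if __name__ == "__main__":
    # 1,206 contacted authors, as reported in the abstract (IDs are hypothetical).
    assignment = assign_conditions([f"author_{i:04d}" for i in range(1206)])
    # Inspect the treatment wording drawn for one author.
    print(assignment["author_0000"])
```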