Department of Neurology, Hannover Medical School, Hannover, Germany.
Neuroimage. 2018 May 15;172:775-785. doi: 10.1016/j.neuroimage.2018.01.005. Epub 2018 Jan 10.
We analyzed factors that may hamper the advancement of computational cognitive neuroscience (CCN). These factors include a particular statistical mindset, which paves the way for the dominance of statistical power theory and a preoccupation with statistical replicability in the behavioral and neural sciences. An exclusive statistical concern with sampling error comes at the cost of an inadequate treatment of measurement error. We contrasted the manipulation of data quantity (sampling error, varied through the number of subjects) with the manipulation of data quality (measurement error, varied through the number of data points per subject) in a simulated Bayesian model identifiability study. The results were clear-cut: across all signal-to-noise ratios, varying the number of subjects was inconsequential, whereas the number of data points per subject had massive effects on model identifiability. These results emphasize data quality over data quantity, and they call for the integration of statistics and measurement theory.
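To make the contrast concrete, the following is a minimal Python sketch, not the authors' Bayesian identifiability simulation, that compares adding subjects (data quantity) with adding trials per subject (data quality) at several signal-to-noise ratios. The Gaussian mean-rate model, the function name recovery_error, and the specific SNR values and sample sizes are illustrative assumptions.

```python
# Minimal sketch (not the authors' simulation): per-subject parameter recovery
# under an assumed Gaussian mean model. We compare two designs of equal total
# size: many subjects with few trials each vs. few subjects with many trials.
import numpy as np

rng = np.random.default_rng(0)

def recovery_error(n_subjects, n_trials, snr):
    """Mean absolute error of per-subject parameter estimates (assumed setup)."""
    theta = rng.normal(0.0, 1.0, size=n_subjects)          # true subject parameters (sd = 1)
    noise_sd = 1.0 / snr                                    # noise scaled by signal-to-noise ratio
    data = theta[:, None] + rng.normal(0.0, noise_sd, size=(n_subjects, n_trials))
    theta_hat = data.mean(axis=1)                           # per-subject estimate
    return np.mean(np.abs(theta_hat - theta))

for snr in (0.5, 1.0, 2.0):
    more_subjects = recovery_error(n_subjects=200, n_trials=20, snr=snr)
    more_trials = recovery_error(n_subjects=20, n_trials=200, snr=snr)
    print(f"SNR {snr}: 200 subjects x 20 trials -> MAE {more_subjects:.3f}; "
          f"20 subjects x 200 trials -> MAE {more_trials:.3f}")
```

In this toy setting, the error of each subject's estimate shrinks with the number of trials per subject but is unaffected by how many other subjects were measured, which mirrors the abstract's point that data quality, not data quantity, drives identifiability of subject-level parameters.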