Machine Learning in Science, Excellence Cluster Machine Learning, University of Tübingen, Tübingen, Germany.
Technical University of Munich, Munich, Germany.
eLife. 2022 Jul 27;11:e77220. doi: 10.7554/eLife.77220.
Inferring parameters of computational models that capture experimental data is a central task in cognitive neuroscience. Bayesian statistical inference methods usually require the ability to evaluate the likelihood of the model; however, for many models of interest in cognitive neuroscience, the associated likelihoods cannot be computed efficiently. Simulation-based inference (SBI) offers a solution to this problem by only requiring access to simulations produced by the model. Previously, Fengler et al. introduced likelihood approximation networks (LANs; Fengler et al., 2021), which make it possible to apply SBI to models of decision-making but require billions of simulations for training. Here, we provide a new SBI method that is substantially more simulation efficient. Our approach, mixed neural likelihood estimation (MNLE), trains neural density estimators on model simulations to emulate the simulator, and is designed to capture both the continuous (e.g., reaction times) and discrete (choices) data of decision-making models. The likelihoods of the emulator can then be used to perform Bayesian parameter inference on experimental data with standard approximate inference methods like Markov chain Monte Carlo sampling. We demonstrate MNLE on two variants of the drift-diffusion model and show that it is substantially more efficient than LANs: MNLE achieves similar likelihood accuracy with six orders of magnitude fewer training simulations and is significantly more accurate than LANs when both are trained with the same budget. Our approach enables researchers to perform SBI on custom-tailored models of decision-making, leading to fast iteration of model design for scientific discovery.
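The core idea of the mixed likelihood can be sketched in a few lines of PyTorch. This is a minimal illustration under stated assumptions, not the authors' implementation: the Bernoulli choice head, the log-normal reaction-time head (a stand-in for the conditional normalizing flow used in the paper), and the simple training loop are assumptions chosen for brevity. The emulator factorizes the likelihood as p(choice, rt | theta) = p(choice | theta) * p(rt | choice, theta).

```python
import torch
import torch.nn as nn


class MixedLikelihoodEmulator(nn.Module):
    """Sketch of a mixed neural likelihood emulator for decision-making data.

    Assumption: choices are binary (float tensor of 0s and 1s) and reaction
    times are positive scalars; the paper uses a normalizing flow for the
    continuous part, a log-normal head is used here only for illustration.
    """

    def __init__(self, dim_theta, hidden=64):
        super().__init__()
        # Discrete part: Bernoulli logit for the choice given parameters theta.
        self.choice_net = nn.Sequential(
            nn.Linear(dim_theta, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )
        # Continuous part: log-normal over reaction times, conditioned on
        # the parameters and the choice.
        self.rt_net = nn.Sequential(
            nn.Linear(dim_theta + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # mean and log-std of log(rt)
        )

    def log_prob(self, theta, choice, rt):
        # log p(choice | theta)
        logits = self.choice_net(theta).squeeze(-1)
        log_p_choice = -nn.functional.binary_cross_entropy_with_logits(
            logits, choice, reduction="none"
        )
        # log p(rt | choice, theta)
        params = self.rt_net(torch.cat([theta, choice.unsqueeze(-1)], dim=-1))
        mean, log_std = params.chunk(2, dim=-1)
        log_p_rt = torch.distributions.LogNormal(
            mean.squeeze(-1), log_std.exp().squeeze(-1)
        ).log_prob(rt)
        return log_p_choice + log_p_rt


def train(emulator, theta, choice, rt, epochs=100, lr=1e-3):
    """Fit the emulator to simulated (theta, choice, rt) triples by
    maximizing its log-likelihood of the simulated data."""
    opt = torch.optim.Adam(emulator.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = -emulator.log_prob(theta, choice, rt).mean()
        loss.backward()
        opt.step()
    return emulator
```

Once trained, emulator.log_prob summed over trials plus a prior log-density gives an unnormalized log-posterior that can be passed to a standard MCMC sampler; the MNLE method itself is also provided as part of the open-source sbi toolbox.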