Ladd Alexander, Kim Kyung Geun, Balewski Jan, Bouchard Kristofer, Ben-Shalom Roy
Electrical Engineering and Computer Sciences, University of California, Berkeley, Berkeley, CA, United States.
NERSC, Lawrence Berkeley National Laboratory, Berkeley, CA, United States.
Front Neuroinform. 2022 Jun 17;16:882552. doi: 10.3389/fninf.2022.882552. eCollection 2022.
Single neuron models are fundamental for computational modeling of the brain's neuronal networks and for understanding how ion channel dynamics mediate neural function. A challenge in defining such models is determining biophysically realistic channel distributions. Here, we present an efficient, highly parallel evolutionary algorithm for developing such models. The algorithm uses CPUs and GPUs concurrently to simulate neuron membrane potentials and evaluate them with respect to multiple stimuli. We demonstrate that the cost of the fitting procedure scales logarithmically with the number of stimuli. The algorithm outperforms the typically used CPU-based evolutionary algorithm by a factor of 10 on a series of scaling benchmarks. We report observed performance bottlenecks and propose mitigation strategies. Finally, we discuss the potential of this method for efficient simulation and evaluation of electrophysiological waveforms.
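The fitting procedure described in the abstract belongs to the general class of evolutionary algorithms: a population of candidate channel-distribution parameter sets is repeatedly mutated and selected against simulated membrane responses. The sketch below illustrates that class on a toy decay-curve "model"; all names, population sizes, and the toy simulation itself are illustrative assumptions, not the paper's GPU-accelerated NEURON-based implementation.

```python
# Minimal (mu + lambda) evolutionary-algorithm sketch. The toy_model
# function is a stand-in assumption for a neuron membrane-potential
# simulation; the paper's method evaluates candidates on GPUs against
# multiple electrophysiological stimuli instead.
import math
import random

def toy_model(params, t):
    """Two 'conductance' parameters shaping a double-exponential decay."""
    g1, g2 = params
    return g1 * math.exp(-t / 5.0) + g2 * math.exp(-t / 20.0)

def fitness(params, target_params, ts):
    """Sum-of-squares error against the target trace (lower is better)."""
    return sum((toy_model(params, t) - toy_model(target_params, t)) ** 2
               for t in ts)

def evolve(target_params, generations=60, mu=8, lam=32, sigma=0.3, seed=0):
    rng = random.Random(seed)
    ts = [float(i) for i in range(50)]  # sample times for the traces
    # Random initial population of parameter pairs
    pop = [[rng.uniform(0.0, 2.0), rng.uniform(0.0, 2.0)] for _ in range(mu)]
    for _ in range(generations):
        # Generate lam offspring by Gaussian mutation of random parents
        offspring = [[max(0.0, p + rng.gauss(0.0, sigma))
                      for p in rng.choice(pop)]
                     for _ in range(lam)]
        # (mu + lambda) selection: keep the best of parents plus offspring,
        # so the best individual is never lost (elitism)
        pop = sorted(pop + offspring,
                     key=lambda p: fitness(p, target_params, ts))[:mu]
    best = pop[0]
    return best, fitness(best, target_params, ts)

best, err = evolve(target_params=[1.0, 0.5])
```

In the paper's setting, the expensive step is the fitness evaluation (simulating membrane potentials for every candidate and every stimulus); that step is what parallelizes naturally across GPUs, while the mutation and selection bookkeeping remains on CPUs.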