Department of Physics, Auburn University, Auburn, Alabama.
Biophys J. 2023 Jul 25;122(14):2833-2840. doi: 10.1016/j.bpj.2023.01.042. Epub 2023 Feb 3.
Over a century ago, physicists began relying broadly on theoretical models to guide new experiments. Soon thereafter, chemists did the same. Now, biological research is entering a new era in which experiment and theory walk hand in hand. Novel software and specialized hardware have become essential for understanding experimental data and proposing new models. In fact, current petascale computing resources already allow researchers to reach unprecedented levels of simulation throughput, connecting in silico and in vitro experiments. Reduced costs and improved access have allowed a large number of research groups to adopt supercomputing resources and techniques. Here, we outline how large-scale computing has evolved to expand decades-old research programs, spark new research efforts, and continuously connect simulation and observation. For instance, multiple publicly and privately funded groups have dedicated extensive resources to developing artificial intelligence tools for computational biophysics, from accelerating quantum chemistry calculations to proposing protein structure models. Moreover, advances in computer hardware have accelerated data processing from single-molecule experimental observations and from simulations of chemical reactions occurring throughout entire cells. Together, these software and hardware advances have opened the way to exascale computing and to the first public exascale supercomputer, Frontier, inaugurated at Oak Ridge National Laboratory in 2022. Ultimately, the popularization and development of computational techniques, and the training of researchers to use them, will only accelerate the diversification of tools and learning resources for future generations.