

Graph neural processes for molecules: an evaluation on docking scores and strategies to improve generalization.

Author information

García-Ortegón Miguel, Seal Srijit, Rasmussen Carl, Bender Andreas, Bacallado Sergio

Affiliations

Statistical Laboratory, University of Cambridge, Wilberforce Rd, Cambridge, CB3 0WA, UK.

Department of Engineering, University of Cambridge, Trumpington St, Cambridge, CB2 1PZ, UK.

Publication information

J Cheminform. 2024 Oct 23;16(1):115. doi: 10.1186/s13321-024-00904-2.

Abstract

Neural processes (NPs) are models for meta-learning which output uncertainty estimates. So far, most studies of NPs have focused on low-dimensional datasets of highly correlated tasks. While these homogeneous datasets are useful for benchmarking, they may not be representative of realistic transfer learning. In particular, applications in scientific research may prove especially challenging due to the potential novelty of meta-testing tasks. Molecular property prediction is one such research area, characterized by sparse datasets of many functions on a shared molecular space. In this paper, we study the application of graph NPs to molecular property prediction with DOCKSTRING, a diverse dataset of docking scores. Graph NPs show competitive performance in few-shot learning tasks relative to supervised learning baselines common in chemoinformatics, as well as to alternative techniques for transfer learning and meta-learning. In order to increase meta-generalization to divergent test functions, we propose fine-tuning strategies that adapt the parameters of NPs. We find that adaptation can substantially increase NPs' regression performance while maintaining good calibration of uncertainty estimates. Finally, we present a Bayesian optimization experiment which showcases the potential advantages of NPs over Gaussian processes in iterative screening. Overall, our results suggest that NPs on molecular graphs hold great potential for molecular property prediction in the low-data setting.

Scientific contribution

Neural processes are a family of meta-learning algorithms which deal with data scarcity by transferring information across tasks and making probabilistic predictions. We evaluate their performance on molecular regression and optimization tasks using docking scores, finding them to outperform classical single-task and transfer-learning models. We examine the issue of generalization to divergent test tasks, a general concern for meta-learning algorithms in science, and propose strategies to alleviate it.
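To make the neural-process setup concrete, below is a minimal sketch of a conditional neural process regressor in PyTorch. It is illustrative only: the paper uses graph-based molecular encoders and its own architecture and hyperparameters, whereas here the molecular featurization is abstracted into a fixed-size input vector and the layer sizes are arbitrary assumptions.

```python
# Minimal conditional neural process (CNP) sketch.
# Assumption: molecules are already featurized into fixed-size vectors x;
# the paper's graph encoder is not reproduced here.
import torch
import torch.nn as nn


class ConditionalNeuralProcess(nn.Module):
    """Encode a context set, aggregate it, and decode a Gaussian per target point."""

    def __init__(self, x_dim, hidden_dim=128, repr_dim=128):
        super().__init__()
        # Encoder maps each context pair (x_c, y_c) to a representation r_c.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + 1, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, repr_dim),
        )
        # Decoder maps (aggregated context representation, target x) to mean and std.
        self.decoder = nn.Sequential(
            nn.Linear(repr_dim + x_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 2),
        )

    def forward(self, x_context, y_context, x_target):
        # Permutation-invariant aggregation of the context set (mean pooling).
        r = self.encoder(torch.cat([x_context, y_context], dim=-1)).mean(dim=0)
        r = r.expand(x_target.shape[0], -1)
        out = self.decoder(torch.cat([r, x_target], dim=-1))
        mean, raw_std = out[:, :1], out[:, 1:]
        std = 0.01 + torch.nn.functional.softplus(raw_std)  # keep std positive
        return mean, std


def task_loss(model, x_context, y_context, x_target, y_target):
    """Negative predictive log-likelihood for one (context, target) split of a task."""
    mean, std = model(x_context, y_context, x_target)
    return -torch.distributions.Normal(mean, std).log_prob(y_target).mean()
```

In this style of meta-training, each gradient step would sample one docking-score task and a random context/target split; at meta-test time the model conditions on a small context set of measured molecules and predicts means and uncertainties for the rest, which is the few-shot setting the abstract describes.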


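The abstract also mentions a Bayesian optimization experiment in which the NP's predictive uncertainty drives iterative screening. The sketch below shows one hedged way such an acquisition step could look with the CNP above; the lower-confidence-bound criterion, `kappa`, and `batch_size` are illustrative assumptions and not the paper's protocol, and it assumes lower docking scores are better.

```python
import torch


def select_next_batch(model, x_context, y_context, x_candidates,
                      batch_size=10, kappa=1.0):
    """Rank candidate molecules by a lower confidence bound (minimizing docking scores)."""
    with torch.no_grad():
        mean, std = model(x_context, y_context, x_candidates)
    # Favour candidates with low predicted score and high predictive uncertainty.
    lcb = mean.squeeze(-1) - kappa * std.squeeze(-1)
    return torch.topk(-lcb, k=batch_size).indices  # indices of the most promising molecules
```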

Graphical abstract: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d20c/11515514/47e0df8cd06b/13321_2024_904_Fig1_HTML.jpg
