Molecular Transformer unifies reaction prediction and retrosynthesis across pharma chemical space.

Author information

Cavendish Laboratory, University of Cambridge, Cambridge CB3 0HE, UK. aal44@cam.ac.uk.

Medicine Design, Pfizer Inc., Cambridge, MA 02139, USA.

Publication information

Chem Commun (Camb). 2019 Oct 8;55(81):12152-12155. doi: 10.1039/c9cc05122h.

Abstract

Predicting how a complex molecule reacts with different reagents, and how to synthesise complex molecules from simpler starting materials, are fundamental to organic chemistry. We show that an attention-based machine translation model - Molecular Transformer - tackles both reaction prediction and retrosynthesis by learning from the same dataset. Reagents, reactants and products are represented as SMILES text strings. For reaction prediction, the model "translates" the SMILES of reactants and reagents to product SMILES, and the converse for retrosynthesis. Moreover, a model trained on publicly available data is able to make accurate predictions on proprietary molecules extracted from pharma electronic lab notebooks, demonstrating generalisability across chemical space. We expect our versatile framework to be broadly applicable to problems such as reaction condition prediction, reagent prediction and yield prediction.
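The "translation" framing above can be made concrete with a short sketch. The code below is illustrative, not the authors' pipeline: the tokenizer regex is a commonly used pattern for SMILES in sequence models, and the function names, the example reaction (a Fischer esterification), and the `reactants>reagents>products` reaction-SMILES layout are assumptions for the sake of the example.

```python
import re

# Regex-style SMILES tokenizer of the kind typically paired with
# transformer models (the exact pattern is an assumption): multi-character
# atoms like Cl and Br, and bracketed atoms like [Na+], become one token.
SMILES_TOKEN = re.compile(
    r"(\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p"
    r"|\(|\)|\.|=|#|-|\+|\\|/|:|~|@|\?|>|\*|\$|%[0-9]{2}|[0-9])"
)

def tokenize(smiles: str) -> list[str]:
    """Split a SMILES string into tokens; the split is lossless."""
    tokens = SMILES_TOKEN.findall(smiles)
    assert "".join(tokens) == smiles, "pattern missed a character"
    return tokens

def make_pairs(rxn_smiles: str):
    """Build (source, target) text pairs for both tasks from one reaction
    written as 'reactants>reagents>products'."""
    reactants, reagents, products = rxn_smiles.split(">")
    # Forward prediction: reactants + reagents -> products
    forward = (f"{reactants}.{reagents}" if reagents else reactants, products)
    # Retrosynthesis: the same pairing, reversed
    retro = (products, reactants)
    return forward, retro

# Fischer esterification: acetic acid + ethanol -> ethyl acetate (H2SO4 cat.)
rxn = "CC(=O)O.CCO>OS(=O)(=O)O>CC(=O)OCC"
forward, retro = make_pairs(rxn)
# forward: ('CC(=O)O.CCO.OS(=O)(=O)O', 'CC(=O)OCC')
# retro:   ('CC(=O)OCC', 'CC(=O)O.CCO')
print(" ".join(tokenize(forward[1])))  # space-separated tokens for the model
```

The same dataset thus yields training pairs for both directions, which is the point the abstract makes: one architecture, two tasks, one source of supervision.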
