
Suppr 超能文献




ERT-GFAN: A multimodal drug-target interaction prediction model based on molecular biology and knowledge-enhanced attention mechanism.

Affiliations

College of Computer Science and Technology, Qingdao University, Qingdao, 266071, China.

College of Computer Science and Technology, Qingdao University, Qingdao, 266071, China; School of Automation, Qingdao University, Qingdao, 266071, China.

Publication Information

Comput Biol Med. 2024 Sep;180:109012. doi: 10.1016/j.compbiomed.2024.109012. Epub 2024 Aug 16.

DOI: 10.1016/j.compbiomed.2024.109012
PMID: 39153394
Abstract

In drug discovery, precisely identifying drug-target interactions is crucial for finding new drugs and understanding drug mechanisms. Evolving drug/target heterogeneous data presents challenges in obtaining multimodal representation in drug-target prediction(DTI). To deal with this, we propose 'ERT-GFAN', a multimodal drug-target interaction prediction model inspired by molecular biology. Firstly, it integrates bio-inspired principles to obtain structure feature of drugs and targets using Extended Connectivity Fingerprints(ECFP). Simultaneously, the knowledge graph embedding model RotatE is employed to discover the interaction feature of drug-target pairs. Subsequently, Transformer is utilized to refine the contextual neighborhood features from the obtained structure feature and interaction features, and multi-modal high-dimensional fusion features of the three-modal information constructed. Finally, the final DTI prediction results are outputted by integrating the multimodal fusion features into a graphical high-dimensional fusion feature attention network (GFAN) using our innovative multimodal high-dimensional fusion feature attention. This multimodal approach offers a comprehensive understanding of drug-target interactions, addressing challenges in complex knowledge graphs. By combining structure feature, interaction feature, and contextual neighborhood features, 'ERT-GFAN' excels in predicting DTI. Empirical evaluations on three datasets demonstrate our method's superior performance, with AUC of 0.9739, 0.9862, and 0.9667, AUPR of 0.9598, 0.9789, and 0.9750, and Mean Reciprocal Rank(MRR) of 0.7386, 0.7035, and 0.7133. Ablation studies show over a 5% improvement in predictive performance compared to baseline unimodal and bimodal models. These results, along with detailed case studies, highlight the efficacy and robustness of our approach.
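The abstract names RotatE as the knowledge-graph embedding model used to discover interaction features of drug-target pairs. As background only (not the paper's own code), RotatE represents each relation as a unit-modulus rotation in complex space and scores a triple by the distance between the rotated head embedding and the tail embedding; a minimal sketch in plain Python:

```python
import math

def rotate_score(head, relation_phases, tail):
    """Illustrative RotatE scoring: each relation acts as a rotation
    e^{i*phase} in complex space, and score(h, r, t) = -||h o r - t||_1
    (higher means the triple is more plausible)."""
    dist = 0.0
    for h, phase, t in zip(head, relation_phases, tail):
        r = complex(math.cos(phase), math.sin(phase))  # unit-modulus rotation
        dist += abs(h * r - t)  # L1 distance over complex coordinates
    return -dist

# A tail embedding that equals the rotated head gets the best score, ~0.
h = [complex(1, 0), complex(0, 1)]
phases = [math.pi / 2, math.pi]      # 90-degree and 180-degree rotations
t = [complex(0, 1), complex(0, -1)]
print(rotate_score(h, phases, t))    # approximately 0.0
```

The embedding dimension, negative sampling, and training loss used in ERT-GFAN are not specified in this record; the sketch shows only the scoring geometry.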

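The reported Mean Reciprocal Rank values (0.7386, 0.7035, 0.7133) follow the standard definition: the average over queries of the reciprocal of the rank at which the first correct target appears. A small illustrative helper (not from the paper):

```python
def mean_reciprocal_rank(ranks):
    """MRR over the 1-based ranks of the first correct hit per query."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# Three queries whose true targets were ranked 1st, 2nd and 4th:
print(mean_reciprocal_rank([1, 2, 4]))  # (1 + 0.5 + 0.25) / 3 = 0.5833...
```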

Similar Articles

1
ERT-GFAN: A multimodal drug-target interaction prediction model based on molecular biology and knowledge-enhanced attention mechanism.
Comput Biol Med. 2024 Sep;180:109012. doi: 10.1016/j.compbiomed.2024.109012. Epub 2024 Aug 16.
2
DDINet: Drug-drug interaction prediction network based on multi-molecular fingerprint features and multi-head attention centered weighted autoencoder.
J Bioinform Comput Biol. 2025 Feb;23(1):2550003. doi: 10.1142/S0219720025500039.
3
Trajectory-Ordered Objectives for Self-Supervised Representation Learning of Temporal Healthcare Data Using Transformers: Model Development and Evaluation Study.
JMIR Med Inform. 2025 Jun 4;13:e68138. doi: 10.2196/68138.
4
The clinical effectiveness and cost-effectiveness of enzyme replacement therapy for Gaucher's disease: a systematic review.
Health Technol Assess. 2006 Jul;10(24):iii-iv, ix-136. doi: 10.3310/hta10240.
5
Improving drug-target interaction prediction through dual-modality fusion with InteractNet.
J Bioinform Comput Biol. 2024 Oct;22(5):2450024. doi: 10.1142/S0219720024500240. Epub 2024 Nov 11.
6
PfgPDI: Pocket feature-enabled graph neural network for protein-drug interaction prediction.
J Bioinform Comput Biol. 2024 Apr;22(2):2450004. doi: 10.1142/S0219720024500045. Epub 2024 May 27.
7
Systemic pharmacological treatments for chronic plaque psoriasis: a network meta-analysis.
Cochrane Database Syst Rev. 2021 Apr 19;4(4):CD011535. doi: 10.1002/14651858.CD011535.pub4.
8
ASAP-DTA: Predicting drug-target binding affinity with adaptive structure aware networks.
J Bioinform Comput Biol. 2024 Dec;22(6):2450028. doi: 10.1142/S0219720024500288. Epub 2025 Feb 1.
9
TLTNet: A novel transscale cascade layered transformer network for enhanced retinal blood vessel segmentation.
Comput Biol Med. 2024 Aug;178:108773. doi: 10.1016/j.compbiomed.2024.108773. Epub 2024 Jun 25.
10
Integration of autoencoder and graph convolutional network for predicting breast cancer drug response.
J Bioinform Comput Biol. 2024 Jun;22(3):2450013. doi: 10.1142/S0219720024500136.

Cited By

1
Artificial Intelligence and Network Medicine: Path to Precision Medicine.
NEJM AI. 2025 Sep;2(9). doi: 10.1056/aira2401229. Epub 2025 Aug 28.