

ATM-TCR: TCR-Epitope Binding Affinity Prediction Using a Multi-Head Self-Attention Model.

Affiliations

School of Computing and Augmented Intelligence, Arizona State University, Tempe, AZ, United States.

Biodesign Institute, Arizona State University, Tempe, AZ, United States.

Publication Information

Front Immunol. 2022 Jul 6;13:893247. doi: 10.3389/fimmu.2022.893247. eCollection 2022.

Abstract

TCR-epitope pair binding is the key component of T cell regulation. The ability to predict whether a given pair binds is fundamental to understanding the underlying biology of the binding mechanism and to developing T-cell-mediated immunotherapies. The advent of large-scale public databases containing TCR-epitope binding pairs has enabled the recent development of computational prediction methods for TCR-epitope binding. However, the number of epitopes reported alongside binding TCRs remains far too small, resulting in poor out-of-sample performance on unseen epitopes. To address this issue, we present ATM-TCR, a model that uses a multi-head self-attention mechanism to capture biological contextual information and improve generalization performance. Additionally, we present a novel application of our model's attention map to improve out-of-sample performance, demonstrated on recent SARS-CoV-2 data.


Figure 1 (graphical abstract): https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ded7/9299376/0a70b22641ab/fimmu-13-893247-g001.jpg
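
To make the architecture described in the abstract concrete, here is a minimal PyTorch sketch of the general idea: two multi-head self-attention encoders process the TCR and epitope amino-acid sequences separately, and a small classifier scores the concatenated representations for binding. This is an illustration only, not the authors' ATM-TCR implementation; the vocabulary size, model dimensions, pooling scheme, and all hyperparameters are assumptions.

```python
# Minimal sketch (not the authors' ATM-TCR code) of a self-attention
# model for TCR-epitope binding prediction. All sizes are assumed.
import torch
import torch.nn as nn

AA_VOCAB = 21  # 20 amino acids + padding token (assumed vocabulary)

class SeqEncoder(nn.Module):
    """Embeds an amino-acid sequence and applies multi-head self-attention."""
    def __init__(self, d_model=64, n_heads=4, max_len=30):
        super().__init__()
        self.embed = nn.Embedding(AA_VOCAB, d_model, padding_idx=0)
        self.pos = nn.Parameter(torch.zeros(max_len, d_model))  # learned positions
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, tokens):                      # tokens: (B, L) int64
        x = self.embed(tokens) + self.pos[: tokens.size(1)]
        pad_mask = tokens.eq(0)                     # True where padded
        h, attn_w = self.attn(x, x, x, key_padding_mask=pad_mask)
        # attn_w (B, L, L) is the head-averaged attention map; the paper
        # analyzes such maps to interpret and improve out-of-sample predictions.
        h = self.norm(x + h)                        # residual connection
        return h.mean(dim=1)                        # (B, d_model) pooled

class BindingModel(nn.Module):
    """Scores whether a TCR sequence binds a given epitope."""
    def __init__(self, d_model=64):
        super().__init__()
        self.tcr_enc = SeqEncoder(d_model)
        self.epi_enc = SeqEncoder(d_model)
        self.head = nn.Sequential(
            nn.Linear(2 * d_model, d_model), nn.ReLU(),
            nn.Linear(d_model, 1),                  # binding logit
        )

    def forward(self, tcr, epitope):
        z = torch.cat([self.tcr_enc(tcr), self.epi_enc(epitope)], dim=-1)
        return self.head(z).squeeze(-1)             # (B,) logits

# Usage on random token ids (batch of 2, padded lengths 20 and 10):
model = BindingModel()
logits = model(torch.randint(1, 21, (2, 20)), torch.randint(1, 21, (2, 10)))
print(torch.sigmoid(logits))                        # binding probabilities
```

Trained with a binary cross-entropy loss on known binding and non-binding pairs, a model of this shape can score unseen TCR-epitope combinations; the attention weights it produces are the kind of attention map the paper applies to SARS-CoV-2 data.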
