College of Biological and Chemical Engineering, Zhejiang University of Science and Technology, Hangzhou, 310023, China.
School of Sciences, Zhejiang University of Science and Technology, Hangzhou, 310023, China.
Sci Rep. 2024 Sep 27;14(1):22308. doi: 10.1038/s41598-024-73356-1.
Single-cell RNA sequencing (scRNA-seq) is a key technology for investigating cell development and analysing cell diversity across various diseases. However, the high dimensionality and extreme sparsity of scRNA-seq data pose major challenges for accurate cell-type annotation. To address this, we developed a new cell-type annotation model called scGAA (general gated axial-attention model for accurate cell-type annotation of scRNA-seq). Based on the transformer framework, the model decomposes the traditional self-attention mechanism into horizontal and vertical attention, considerably improving computational efficiency. This axial-attention mechanism processes high-dimensional data more efficiently while maintaining reasonable model complexity. Additionally, a gated unit was integrated into the model to better capture relationships between genes, which is crucial for accurate cell-type annotation. The results show that our improved transformer model is a promising tool for practical applications. This methodological innovation improved model performance and offers new insights into analytical tools for scRNA-seq data.
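The axial decomposition described above can be sketched in a few lines: instead of one self-attention pass over all H×W positions of a 2-D feature map (cost O((HW)²)), attention is applied first along rows and then along columns (cost O(HW·(H+W))), and a sigmoid gate blends the attended signal with the input. This is a minimal single-head NumPy illustration, not the authors' implementation; the gate parameterisation `w_gate` and the omission of query/key/value projections are simplifying assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(x):
    # scaled dot-product self-attention over the second-to-last axis,
    # batched over all leading axes; x has shape (..., L, d)
    scores = x @ x.swapaxes(-1, -2) / np.sqrt(x.shape[-1])
    return softmax(scores, axis=-1) @ x

def gated_axial_attention(x, w_gate):
    # x: (H, W, d) feature map; w_gate: (d, d) gate weights (hypothetical form)
    # horizontal pass: each of the H rows attends over its W positions
    h = attention(x)
    # vertical pass: transpose so each of the W columns attends over its H positions
    v = attention(h.swapaxes(0, 1)).swapaxes(0, 1)
    # sigmoid gate blends the attended signal with the original input
    g = 1.0 / (1.0 + np.exp(-(x @ w_gate)))
    return g * v + (1.0 - g) * x

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 6, 8))        # toy 4x6 map with 8 features
w_gate = 0.1 * rng.standard_normal((8, 8))
out = gated_axial_attention(x, w_gate)    # same shape as the input
```

In the scRNA-seq setting, the two axes would correspond to the gene and embedding dimensions of the tokenised expression matrix; the point of the sketch is only the row/column factorisation and the gated residual blend.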