School of Data Science, University of Science and Technology of China, Hefei, Anhui, China.
Department of Statistics and Finance, School of Management, University of Science and Technology of China, Hefei, Anhui, China.
Comput Med Imaging Graph. 2023 Apr;105:102189. doi: 10.1016/j.compmedimag.2023.102189. Epub 2023 Jan 24.
Self-attention-based algorithms are attractive in digital pathology because of their interpretability, but they suffer from high computational complexity. This paper presents a novel, lightweight Attention-based Multiple Instance Mutation Learning (AMIML) model that enables small-scale attention operations for predicting gene mutations. Compared with the standard self-attention model, AMIML reduces the number of model parameters by approximately 70%. Using data for 24 clinically relevant genes from four cancer cohorts in TCGA studies (UCEC, BRCA, GBM, and KIRC), we compare AMIML with a standard self-attention model, five other deep learning models, and four traditional machine learning models. The results show that AMIML is highly robust and outperforms all the baseline algorithms for the vast majority of the tested genes. In contrast, the performance of the reference deep learning and machine learning models varies across genes, yielding suboptimal predictions for certain genes. Furthermore, with its flexible and interpretable attention-based pooling mechanism, AMIML can further zero in on and detect predictive image patches.
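The attention-based pooling the abstract refers to can be illustrated with a minimal numpy sketch of the standard attention-based MIL pooling operator (in the style of Ilse et al.; this is a generic illustration, not the paper's exact AMIML architecture). Each patch embedding in a slide-level bag receives a learned attention score, the scores are normalized with a softmax, and the bag embedding is their weighted average; the weights are what let the model highlight predictive image patches. All names, shapes, and parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_mil_pool(H, V, w):
    """Attention-based MIL pooling (generic sketch, not the exact AMIML model).

    H: (K, D) instance (patch) embeddings forming one bag.
    V: (L, D), w: (L,) learned attention parameters.
    Returns the (D,) bag embedding and the (K,) attention weights.
    """
    scores = np.tanh(H @ V.T) @ w        # one scalar score per instance, shape (K,)
    e = np.exp(scores - scores.max())    # numerically stable softmax
    a = e / e.sum()                      # attention weights, non-negative, sum to 1
    z = a @ H                            # weighted-average bag embedding, shape (D,)
    return z, a

# Hypothetical sizes: 5 patches, 8-dim embeddings, 4-dim attention hidden layer.
K, D, L = 5, 8, 4
H = rng.normal(size=(K, D))
V = rng.normal(size=(L, D))
w = rng.normal(size=L)
z, a = attention_mil_pool(H, V, w)
```

Because the weights `a` sum to one and are tied to individual patches, inspecting the highest-weight patches gives the interpretability that makes this pooling attractive for digital pathology.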