School of Computer and Computational Science, Zhejiang University City College, Hangzhou, 310011, China.
School of Computer Science and Technology, Zhejiang University, Hangzhou, 310013, China.
Comput Methods Programs Biomed. 2022 Jun;221:106925. doi: 10.1016/j.cmpb.2022.106925. Epub 2022 May 30.
Because the appearance, shape, and location of brain tumors vary greatly among patients, brain tumor segmentation (BTS) is extremely challenging. Recently, many studies have used attention mechanisms to address this problem; these can be roughly divided into two categories: spatial attention based on convolution (with or without channel attention) and self-attention. Due to the limitations of convolution operations, convolution-based spatial attention cannot learn global dependencies well, resulting in poor performance in BTS. A simple improvement is to substitute it directly with self-attention, which excels at learning global dependencies. However, because self-attention is demanding on GPU memory, this simple substitution prevents the new attention mechanism from being applied to high-resolution, low-level feature maps, which contain considerable geometric information that is also important for improving the performance of attention mechanisms in BTS.
In this paper, we propose a hierarchical fully connected module, named H-FC, to learn global dependencies. H-FC hierarchically learns local dependencies at different feature-map scales through fully connected layers, and then combines these local dependencies as an approximation of the global dependencies. H-FC requires very little GPU memory and can easily replace spatial attention modules based on convolution operations, such as Attention Gate and SAM (in CBAM), to improve the performance of attention mechanisms in BTS.
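The abstract does not give implementation details, but the hierarchical idea can be sketched roughly as follows. This is a minimal, hypothetical NumPy illustration, not the authors' H-FC: it mixes positions within small regions via a shared fully connected weight (local dependencies), then repeats the same mixing at progressively coarser scales and sums the upsampled results as a cheap approximation of global dependencies. The function names, the 2D single-channel setting, random weights, and the average-pool/nearest-neighbour choices are all assumptions made for illustration.

```python
import numpy as np

def mix_regions(x, r, w):
    """Mix positions within each non-overlapping r x r region using a
    shared fully connected weight matrix w of shape (r*r, r*r)."""
    h, wd = x.shape
    out = np.empty_like(x)
    for i in range(0, h, r):
        for j in range(0, wd, r):
            patch = x[i:i + r, j:j + r].reshape(-1)      # flatten local positions
            out[i:i + r, j:j + r] = (patch @ w).reshape(r, r)
    return out

def avg_pool2(x):
    """Downsample by 2 with average pooling."""
    h, wd = x.shape
    return x.reshape(h // 2, 2, wd // 2, 2).mean(axis=(1, 3))

def upsample2(x, times):
    """Nearest-neighbour upsampling by a factor of 2**times."""
    for _ in range(times):
        x = np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)
    return x

def h_fc_sketch(x, region_size=2, steps=3, rng=None):
    """Hypothetical sketch of hierarchical FC mixing: local dependencies are
    learned at several scales, and the results are summed as an approximation
    of global dependencies. Weights here are random stand-ins for learned ones."""
    rng = rng or np.random.default_rng(0)
    out = np.zeros_like(x)
    cur = x
    for s in range(steps):
        # one small shared FC weight per scale: (r^2 x r^2) parameters,
        # instead of the (H*W)^2 pairwise interactions of full self-attention
        w = rng.standard_normal((region_size ** 2, region_size ** 2)) / region_size
        out += upsample2(mix_regions(cur, region_size, w), s)
        cur = avg_pool2(cur)  # the next step mixes at half resolution
    return out
```

Note the memory argument this illustrates: on a 16x16 map, full self-attention implies 256x256 pairwise interactions, while each scale here needs only a 4x4 weight, which is why a hierarchical scheme can reach high-resolution low-level feature maps.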
Extensive comparative experiments show that, in BTS, H-FC outperforms Attention Gate and SAM (in CBAM), which lack the ability to learn global dependencies, with improvements on most metrics and the largest gains in Hausdorff Distance. Comparing the computation and parameter counts of the model before and after adding H-FC proves that H-FC is lightweight.
In this paper, we propose a novel H-FC module to learn global dependencies. We demonstrate the effectiveness of H-FC through experiments on the BraTS2020 dataset, mainly exploring the influence of the region size and the number of steps on its performance. We also confirm that the global dependencies of low-level feature maps are important to BTS. Finally, we show that H-FC is lightweight through a time and space complexity analysis and the experimental results.