College of Information Science and Engineering, Xinjiang University, Urumqi, 830046, China.
Sci Rep. 2023 Apr 18;13(1):6342. doi: 10.1038/s41598-023-32813-z.
Medical image segmentation provides effective methods for accurate and robust organ segmentation, lesion detection, and classification. Medical images have fixed structures, simple semantics, and diverse details, so fusing rich multi-scale features can improve segmentation accuracy. Because the density of diseased tissue may be comparable to that of the surrounding normal tissue, both global and local information are critical to segmentation quality. Considering the importance of multi-scale, global, and local information, we propose in this paper the dynamic hierarchical multi-scale fusion network with axial MLP (multilayer perceptron), DHMF-MLP, which integrates the proposed hierarchical multi-scale fusion (HMSF) module. Specifically, HMSF not only reduces the loss of detail information by integrating the features from each encoder stage but also provides different receptive fields, thereby improving segmentation of small lesions and multi-lesion regions. Within HMSF, we propose the adaptive attention mechanism (ASAM) to adaptively resolve the semantic conflicts that arise during fusion, and we introduce Axial-MLP to improve the global modeling capability of the network. Extensive experiments on public datasets confirm the excellent performance of the proposed DHMF-MLP. In particular, on the BUSI, ISIC 2018, and GlaS datasets, IoU reaches 70.65%, 83.46%, and 87.04%, respectively.
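The global modeling idea behind an axial MLP is to mix features along one spatial axis at a time, so every position can attend to its entire row and column at O(H + W) cost rather than mixing all H × W positions jointly. The following is a minimal NumPy sketch of that axis-wise mixing; the ReLU activation, hidden width, and residual connection are illustrative assumptions, not the paper's exact Axial-MLP design:

```python
import numpy as np

def mlp(x, w1, b1, w2, b2):
    # Two-layer perceptron applied along the last axis (ReLU for simplicity).
    h = np.maximum(x @ w1 + b1, 0.0)
    return h @ w2 + b2

def axial_mlp(x, params_h, params_w):
    """Mix features along the height axis, then along the width axis.

    x: feature map of shape (H, W, C). Each axis is mixed independently,
    giving every position a global receptive field over its row and column.
    """
    H, W, C = x.shape
    # Height mixing: each (column, channel) pair is a vector of length H.
    xh = x.transpose(1, 2, 0).reshape(W * C, H)        # (W*C, H)
    xh = mlp(xh, *params_h).reshape(W, C, H).transpose(2, 0, 1)
    # Width mixing: same idea along W.
    xw = xh.transpose(0, 2, 1).reshape(H * C, W)       # (H*C, W)
    xw = mlp(xw, *params_w).reshape(H, C, W).transpose(0, 2, 1)
    return x + xw                                      # residual connection

rng = np.random.default_rng(0)
H, W, C, D = 8, 8, 4, 16                               # D: hidden width (assumed)
params_h = (rng.standard_normal((H, D)) * 0.1, np.zeros(D),
            rng.standard_normal((D, H)) * 0.1, np.zeros(H))
params_w = (rng.standard_normal((W, D)) * 0.1, np.zeros(D),
            rng.standard_normal((D, W)) * 0.1, np.zeros(W))
x = rng.standard_normal((H, W, C))
y = axial_mlp(x, params_h, params_w)
print(y.shape)  # (8, 8, 4)
```

Because the two mixings are sequential, information propagates first within columns and then within rows, so the output at each position depends on the full feature map while the parameter count grows only linearly in H and W.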