Jiang Jinyun, Sun Zitong, Zhang Qile, Lan Kun, Jiang Xiaoliang, Wu Jun
College of Mechanical Engineering, Quzhou University, Quzhou, China.
Department of Rehabilitation, The Quzhou Affiliated Hospital of Wenzhou Medical University, Quzhou People's Hospital, Quzhou, China.
Front Physiol. 2023 Jun 20;14:1173108. doi: 10.3389/fphys.2023.1173108. eCollection 2023.
Accurate segmentation of skin lesions in dermoscopic images plays an important role in improving patient survival rates. However, because of the blurred boundaries of pigmented regions, the diversity of lesion features, and the mutation and metastasis of diseased cells, the effectiveness and robustness of skin image segmentation algorithms remain a challenging subject. For this reason, we propose a bi-directional feedback dense connection network framework (called BiDFDC-Net) that can segment skin lesions accurately. Firstly, under the U-Net framework, we integrate edge modules into each layer of the encoder, which alleviates the gradient vanishing and information loss caused by network deepening. Then, each layer of our model takes input from the previous layer and passes its feature map to the densely connected subsequent layers, achieving information interaction and enhancing feature propagation and reuse. Finally, in the decoder stage, a two-branch module feeds a dense feedback branch and an ordinary feedback branch back to the corresponding encoder layer, realizing the fusion of multi-scale features and multi-level context information. Tested on the ISIC-2018 and PH2 datasets, the model achieved accuracies of 93.51% and 94.58%, respectively.
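The dense-connection behaviour described in the abstract (each layer consuming the feature maps of earlier layers and forwarding its own to all subsequent layers) can be illustrated with a minimal sketch. Note this is not the authors' BiDFDC-Net code: the use of PyTorch, the channel counts, the growth rate, and the layer count are illustrative assumptions only.

# Minimal sketch (assumed PyTorch implementation, not the authors' code) of one
# densely connected encoder stage: layer i receives the concatenation of the stage
# input and the outputs of layers 0..i-1, which is what enables feature reuse.
import torch
import torch.nn as nn


class DenseStage(nn.Module):
    def __init__(self, in_channels: int, growth: int = 32, num_layers: int = 3):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(
                nn.Sequential(
                    nn.Conv2d(channels, growth, kernel_size=3, padding=1),
                    nn.BatchNorm2d(growth),
                    nn.ReLU(inplace=True),
                )
            )
            channels += growth  # each new layer adds `growth` channels to the pool

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=1))  # reuse all earlier feature maps
            features.append(out)
        return torch.cat(features, dim=1)


if __name__ == "__main__":
    stage = DenseStage(in_channels=64)
    dummy = torch.randn(1, 64, 128, 128)  # stand-in for a dermoscopic feature map
    print(stage(dummy).shape)  # torch.Size([1, 160, 128, 128]) with these defaults

In an encoder built from such stages, the concatenated output would be downsampled and passed to the next stage, while the decoder's two feedback branches (dense and ordinary) would route decoded features back to the matching encoder level as the abstract describes.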