Jia Qiran, Shu Hai
Department of Biostatistics, School of Global Public Health, New York University, New York, NY 10003, USA.
Brainlesion. 2021 Sep;2021:3-14. doi: 10.1007/978-3-031-09002-8_1. Epub 2022 Jul 15.
Convolutional neural networks (CNNs) have achieved remarkable success in automatically segmenting organs or lesions on 3D medical images. Recently, vision transformer networks have exhibited exceptional performance in 2D image classification tasks. Compared with CNNs, transformer networks have the appealing advantage of extracting long-range features through their self-attention mechanism. Therefore, we propose a CNN-Transformer combined model, called BiTr-Unet, with specific modifications for brain tumor segmentation on multi-modal MRI scans. Our BiTr-Unet achieves good performance on the BraTS2021 validation dataset, with median Dice scores of 0.9335, 0.9304, and 0.8899, and median Hausdorff distances of 2.8284, 2.2361, and 1.4142 for the whole tumor, tumor core, and enhancing tumor, respectively. On the BraTS2021 testing dataset, the corresponding results are 0.9257, 0.9350, and 0.8874 for Dice score, and 3, 2.2361, and 1.4142 for Hausdorff distance. The code is publicly available at https://github.com/JustaTinyDot/BiTr-Unet.
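To make the CNN-Transformer idea described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch, not the authors' BiTr-Unet (their implementation is at the GitHub link above): a small 3D CNN encoder extracts local features, a transformer encoder applies self-attention over the flattened voxel tokens to capture long-range dependencies, and a CNN decoder upsamples to per-voxel class logits. The class name, layer sizes, and depths are illustrative assumptions, and positional embeddings are omitted for brevity.

```python
# Hypothetical sketch of a 3D CNN encoder + transformer bottleneck + CNN decoder.
# Not the authors' BiTr-Unet; all sizes are illustrative assumptions.
import torch
import torch.nn as nn

class TinyCNNTransformerSeg3D(nn.Module):
    def __init__(self, in_channels=4, num_classes=4, embed_dim=64, num_heads=4, depth=2):
        super().__init__()
        # CNN encoder: two strided 3D conv blocks (local features, downsampling by 4)
        self.encoder = nn.Sequential(
            nn.Conv3d(in_channels, 32, kernel_size=3, stride=2, padding=1),
            nn.InstanceNorm3d(32), nn.ReLU(inplace=True),
            nn.Conv3d(32, embed_dim, kernel_size=3, stride=2, padding=1),
            nn.InstanceNorm3d(embed_dim), nn.ReLU(inplace=True),
        )
        # Transformer bottleneck: self-attention over flattened voxel tokens
        # models long-range dependencies that local convolutions alone miss.
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=num_heads,
                                           dim_feedforward=4 * embed_dim,
                                           batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=depth)
        # CNN decoder: upsample back to the input resolution and predict class logits
        self.decoder = nn.Sequential(
            nn.ConvTranspose3d(embed_dim, 32, kernel_size=2, stride=2),
            nn.InstanceNorm3d(32), nn.ReLU(inplace=True),
            nn.ConvTranspose3d(32, num_classes, kernel_size=2, stride=2),
        )

    def forward(self, x):
        feats = self.encoder(x)                      # (B, C, D/4, H/4, W/4)
        b, c, d, h, w = feats.shape
        tokens = feats.flatten(2).transpose(1, 2)    # (B, N_voxels, C) token sequence
        tokens = self.transformer(tokens)            # global self-attention
        feats = tokens.transpose(1, 2).reshape(b, c, d, h, w)
        return self.decoder(feats)                   # (B, num_classes, D, H, W)

if __name__ == "__main__":
    # Four MRI modalities (e.g., T1, T1ce, T2, FLAIR) stacked as input channels.
    x = torch.randn(1, 4, 32, 32, 32)
    print(TinyCNNTransformerSeg3D()(x).shape)        # torch.Size([1, 4, 32, 32, 32])
```

The sketch only illustrates the general division of labor (convolutions for local detail, self-attention for global context); the actual BiTr-Unet architecture, skip connections, and training details are described in the paper and repository.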