Shanmugam Ashok, Radhabai Prianka Ramachandran, KVN Kavitha, Imoize Agbotiname Lucky
Department of Electronics and Communication Engineering, Vel Tech Multi Tech Dr. Rangarajan Dr. Sakunthala Engineering College, Chennai, Tamil Nadu, India.
Department of CSE, Manipal Institute of Technology, Bangalore, Karnataka, India.
BMC Med Imaging. 2025 Aug 4;25(1):315. doi: 10.1186/s12880-025-01852-5.
Accurately segmenting the pancreas from abdominal computed tomography (CT) images is crucial for detecting and managing pancreatic diseases such as diabetes and tumors. Type 2 diabetes and metabolic syndrome are associated with pancreatic fat accumulation, and calculating the fat fraction aids in the investigation of β-cell dysfunction and insulin resistance. The most widely used pancreas segmentation techniques are U-shaped networks based on deep convolutional neural networks (DCNNs); because they rely on local receptive fields, they struggle to capture long-range dependencies in an image. To address this problem, this research proposes a novel Dual Self-attentive Transformer Unet (DSTUnet) model for accurate pancreatic segmentation. The model incorporates dual self-attention Swin Transformer blocks on both the encoder and decoder sides to extract global context and refine candidate regions. After the pancreas is segmented with the DSTUnet, a histogram analysis is used to estimate the fat fraction. The proposed method demonstrated excellent performance on the standard dataset, achieving a Dice similarity coefficient (DSC) of 93.7% and a Hausdorff distance (HD) of 2.7 mm. The average volume of the pancreas was 92.42, and its fat volume fraction (FVF) was 13.37%.
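The fat-fraction step lends itself to a short sketch. The block below is a minimal illustration, not the authors' implementation: it assumes the common CT convention that fat voxels fall roughly in the -190 to -30 Hounsfield unit (HU) band (the abstract does not give the paper's histogram thresholds), and the hypothetical helpers fat_volume_fraction, pancreas_volume_ml, and dice_coefficient show how an FVF, a pancreas volume, and the reported DSC metric are conventionally computed from a binary segmentation mask.

```python
import numpy as np

# Assumption: fat voxels in CT are often taken to lie in roughly the
# -190..-30 HU band; these thresholds are illustrative, not the paper's.
FAT_HU_MIN, FAT_HU_MAX = -190.0, -30.0

def fat_volume_fraction(ct_hu: np.ndarray, pancreas_mask: np.ndarray) -> float:
    """Percentage of segmented pancreas voxels whose HU falls in the fat band."""
    hu = ct_hu[pancreas_mask.astype(bool)]          # HU values inside the mask
    if hu.size == 0:
        raise ValueError("empty pancreas mask")
    is_fat = (hu >= FAT_HU_MIN) & (hu <= FAT_HU_MAX)
    return float(is_fat.mean() * 100.0)

def pancreas_volume_ml(pancreas_mask: np.ndarray, voxel_spacing_mm) -> float:
    """Pancreas volume in millilitres from voxel count and voxel spacing (mm)."""
    voxel_mm3 = float(np.prod(voxel_spacing_mm))    # mm^3 per voxel
    return pancreas_mask.astype(bool).sum() * voxel_mm3 / 1000.0

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient (DSC) between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * inter / denom if denom else 1.0

if __name__ == "__main__":
    # Synthetic example with 3 mm isotropic voxels (fake HU values).
    rng = np.random.default_rng(0)
    ct = rng.normal(40.0, 60.0, size=(64, 64, 64))
    mask = np.zeros_like(ct, dtype=bool)
    mask[20:40, 20:40, 20:40] = True
    print(f"FVF: {fat_volume_fraction(ct, mask):.2f}%")
    print(f"Volume: {pancreas_volume_ml(mask, (3.0, 3.0, 3.0)):.1f} mL")
    print(f"DSC vs. itself: {dice_coefficient(mask, mask):.2f}")
```

In this sketch the FVF is simply the proportion of masked voxels within the assumed fat HU band; a full histogram analysis could instead fit the HU distribution before thresholding, but the abstract does not specify the authors' procedure.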