Xi Yue, Li Xiaoxia, Wang Zhikang, Shi Chuanji, Qin Xiaoru, Jiang Qifeng, Yang Guoli
Stomatology Hospital, School of Stomatology, Zhejiang University School of Medicine, Zhejiang Provincial Clinical Research Center of Oral Diseases, Key Laboratory of Oral Biomedical Research of Zhejiang Province, Cancer Center of Zhejiang University, Hangzhou, China.
Ant Group, Hangzhou, China.
Clin Implant Dent Relat Res. 2025 Feb;27(1):e13426. doi: 10.1111/cid.13426. Epub 2024 Dec 16.
Accurate assessment of changes in bone graft material after 1-stage sinus lift surgery is crucial for evaluating long-term implant survival. However, traditional manual labeling and segmentation of cone-beam computed tomography (CBCT) images are often inaccurate and inefficient. This study aims to use artificial intelligence for automated segmentation of graft material in 1-stage sinus lift procedures to improve accuracy and efficiency.
Swin-UPerNet and mainstream medical segmentation models (FCN, U-Net, DeepLabV3, SegFormer, and UPerNet) were trained on a dataset of 120 CBCT scans. The models were tested on 30 CBCT scans, and performance was evaluated with the 95% Hausdorff distance, Intersection over Union (IoU), and Dice similarity coefficient. Processing times of automated and manual segmentation were also compared.
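As an illustration of the reported evaluation metrics, the sketch below computes the Dice similarity coefficient, IoU, and a 95% Hausdorff distance for a pair of 3D binary masks. It is not the authors' code; the NumPy/SciPy implementation, the voxel-based approximation of the Hausdorff distance, and the function names are assumptions.

```python
# Illustrative metric computation for 3D binary segmentation masks (assumed code).
import numpy as np
from scipy.ndimage import distance_transform_edt

def dice_and_iou(pred: np.ndarray, gt: np.ndarray) -> tuple[float, float]:
    """Dice similarity coefficient and Intersection over Union for boolean masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    dice = 2.0 * inter / (pred.sum() + gt.sum())
    iou = inter / union
    return float(dice), float(iou)

def hd95(pred: np.ndarray, gt: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> float:
    """95th-percentile symmetric Hausdorff distance via Euclidean distance transforms.

    Uses all foreground voxels rather than extracted surfaces, which is a common
    simplification of the surface-based definition.
    """
    pred, gt = pred.astype(bool), gt.astype(bool)
    # Distance from every voxel to the nearest foreground voxel of the other mask,
    # scaled by the voxel spacing in millimeters.
    dt_gt = distance_transform_edt(~gt, sampling=spacing)
    dt_pred = distance_transform_edt(~pred, sampling=spacing)
    d_pred_to_gt = dt_gt[pred]
    d_gt_to_pred = dt_pred[gt]
    return float(np.percentile(np.hstack([d_pred_to_gt, d_gt_to_pred]), 95))
```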
Swin-UPerNet outperformed the other models in accuracy, achieving an accuracy of 0.84 and mean precision and IoU values of 0.8574 and 0.7373, respectively (p < 0.05). The time required to upload and visualize segmentation results with Swin-UPerNet was 19.28 s, a significant reduction from the average manual segmentation time of 1390 s (p < 0.001).
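The abstract reports p-values for these comparisons but does not name the statistical test. Purely as a hedged sketch, a paired nonparametric comparison of per-scan IoU scores over the 30 test CBCT scans could look as follows; the Wilcoxon signed-rank test and the placeholder score arrays are assumptions, not the authors' analysis.

```python
# Hedged sketch of a per-scan metric comparison; the test choice and data are assumed.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
iou_swin_upernet = rng.uniform(0.68, 0.80, size=30)  # placeholder per-scan IoU, 30 test CBCTs
iou_baseline = rng.uniform(0.60, 0.74, size=30)      # placeholder per-scan IoU for a baseline model

stat, p_value = wilcoxon(iou_swin_upernet, iou_baseline)
print(f"Wilcoxon signed-rank: statistic={stat:.3f}, p={p_value:.4f}")
```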
Swin-UPerNet exhibited high accuracy and efficiency in identifying and segmenting the three-dimensional volume of bone graft material, indicating significant potential for evaluating the stability of bone graft material.