IEEE Trans Biomed Eng. 2023 Mar;70(3):970-979. doi: 10.1109/TBME.2022.3206596. Epub 2023 Feb 17.
Transrectal ultrasound is commonly used for guiding prostate cancer biopsy, where 3D ultrasound volume reconstruction is often desired. Current methods for 3D reconstruction from freehand ultrasound scans require external tracking devices to provide spatial information of an ultrasound transducer. This paper presents a novel deep learning approach for sensorless ultrasound volume reconstruction, which efficiently exploits content correspondence between ultrasound frames to reconstruct 3D volumes without external tracking. The underlying deep learning model, the deep contextual-contrastive network (DC-Net), utilizes self-attention to focus on speckle-rich areas to estimate spatial movement and then minimizes a margin ranking loss for contrastive feature learning. A case-wise correlation loss over the entire input video helps further smooth the estimated trajectory. We train and validate DC-Net on two independent datasets, one containing 619 transrectal scans and the other 100 transperineal scans. Our proposed approach attained superior performance compared with other methods, with a drift rate of 9.64% and a prostate Dice of 0.89. The promising results demonstrate the capability of deep neural networks for universal ultrasound volume reconstruction from freehand 2D ultrasound scans without tracking information.
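The abstract names two training objectives, a margin ranking loss for contrastive feature learning and a case-wise correlation loss over the whole input video, without giving their formulas. The sketch below is a minimal, hedged PyTorch illustration of how such losses are commonly written; the function names, tensor shapes, and the use of cosine similarity and Pearson correlation are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def margin_ranking_contrastive_loss(anchor, positive, negative, margin=1.0):
    """Rank anchor-positive feature similarity above anchor-negative
    similarity by at least `margin` (a common contrastive formulation;
    assumed here, not taken from the paper)."""
    sim_pos = F.cosine_similarity(anchor, positive, dim=-1)
    sim_neg = F.cosine_similarity(anchor, negative, dim=-1)
    target = torch.ones_like(sim_pos)  # sim_pos should exceed sim_neg
    return F.margin_ranking_loss(sim_pos, sim_neg, target, margin=margin)

def case_correlation_loss(pred_params, gt_params, eps=1e-8):
    """One plausible 'case-wise correlation loss': 1 minus the Pearson
    correlation between predicted and ground-truth motion parameters,
    computed over all frames of a single scan (shape: [frames, dofs])."""
    pred = pred_params - pred_params.mean(dim=0, keepdim=True)
    gt = gt_params - gt_params.mean(dim=0, keepdim=True)
    corr = (pred * gt).sum(dim=0) / (pred.norm(dim=0) * gt.norm(dim=0) + eps)
    return (1.0 - corr).mean()

# Illustrative usage with random tensors standing in for network outputs.
feat_a, feat_p, feat_n = (torch.randn(8, 128) for _ in range(3))
pred, gt = torch.randn(200, 6), torch.randn(200, 6)
total = margin_ranking_contrastive_loss(feat_a, feat_p, feat_n) + case_correlation_loss(pred, gt)
```

The correlation term operates on the full per-scan trajectory rather than individual frame pairs, which is what lets it smooth the estimated motion globally, consistent with the abstract's description of a loss applied "over the entire input video."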