College of Computer and Information, Hohai University, Nanjing 210098, China.
Comput Methods Programs Biomed. 2024 Apr;247:108109. doi: 10.1016/j.cmpb.2024.108109. Epub 2024 Mar 8.
Automatic needle tip detection is important in real-time ultrasound (US) images used to guide interventional needle puncture procedures in clinical settings. However, tip detection in US images is challenging because of spatial indiscernibility: severe background interference combined with the tip's small size, low grayscale contrast, and indistinctive appearance patterns.
To achieve precise tip detection in US images despite spatial indiscernibility, a novel multi-keyframe motion-aware framework called TipDet is proposed. It identifies tips from their short-term spatial-temporal pattern and long-term motion pattern. In TipDet, first, an adaptive keyframe model (AKM) decides whether a frame is informative enough to serve as a keyframe for long-term motion pattern learning. Second, candidate tips are detected by a two-stream backbone (TSB) from their short-term spatial-temporal pattern. Third, to identify the true tip among the candidates, a novel method for learning the tips' long-term motion pattern is proposed, based on the proposed optical-flow-aware multi-head cross-attention (OFA-MHCA).
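The keyframe selection step can be illustrated with a minimal sketch. Note that the AKM in TipDet is a learned model; the frame-difference heuristic, the threshold value, and the function names below are illustrative assumptions only, not the paper's method.

```python
# Sketch: keep a frame as a keyframe only when it differs enough from the
# previously kept keyframe, so that keyframes carry useful motion information.
# ASSUMPTION: TipDet's AKM is learned; this hand-crafted heuristic is not it.

def mean_abs_diff(frame_a, frame_b):
    """Mean absolute pixel difference between two equally sized grayscale frames
    (each frame is a list of rows of intensity values)."""
    n = len(frame_a) * len(frame_a[0])
    return sum(abs(a - b)
               for row_a, row_b in zip(frame_a, frame_b)
               for a, b in zip(row_a, row_b)) / n

def select_keyframes(frames, threshold=10.0):
    """Return the indices of frames kept as keyframes.

    The first frame is always kept; each later frame is kept only if its
    mean absolute difference from the last kept keyframe reaches `threshold`
    (an illustrative, hand-picked value).
    """
    if not frames:
        return []
    keyframes = [0]
    for i in range(1, len(frames)):
        if mean_abs_diff(frames[i], frames[keyframes[-1]]) >= threshold:
            keyframes.append(i)
    return keyframes
```

For example, a sequence whose second frame is identical to the first but whose third frame shows clear motion keeps only frames 0 and 2, skipping the uninformative duplicate.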
On a clinical human puncture dataset of 4195 B-mode images, the experimental results show that the proposed TipDet achieves precise tip detection despite the spatial indiscernibility problem, reaching 78.7% AP, an 8.9% improvement over the base detector, at approximately 20 FPS. Moreover, a tip localization error of 1.3 ± 0.6% is achieved, outperforming the existing method.
The proposed TipDet can facilitate a wider and easier application of US-guided interventional procedures by providing robust and precise needle tip localization. The codes and data are available at https://github.com/ResonWang/TipDet.