Economopoulos T L, Asvestas P A, Matsopoulos G K
School of Electrical and Computer Engineering, National Technical University of Athens, 9, Iroon Polytechniou str, Zografos, 15780, Athens, Greece.
J Digit Imaging. 2010 Aug;23(4):399-421. doi: 10.1007/s10278-009-9190-z. Epub 2009 Mar 3.
The accurate estimation of point correspondences is often required in a wide variety of medical image-processing applications. Numerous point correspondence methods have been proposed in this field, each exhibiting its own characteristics, strengths, and weaknesses. This paper presents a comprehensive comparison of four automatic methods for allocating corresponding points, namely the template-matching technique, the iterative closest points approach, the correspondence by sensitivity to movement scheme, and the self-organizing maps algorithm. Initially, the four correspondence methods are described, focusing on their distinct characteristics and their parameter selection for common comparisons. The performance of the four methods is then qualitatively and quantitatively compared over a total of 132 two-dimensional image pairs divided into eight sets. The sets comprise pairs of images obtained using controlled geometry protocols (affine and sinusoidal transforms) and pairs of images subject to unknown transformations. The four methods are statistically evaluated pairwise on all image pairs and individually in terms of specific figures of merit based on the correspondence accuracy as well as the registration accuracy. After assessing these evaluation criteria for each method, it was deduced that the self-organizing maps approach outperformed the other three methods in most cases.
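Of the four methods compared, template matching is the most direct to illustrate: a template patch is slid over the target image and the location maximizing a similarity measure is taken as the corresponding point. The sketch below uses normalized cross-correlation as that measure; it is a generic illustration of the technique, not the paper's specific implementation, and the function name and brute-force search are assumptions for clarity.

```python
import numpy as np

def ncc_match(image, template):
    """Slide `template` over `image` and return the top-left corner
    (row, col) of the patch with the highest normalized
    cross-correlation, along with the NCC score in [-1, 1].
    Brute-force search, for illustration only."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:          # flat patch: NCC undefined, skip
                continue
            score = (p * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```

In practice, libraries such as OpenCV provide optimized versions of this search (e.g. FFT-based correlation), and the correspondence methods compared in the paper differ mainly in how such local matches are constrained or refined globally.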