Hong Danfeng, Su Jian, Hong Qinggen, Pan Zhenkuan, Wang Guodong
College of Information Engineering, Qingdao University, Qingdao, China.
School of Communications and Information Engineering, University of Electronic Science and Technology of China, Chengdu, China.
PLoS One. 2014 Jul 3;9(7):e101866. doi: 10.1371/journal.pone.0101866. eCollection 2014.
When palmprints are captured with non-contact devices, defocus inevitably introduces image blur, which degrades the recognition performance of the system. To solve this problem, we propose a stable-feature extraction method based on the Vese-Osher (VO) decomposition model to recognize blurred palmprints effectively. A Gaussian defocus degradation model is first established to simulate image blur. Theoretical analysis of this model shows that stable features exist in the image across different degrees of blurring. A VO decomposition model is then used to separate the blurred palmprint image into structure and texture layers. The structure layer is stable under different degrees of blurring (a theoretical conclusion that is further verified by experiment). Next, an algorithm based on the weighted robustness histogram of oriented gradients (WRHOG) is designed to extract stable features from the structure layer of the blurred palmprint image. Finally, a normalized correlation coefficient is introduced to measure the similarity between palmprint features. We designed and performed a series of experiments to demonstrate the benefits of the proposed method. The experimental results confirm the theoretical conclusion that the structure layer is stable across different blurring scales, and show that WRHOG is a robust way to distinguish blurred palmprints. On two palmprint databases (PolyU and Blurred-PolyU), the recognition results of the proposed method are stable and superior to those of previous high-performance methods (the equal error rate is only 0.132%). In addition, the authentication time is less than 1.3 s, which is fast enough to meet real-time demands. The proposed method is therefore a feasible way of implementing blurred palmprint recognition.
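Two of the pipeline's steps lend themselves to a compact illustration: the Gaussian defocus degradation model (simulating blur by convolving the image with a Gaussian point-spread function) and the normalized correlation coefficient used for matching. The sketch below is a minimal NumPy reconstruction of those two ideas only; it is not the paper's implementation, and the kernel size and sigma values are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(size=9, sigma=2.0):
    # 1-D Gaussian kernel; the 2-D blur is applied separably (rows, then columns).
    ax = np.arange(size) - size // 2
    k = np.exp(-(ax ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def defocus_blur(image, size=9, sigma=2.0):
    # Gaussian defocus degradation model: convolve the image with a Gaussian
    # point-spread function. A larger sigma simulates a stronger defocus.
    k = gaussian_kernel(size, sigma)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    return blurred

def normalized_correlation(f1, f2):
    # Normalized correlation coefficient between two feature vectors;
    # values near 1 indicate a likely match.
    f1 = f1 - f1.mean()
    f2 = f2 - f2.mean()
    return float(np.dot(f1, f2) / (np.linalg.norm(f1) * np.linalg.norm(f2) + 1e-12))
```

Applying `defocus_blur` with increasing sigma to the same palmprint image produces the family of progressively blurred inputs against which feature stability can be checked; features extracted from two blur levels of the same palm should yield a normalized correlation close to 1.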