Fan Yu, Li Jinxi, Song Shaoying, Zhang Haiguo, Wang Sijia, Zhai Guangtao
School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai, 200240 People's Republic of China.
State Key Laboratory of Genetic Engineering, Human Phenome Institute, Zhangjiang Fudan International Innovation Center, Fudan University, Shanghai, 201203 People's Republic of China.
Phenomics. 2022 Jun 22;2(4):219-229. doi: 10.1007/s43657-022-00063-0. eCollection 2022 Aug.
Palmprints have long been of practical and cultural interest. Palmprint principal lines, also called primary palmar lines, are among the most dominant palmprint features and do not change over the lifespan. Existing methods use filters and edge-detection operators to extract the principal lines from the palm region of interest (ROI), but they cannot distinguish principal lines from fine wrinkles. This paper proposes a novel deep-learning architecture that extracts palmprint principal lines while greatly reducing the influence of fine wrinkles, and further classifies palmprint phenotypes from 2D palmprint images. The architecture comprises three modules: an ROI extraction module (REM) using a pre-trained hand keypoint localization model, a principal line extraction module (PLEM) using a deep edge-detection model, and a phenotype classifier (PC) based on the ResNet34 network. Compared with the current ROI extraction method, our extraction is competitive, with a success rate of 95.2%. For principal line extraction, the similarity score between our extracted lines and ground-truth palmprint lines reaches 0.813. The proposed architecture achieves a phenotype classification accuracy of 95.7% on our self-built palmprint dataset CAS_Palm.
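The abstract names the three modules but gives no implementation details; the following is a minimal Python sketch of how such a REM -> PLEM -> PC pipeline could be wired together, not the paper's method. The stand-ins are assumptions: MediaPipe Hands substitutes for the pre-trained hand keypoint model in REM, the deep edge-detection model in PLEM is left as a hypothetical interface (with a classical edge map as a crude fallback), and torchvision's ResNet34 serves as the phenotype classifier. Keypoint choices, crop geometry, and class counts are illustrative.

import cv2
import mediapipe as mp
import numpy as np
import torch
import torch.nn as nn
from torchvision.models import resnet34


def extract_roi(image_bgr, size=224):
    """REM sketch: locate hand keypoints and crop a square palm ROI.

    Uses the wrist (landmark 0) and the index/little-finger MCP joints
    (landmarks 5, 17) to anchor the palm centre; these keypoints and the
    crop geometry are illustrative assumptions, not the paper's.
    """
    h, w = image_bgr.shape[:2]
    with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        result = hands.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None  # keypoint localization failed; count as an ROI miss
    lm = result.multi_hand_landmarks[0].landmark
    pts = np.array([[lm[i].x * w, lm[i].y * h] for i in (0, 5, 17)])
    cx, cy = pts.mean(axis=0)
    half = int(np.linalg.norm(pts[1] - pts[2]))  # crop scale from palm width
    x0, y0 = max(int(cx - half), 0), max(int(cy - half), 0)
    roi = image_bgr[y0:y0 + 2 * half, x0:x0 + 2 * half]
    return cv2.resize(roi, (size, size))


def extract_principal_lines(roi, edge_model=None):
    """PLEM sketch: a pre-trained deep edge-detection network would run
    here; as a fallback this uses a classical edge map, which (as the
    abstract notes of such operators) also picks up fine wrinkles."""
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    if edge_model is not None:
        x = torch.from_numpy(gray).float()[None, None] / 255.0
        return edge_model(x)  # hypothetical interface for the edge network
    return cv2.Canny(gray, 50, 150)  # placeholder, not wrinkle-suppressing


def build_phenotype_classifier(num_classes):
    """PC sketch: ResNet34 backbone with the final layer resized to the
    number of palmprint phenotype classes (a value the abstract does not
    specify)."""
    model = resnet34(weights=None)  # or ImageNet weights before fine-tuning
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

In use, an input palm image would pass through extract_roi, then extract_principal_lines, and the resulting line map (or the ROI plus line map) would be fed to the classifier; how the paper combines these inputs for the PC is not stated in the abstract.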