Liang Bin, Yang Na, He Guosheng, Huang Peng, Yang Yong
Department of Radiation Oncology, National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China.
South Building #2 Division, The 3rd Medical Center of the People's Liberation Army General Hospital, Beijing, China.
J Med Internet Res. 2020 Apr 29;22(4):e17234. doi: 10.2196/17234.
Cancer has become the second leading cause of death globally. Most cancers arise from genetic mutations, and the metabolic disturbances these mutations cause can manifest as changes in facial appearance.
In this study, we aimed to identify the facial features of patients with cancer using the deep learning technique.
Images of faces of patients with cancer were collected to build the cancer face image data set. A face image data set of people without cancer was built by randomly selecting images from the publicly available MegaAge data set, matched to the sex and age distribution of the cancer face image data set. Each face image was preprocessed to obtain an upright, centered face chip, after which the background was filtered out to exclude the effects of irrelevant factors. A residual neural network was constructed to classify cancer and noncancer cases. Transfer learning, minibatches, few training epochs, L2 regularization, and random dropout were used as training strategies to prevent overfitting. Moreover, guided gradient-weighted class activation mapping (guided Grad-CAM) was used to reveal the relevant features.
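The class activation mapping step can be illustrated with a minimal numpy sketch of Grad-CAM's core weighting rule; the guided-backpropagation component and the actual trained network are omitted, and the feature maps and gradients below are synthetic stand-ins rather than outputs of the study's model:

```python
import numpy as np

def grad_cam(feature_maps: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Grad-CAM heat map from one convolutional layer.

    feature_maps: (K, H, W) activations A^k of the chosen layer.
    gradients:    (K, H, W) gradients of the class score w.r.t. A^k.
    """
    # alpha_k: global-average-pool the gradients over the spatial axes
    alpha = gradients.mean(axis=(1, 2))                              # shape (K,)
    # channel-weighted sum of feature maps, then ReLU keeps positive evidence
    cam = np.maximum((alpha[:, None, None] * feature_maps).sum(axis=0), 0.0)
    # normalize to [0, 1] for visualization (guard against an all-zero map)
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 7, 7))   # synthetic layer activations
dA = rng.standard_normal((8, 7, 7))  # synthetic class-score gradients
heat = grad_cam(A, dA)
print(heat.shape)  # (7, 7)
```

The heat map is then upsampled to the input resolution and overlaid on the face chip, which is how region-level attributions such as "facial skin" are read off.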
A total of 8124 face images of patients with cancer (men: n=3851, 47.4%; women: n=4273, 52.6%) were collected from January 2018 to January 2019. The ages of the patients ranged from 1 year to 70 years (median age 52 years). The average faces of both male and female patients with cancer displayed more obvious facial adiposity than the average faces of people without cancer, which was supported by a landmark comparison. Training was terminated after 5 epochs. On the test data set, the area under the receiver operating characteristic curve was 0.94, and the accuracy was 0.82. The main relevant feature of cancer cases was facial skin, whereas the relevant features of noncancer cases were extracted from the complementary face region.
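The reported area under the receiver operating characteristic curve has a direct probabilistic reading: it is the probability that a randomly chosen cancer face receives a higher classifier score than a randomly chosen noncancer face. A self-contained sketch of that rank-based (Mann-Whitney) computation follows; the scores are invented for illustration and are not the study's outputs:

```python
def auc_from_scores(pos_scores, neg_scores):
    """AUC as the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs ranked correctly; ties count as 0.5."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# hypothetical classifier scores for cancer (positive) and noncancer faces
pos = [0.9, 0.8, 0.4]
neg = [0.7, 0.3, 0.2]
print(auc_from_scores(pos, neg))  # 8 of 9 pairs ordered correctly -> 0.888...
```

An AUC of 0.94 therefore means that in 94% of such cross-class pairs the model ranks the cancer face above the noncancer face, independent of any single decision threshold.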
In this study, we built a face data set of patients with cancer and constructed a deep learning model to classify the faces of people with and those without cancer. We found that facial skin and adiposity were closely related to the presence of cancer.