

Use of Deep Learning to Develop and Analyze Computational Hematoxylin and Eosin Staining of Prostate Core Biopsy Images for Tumor Diagnosis.

Author Affiliations

Program in Media Arts and Sciences, Massachusetts Institute of Technology, Cambridge.

Harvard Medical School, Brigham and Women's Hospital, Boston, Massachusetts.

Publication Information

JAMA Netw Open. 2020 May 1;3(5):e205111. doi: 10.1001/jamanetworkopen.2020.5111.

Abstract

IMPORTANCE

Histopathological diagnosis of tumors from tissue biopsies after hematoxylin and eosin (H&E) dye staining is the criterion standard for oncological care, but H&E staining requires trained operators, dyes and reagents, and precious tissue samples that cannot be reused.

OBJECTIVES

To use deep learning algorithms to develop models that perform accurate computational H&E staining of native nonstained prostate core biopsy images and to develop methods for interpretation of H&E staining deep learning models and analysis of computationally stained images by computer vision and clinical approaches.

DESIGN, SETTING, AND PARTICIPANTS

This cross-sectional study used hundreds of thousands of native nonstained RGB (red, green, and blue channel) whole slide image (WSI) patches of prostate core tissue biopsies obtained from excess tissue material from prostate core biopsies performed in the course of routine clinical care between January 7, 2014, and January 7, 2017, at Brigham and Women's Hospital, Boston, Massachusetts. Biopsies were registered with their H&E-stained versions. Conditional generative adversarial neural networks (cGANs) that automate conversion of native nonstained RGB WSI to computational H&E-stained images were then trained. Deidentified whole slide images of prostate core biopsy and medical record data were transferred to Massachusetts Institute of Technology, Cambridge, for computational research. Results were shared with physicians for clinical evaluations. Data were analyzed from July 2018 to February 2019.
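Converting a nonstained RGB WSI patch into a computationally H&E-stained patch is a paired image-to-image translation task. The abstract does not specify the exact training objective; a common pix2pix-style conditional GAN formulation, which this description is consistent with, pairs an adversarial loss with an L1 reconstruction term. With nonstained input x, dye-stained target y, noise z, generator G, and discriminator D:

```latex
\mathcal{L}_{\mathrm{cGAN}}(G, D) = \mathbb{E}_{x,y}\bigl[\log D(x, y)\bigr]
  + \mathbb{E}_{x,z}\bigl[\log\bigl(1 - D(x, G(x, z))\bigr)\bigr]

\mathcal{L}_{L1}(G) = \mathbb{E}_{x,y,z}\bigl[\lVert y - G(x, z) \rVert_{1}\bigr]

G^{*} = \arg\min_{G} \max_{D} \; \mathcal{L}_{\mathrm{cGAN}}(G, D) + \lambda \, \mathcal{L}_{L1}(G)
```

The L1 term keeps the generated stain spatially faithful to the registered dye-stained target, while the adversarial term encourages realistic stain texture; the weighting λ is a tunable hyperparameter.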

MAIN OUTCOMES AND MEASURES

Methods for detailed computer vision image analytics, visualization of trained cGAN model outputs, and clinical evaluation of virtually stained images were developed. The main outcome was interpretable deep learning models and computational H&E-stained images that achieved high performance in these metrics.

RESULTS

Among 38 patients who provided samples, single core biopsy images were extracted from each whole slide, resulting in 102 individual nonstained and H&E dye-stained image pairs that were compared with matched computationally stained and unstained images. Calculations showed high similarities between computationally and H&E dye-stained images, with a mean (SD) structural similarity index (SSIM) of 0.902 (0.026), Pearson correlation coefficient (PCC) of 0.962 (0.096), and peak signal-to-noise ratio (PSNR) of 22.821 (1.232) dB. A second cGAN performed accurate computational destaining of H&E-stained images back to their native nonstained form, with a mean (SD) SSIM of 0.900 (0.030), PCC of 0.963 (0.011), and PSNR of 25.646 (1.943) dB compared with native nonstained images. A single-blind prospective study computed approximately 95% pixel-by-pixel overlap among prostate tumor annotations provided by 5 board-certified pathologists on computationally stained images, compared with those on H&E dye-stained images. This study also presents the first visualization and explanation of neural network kernel activation maps during H&E staining and destaining of RGB images by cGANs. High similarities between kernel activation maps of computationally and H&E-stained images (mean-squared errors <0.0005) provide additional mathematical and mechanistic validation of the staining system.
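The metrics reported above (SSIM, PCC, PSNR) are standard full-reference image-similarity measures. As a minimal sketch of how such numbers can be computed (not the authors' actual pipeline; the toy arrays, noise level, and function names are illustrative assumptions), PSNR and PCC need only NumPy, while SSIM is usually taken from a library such as scikit-image (`skimage.metrics.structural_similarity`):

```python
import numpy as np

def psnr(reference, test, data_range=255.0):
    """Peak signal-to-noise ratio in dB between two same-shaped images."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(data_range ** 2 / mse)

def pcc(reference, test):
    """Pearson correlation coefficient over flattened pixel values."""
    return np.corrcoef(reference.ravel(), test.ravel())[0, 1]

# Toy stand-ins: a "dye-stained" RGB patch and a slightly noisy
# "computationally stained" counterpart (assumed, for illustration only).
rng = np.random.default_rng(0)
stained = rng.integers(0, 256, size=(64, 64, 3)).astype(np.float64)
virtual = np.clip(stained + rng.normal(0.0, 5.0, size=stained.shape), 0, 255)

print(f"PSNR: {psnr(stained, virtual):.2f} dB")
print(f"PCC:  {pcc(stained, virtual):.4f}")
# For SSIM, one would typically call:
#   skimage.metrics.structural_similarity(stained, virtual,
#                                         data_range=255, channel_axis=-1)
```

In the study these metrics are averaged over the 102 registered image pairs, comparing each computationally stained image against its H&E dye-stained counterpart (and destained output against the native nonstained original).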

CONCLUSIONS AND RELEVANCE

These findings suggest that computational H&E staining of native unlabeled RGB images of prostate core biopsy could reproduce Gleason grade tumor signatures that were easily assessed and validated by clinicians. Methods for benchmarking, visualization, and clinical validation of deep learning models and virtually H&E-stained images communicated in this study have wide applications in clinical informatics and oncology research. Clinical researchers may use these systems for early indications of possible abnormalities in native nonstained tissue biopsies prior to histopathological workflows.


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/39fa/7240356/088b37949094/jamanetwopen-3-e205111-g001.jpg
