Department of Biomedical Data Science, Dartmouth College, Hanover, New Hampshire.
Department of Computer Science, Dartmouth College, Hanover, New Hampshire.
JAMA Netw Open. 2020 Apr 1;3(4):e203398. doi: 10.1001/jamanetworkopen.2020.3398.
IMPORTANCE: Histologic classification of colorectal polyps plays a critical role in screening for colorectal cancer and care of affected patients. An accurate and automated algorithm for the classification of colorectal polyps on digitized histopathologic slides could benefit practitioners and patients.
OBJECTIVE: To evaluate the performance and generalizability of a deep neural network for colorectal polyp classification on histopathologic slide images using a multi-institutional data set.
DESIGN, SETTING, AND PARTICIPANTS: This prognostic study used histopathologic slides collected from January 1, 2016, to June 30, 2016, at Dartmouth-Hitchcock Medical Center, Lebanon, New Hampshire, with 326 slides used for training, 157 slides for an internal evaluation set, and 25 for a validation set. For the external data set, 238 slides from 179 distinct patients were obtained from 24 institutions across 13 US states. Data analysis was performed from April 9 to November 23, 2019.
MAIN OUTCOMES AND MEASURES: Accuracy, sensitivity, and specificity of the model in classifying 4 major colorectal polyp types: tubular adenoma, tubulovillous or villous adenoma, hyperplastic polyp, and sessile serrated adenoma. Performance was compared with that of local pathologists at the point of care, identified from the corresponding pathology laboratories.
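For a 4-class problem like this, sensitivity and specificity are typically computed per class in a one-vs-rest fashion from the confusion matrix, alongside overall accuracy. The sketch below illustrates that computation; the class counts are invented for illustration (they merely sum to 157, the internal set size) and are not the study's data.

```python
# Illustrative one-vs-rest sensitivity/specificity and overall accuracy
# for a 4-class polyp classifier. Counts are hypothetical, not study data.

CLASSES = ["tubular", "tubulovillous/villous", "hyperplastic", "sessile serrated"]

# conf[i][j] = number of slides with true class i predicted as class j
conf = [
    [50, 2, 1, 0],
    [3, 30, 0, 1],
    [1, 0, 40, 2],
    [0, 1, 3, 23],
]

def metrics(conf):
    n = sum(sum(row) for row in conf)
    # overall accuracy: diagonal (correct predictions) over total slides
    acc = sum(conf[i][i] for i in range(len(conf))) / n
    per_class = {}
    for i, name in enumerate(CLASSES):
        tp = conf[i][i]                                   # true positives
        fn = sum(conf[i]) - tp                            # missed cases of class i
        fp = sum(conf[j][i] for j in range(len(conf))) - tp  # other classes called i
        tn = n - tp - fn - fp                             # everything else
        per_class[name] = {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
        }
    return acc, per_class

acc, per_class = metrics(conf)
```

One-vs-rest is the conventional way to extend sensitivity/specificity to multiclass classification: each class is treated in turn as "positive" and the remaining three pooled as "negative".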
RESULTS: For the internal evaluation on the 157 slides with ground truth labels from 5 pathologists, the deep neural network had a mean accuracy of 93.5% (95% CI, 89.6%-97.4%) compared with local pathologists' accuracy of 91.4% (95% CI, 87.0%-95.8%). On the external test set of 238 slides with ground truth labels from 5 pathologists, the deep neural network achieved an accuracy of 87.0% (95% CI, 82.7%-91.3%), which was comparable with local pathologists' accuracy of 86.6% (95% CI, 82.3%-90.9%).
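Intervals of this shape can be produced by the standard normal-approximation (Wald) interval for a proportion, p ± 1.96·√(p(1−p)/n); whether the authors used this or a different method (e.g., bootstrapping across pathologists) is not stated in the abstract, so the sketch below is illustrative only.

```python
# Hedged sketch: 95% CI for an observed accuracy via the Wald
# (normal-approximation) interval. The paper's CIs may have been
# computed differently; this only shows the standard textbook formula.
import math

def wald_ci(p, n, z=1.96):
    """Return (lower, upper) 95% CI for a proportion p observed over n trials."""
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# Applying it to the external-set accuracy (87.0% over 238 slides)
lo, hi = wald_ci(0.87, 238)
```

For these particular numbers the Wald interval happens to round to 82.7%-91.3%, matching the reported external-set CI, though that alone does not confirm the authors' method.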
CONCLUSIONS AND RELEVANCE: The findings suggest that this model may assist pathologists by improving the diagnostic efficiency, reproducibility, and accuracy of colorectal cancer screenings.