Rasheed Haroon Adam, Davis Tyler, Morales Esteban, Fei Zhe, Grassi Lourdes, De Gainza Agustina, Nouri-Mahdavi Kouros, Caprioli Joseph
David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California.
Department of Computer Science, University of California Los Angeles, Los Angeles, California.
Ophthalmol Sci. 2022 Nov 3;3(1):100244. doi: 10.1016/j.xops.2022.100244. eCollection 2023 Mar.
Accurate neural rim measurement based on optic disc imaging is important for glaucoma severity grading and is often performed by trained glaucoma specialists. We aimed to improve upon existing automated tools by building a fully automated system (RimNet) for direct rim identification in glaucomatous eyes and measurement of the minimum rim-to-disc ratio (mRDR) in intact rims, the angle of absent rim width (ARW) in incomplete rims, and the rim-to-disc-area ratio (RDAR), with the goal of optic disc damage grading.
Retrospective cross-sectional study.
One thousand twenty-eight optic disc photographs with evidence of glaucomatous optic nerve damage from 1021 eyes of 903 patients with any form of primary glaucoma were included. The mean age was 63.7 (± 14.9) years. The mean deviation of the visual fields averaged -8.03 (± 8.59) dB.
The images were required to be of adequate quality, to show signs of glaucomatous damage, and to be free of significant concurrent pathology, as independently determined by glaucoma specialists. Rim and optic cup masks for each image were manually delineated by glaucoma specialists. The database was randomly split 80/10/10 into training, validation, and testing sets, respectively. RimNet consists of a deep learning rim and cup segmentation model, a computer vision mRDR measurement tool for intact rims, and an ARW measurement tool for incomplete rims. The mRDR is calculated at the thinnest rim section, while the ARW is calculated in regions of total rim loss. The RDAR was also calculated. Evaluation on the Drishti-GS dataset (Sivaswamy 2015) provided external validation.
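To make the three measurements concrete, the sketch below computes them from binary rim and disc masks by sampling radially from the disc centroid: the mRDR is the smallest per-angle rim-width-to-disc-width ratio where rim is present, the ARW is the total angle over which the rim is entirely absent, and the RDAR is a simple area ratio. The function name, the radial sampling scheme, and all parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rim_metrics(rim_mask, disc_mask, n_angles=360):
    """Illustrative sketch (not the authors' code) of the three rim
    measurements described in the abstract, computed radially from the
    disc centroid. rim_mask and disc_mask are 2-D boolean arrays, with
    the rim contained inside the disc."""
    ys, xs = np.nonzero(disc_mask)
    cy, cx = ys.mean(), xs.mean()                # disc centroid
    max_r = int(np.hypot(*disc_mask.shape))      # longest possible ray
    rdr = []                                     # per-angle rim/disc width ratio
    for theta in np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False):
        dy, dx = np.sin(theta), np.cos(theta)
        r = np.arange(max_r)
        py = np.clip((cy + r * dy).astype(int), 0, rim_mask.shape[0] - 1)
        px = np.clip((cx + r * dx).astype(int), 0, rim_mask.shape[1] - 1)
        disc_r = r[disc_mask[py, px]]            # radii inside the disc
        rim_r = r[rim_mask[py, px]]              # radii inside the rim
        if disc_r.size == 0:
            rdr.append(0.0)
            continue
        disc_width = disc_r.max() + 1            # radial extent of the disc
        rdr.append(rim_r.size / disc_width)      # rim width relative to disc
    rdr = np.asarray(rdr)
    absent = rdr == 0                            # angles with total rim loss
    arw = absent.sum() * 360.0 / n_angles        # angle of absent rim, degrees
    mrdr = rdr[~absent].min() if (~absent).any() else 0.0
    rdar = rim_mask.sum() / disc_mask.sum()      # rim-to-disc-area ratio
    return mrdr, arw, rdar

# Synthetic demo: a disc of radius 40 with a rim annulus (radii 25-40)
# from which a 90-degree wedge has been removed to mimic total rim loss.
yy, xx = np.mgrid[:101, :101]
dist = np.hypot(yy - 50, xx - 50)
ang = np.degrees(np.arctan2(yy - 50, xx - 50)) % 360
disc = dist <= 40
rim = (dist >= 25) & disc & ~((ang >= 0) & (ang < 90))
mrdr, arw, rdar = rim_metrics(rim, disc)
```

On this synthetic eye, the intact rim is about 15 of the disc's 40 radial pixels wide (mRDR near 0.38), the absent wedge spans roughly 90°, and the rim covers just under half the disc area.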
Median Absolute Error (MAE) between glaucoma specialists and RimNet for mRDR and ARW.
On the test set, RimNet achieved an mRDR MAE of 0.03 (0.05), an ARW MAE of 31 (89)°, and an RDAR MAE of 0.09 (0.10). On the Drishti-GS dataset, an mRDR MAE of 0.03 (0.04) and an RDAR MAE of 0.09 (0.10) were observed.
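Note that MAE here denotes the *median* absolute error between the specialists' and RimNet's measurements, not the more common mean absolute error. A minimal sketch, with made-up mRDR values for five hypothetical images (the helper name and data are assumptions, not the authors' code):

```python
import numpy as np

def median_absolute_error(reference, predicted):
    """Median of |reference - predicted| across images, matching the
    abstract's use of MAE for median absolute error."""
    reference = np.asarray(reference, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.median(np.abs(reference - predicted)))

# Toy mRDR measurements for five images (entirely fabricated for illustration).
specialist = [0.10, 0.15, 0.22, 0.05, 0.30]
rimnet = [0.12, 0.14, 0.20, 0.09, 0.31]
err = median_absolute_error(specialist, rimnet)  # median of per-image errors
```

The median is robust to the occasional badly segmented image, which would dominate a mean-based error.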
RimNet demonstrated acceptably accurate rim segmentation and mRDR and ARW measurements. The fully automated algorithm presented here would be a valuable component in an automated mRDR-based glaucoma grading system. Further improvements could be made by improving identification and segmentation performance on incomplete rims and expanding the number and variety of glaucomatous training images.