Machireddy Archana, Thibault Guillaume, Loftis Kevin G, Stoltz Kevin, Bueno Cecilia E, Smith Hannah R, Riesterer Jessica L, Gray Joe W, Song Xubo
Program of Computer Science and Electrical Engineering, Oregon Health and Science University, Portland, OR, United States.
Department of Biomedical Engineering, Oregon Health and Science University, Portland, OR, United States.
Front Bioinform. 2023 Dec 15;3:1308708. doi: 10.3389/fbinf.2023.1308708. eCollection 2023.
Focused ion beam-scanning electron microscopy (FIB-SEM) images can provide a detailed view of the cellular ultrastructure of tumor cells. A deeper understanding of their organization and interactions can shed light on cancer mechanisms and progression. However, the bottleneck in the analysis is the delineation of the cellular structures to enable quantitative measurements and analysis. We mitigated this limitation using deep learning to segment cells and subcellular ultrastructure in 3D FIB-SEM images of tumor biopsies obtained from patients with metastatic breast and pancreatic cancers. Ultrastructures such as nuclei, nucleoli, mitochondria, endosomes, and lysosomes are better defined than their surroundings and can be segmented with high accuracy using a neural network trained with sparse manual labels. Cell segmentation, on the other hand, is much more challenging due to the lack of clear boundaries separating cells in the tissue. We adopted a multi-pronged approach combining detection, boundary propagation, and tracking for cell segmentation. Specifically, a neural network was employed to detect the intracellular space; optical flow was used to propagate cell boundaries across the z-stack from the nearest ground-truth image in order to facilitate the separation of individual cells; finally, filopodium-like protrusions were tracked back to their main cells by calculating the intersection-over-union measure for all regions detected in consecutive images along the z-stack and connecting regions with maximum overlap. The proposed cell segmentation methodology resulted in an average Dice score of 0.93. For nuclei, nucleoli, and mitochondria, the segmentation achieved Dice scores of 0.99, 0.98, and 0.86, respectively. The segmentation of FIB-SEM images will enable interpretative rendering and provide quantitative image features to be associated with relevant clinical variables.
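The final linking step described in the abstract — connecting each region in one z-slice to the maximally overlapping region in the previous slice via intersection over union — can be sketched as below. This is an illustrative reconstruction, not the authors' code; the function names and the toy label arrays are assumptions.

```python
import numpy as np

def iou(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Intersection-over-union of two boolean masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return float(inter) / float(union) if union else 0.0

def link_max_overlap(prev_labels: np.ndarray, curr_labels: np.ndarray) -> dict:
    """Map each region id in the current slice to the previous-slice
    region with the highest IoU (0 means no overlapping region)."""
    links = {}
    prev_ids = [p for p in np.unique(prev_labels) if p != 0]
    for c in np.unique(curr_labels):
        if c == 0:                       # 0 = background
            continue
        curr_mask = curr_labels == c
        best_id, best_iou = 0, 0.0
        for p in prev_ids:
            score = iou(prev_labels == p, curr_mask)
            if score > best_iou:
                best_id, best_iou = p, score
        links[int(c)] = int(best_id)
    return links

# Toy example (hypothetical labels): the region labeled 2 in slice z+1
# overlaps the cell labeled 1 in slice z, so it is linked back to cell 1.
prev_z = np.zeros((6, 6), dtype=int)
prev_z[0:4, 0:4] = 1                     # cell body in slice z
curr_z = np.zeros((6, 6), dtype=int)
curr_z[1:4, 1:4] = 2                     # same cell, new label in slice z+1
print(link_max_overlap(prev_z, curr_z))  # {2: 1}
```

Chaining these per-slice links across the whole z-stack propagates a consistent cell identity, which is how detached filopodium-like protrusions can be reattached to their main cells.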