Qamar Saqib, Malyshev Dmitry, Öberg Rasmus, Nilsson Daniel P G, Andersson Magnus
Department of Physics, Umeå University, 90187, Umeå, Sweden.
Integrated Science Lab, Umeå University, 90187, Umeå, Sweden.
Sci Rep. 2025 Jun 20;15(1):20177. doi: 10.1038/s41598-025-05900-6.
Analyzing microscopy images of large growing cell samples using traditional methods is a complex and time-consuming process. In this work, we developed an attention-driven, UNet-based deep learning model to efficiently quantify the position, area, and circularity of bacterial spores and vegetative cells from images containing more than 10,000 bacterial cells. Our attention-driven UNet algorithm achieves an accuracy of 96%, precision of 82%, sensitivity of 81%, and specificity of 98%, and can therefore segment cells at a level comparable to manual annotation. We demonstrate the efficacy of the model by applying it to a live-dead decontamination assay. The model is provided in three formats: Python code, a Binder notebook that runs in a web browser without installation, and a Flask web application for local use.
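The sketch below is not the authors' released code; it is a minimal illustration, assuming NumPy and scikit-image, of how the quantities named in the abstract can be derived from a binary segmentation mask: per-cell position, area, and circularity (4*pi*A/P^2) via connected-component analysis, and pixel-wise accuracy, precision, sensitivity, and specificity against a manually annotated reference mask. Function names and the pixel-wise evaluation protocol are illustrative assumptions.

    # Illustrative sketch (not the published implementation).
    import numpy as np
    from skimage import measure

    def per_cell_shape_features(pred_mask):
        """Return (centroid, area, circularity) for each segmented cell."""
        labeled = measure.label(pred_mask.astype(bool))
        features = []
        for region in measure.regionprops(labeled):
            perimeter = region.perimeter
            # Circularity = 4*pi*A / P^2; equals 1.0 for a perfect circle.
            circularity = 4.0 * np.pi * region.area / perimeter**2 if perimeter > 0 else 0.0
            features.append((region.centroid, region.area, circularity))
        return features

    def pixelwise_metrics(pred_mask, true_mask):
        """Accuracy, precision, sensitivity (recall), specificity of a binary mask."""
        pred = pred_mask.astype(bool)
        true = true_mask.astype(bool)
        tp = np.sum(pred & true)
        tn = np.sum(~pred & ~true)
        fp = np.sum(pred & ~true)
        fn = np.sum(~pred & true)
        return {
            "accuracy": (tp + tn) / (tp + tn + fp + fn),
            "precision": tp / (tp + fp) if (tp + fp) else 0.0,
            "sensitivity": tp / (tp + fn) if (tp + fn) else 0.0,
            "specificity": tn / (tn + fp) if (tn + fp) else 0.0,
        }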