Electrical and Computer Engineering Department, University of California, Los Angeles, Los Angeles, CA, USA.
Bioengineering Department, University of California, Los Angeles, Los Angeles, CA, USA.
Nat Methods. 2019 Dec;16(12):1323-1331. doi: 10.1038/s41592-019-0622-5. Epub 2019 Nov 4.
We demonstrate that a deep neural network can be trained to virtually refocus a two-dimensional fluorescence image onto user-defined three-dimensional (3D) surfaces within the sample. Using this method, termed Deep-Z, we imaged the neuronal activity of a Caenorhabditis elegans worm in 3D using a time sequence of fluorescence images acquired at a single focal plane, digitally increasing the depth-of-field by 20-fold without any axial scanning, additional hardware, or a trade-off in imaging resolution or speed. Furthermore, we demonstrate that this approach can correct for sample drift, tilt and other aberrations, all digitally performed after the acquisition of a single fluorescence image. This framework also cross-connects different imaging modalities to each other, enabling 3D refocusing of a single wide-field fluorescence image to match confocal microscopy images acquired at different sample planes. Deep-Z has the potential to improve volumetric imaging speed while reducing challenges relating to sample drift, aberration and defocusing that are associated with standard 3D fluorescence microscopy.
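The abstract describes feeding the network a single focal-plane image together with a user-defined target surface (a per-pixel axial refocusing distance). A minimal sketch of how that input could be assembled is below; the function name `build_deepz_input` and the two-channel layout are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def build_deepz_input(image, target_defocus_um):
    """Stack a 2D fluorescence image with a map of the user-defined
    target surface (hypothetical input layout for a Deep-Z-style net).

    image: (H, W) float array, a single focal-plane acquisition.
    target_defocus_um: scalar or (H, W) array giving the axial
        refocusing distance (in um) per pixel; a non-uniform map
        corresponds to refocusing onto a tilted or curved 3D surface.
    """
    dpm = np.broadcast_to(
        np.asarray(target_defocus_um, dtype=float), image.shape
    )
    # Two-channel input: [image, defocus map]; a trained generator
    # would map this to the virtually refocused image.
    return np.stack([image, dpm], axis=0)

# Example: request refocusing onto a plane tilted along x,
# spanning -2 um to +2 um across the field of view.
h, w = 64, 64
img = np.random.rand(h, w)
tilt = np.linspace(-2.0, 2.0, w)[None, :] * np.ones((h, 1))
net_input = build_deepz_input(img, tilt)
print(net_input.shape)  # (2, 64, 64)
```

A spatially varying defocus map is what lets a single forward pass correct sample tilt or curvature, as the abstract notes, rather than refocusing to one flat plane at a time.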