

nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation.

Affiliations

Division of Medical Image Computing, German Cancer Research Center, Heidelberg, Germany.

Faculty of Biosciences, University of Heidelberg, Heidelberg, Germany.

Publication information

Nat Methods. 2021 Feb;18(2):203-211. doi: 10.1038/s41592-020-01008-z. Epub 2020 Dec 7.

Abstract

Biomedical imaging is a driver of scientific discovery and a core component of medical care and is being stimulated by the field of deep learning. While semantic segmentation algorithms enable image analysis and quantification in many applications, the design of respective specialized solutions is non-trivial and highly dependent on dataset properties and hardware conditions. We developed nnU-Net, a deep learning-based segmentation method that automatically configures itself, including preprocessing, network architecture, training and post-processing for any new task. The key design choices in this process are modeled as a set of fixed parameters, interdependent rules and empirical decisions. Without manual intervention, nnU-Net surpasses most existing approaches, including highly specialized solutions on 23 public datasets used in international biomedical segmentation competitions. We make nnU-Net publicly available as an out-of-the-box tool, rendering state-of-the-art segmentation accessible to a broad audience by requiring neither expert knowledge nor computing resources beyond standard network training.
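To make the abstract's central idea concrete, below is a minimal, hypothetical Python sketch of a "self-configuration" step: a dataset fingerprint is summarized, and pipeline settings are derived from it through fixed parameters and interdependent rules. This is not nnU-Net's actual implementation; all names (DatasetFingerprint, configure_pipeline, the specific thresholds and rules) are illustrative assumptions.

# Illustrative sketch only: NOT nnU-Net's real code. It mimics the paper's idea of
# deriving pipeline settings from dataset properties via fixed parameters and
# interdependent rules. All names and thresholds here are hypothetical.
from dataclasses import dataclass
from statistics import median

@dataclass
class DatasetFingerprint:
    spacings: list   # per-case voxel spacings, e.g. [(3.0, 0.8, 0.8), ...]
    shapes: list     # per-case image shapes after cropping
    modality: str    # e.g. "CT" or "MRI"

def configure_pipeline(fp: DatasetFingerprint) -> dict:
    # Fixed parameters: identical for every task (loss, optimizer, schedule length).
    config = {"loss": "dice+ce", "optimizer": "sgd", "epochs": 1000}

    # Rule: target spacing is derived from the median spacing across the dataset.
    config["target_spacing"] = tuple(
        median(s[i] for s in fp.spacings) for i in range(3)
    )

    # Rule: the normalization scheme depends on the imaging modality.
    config["normalization"] = (
        "global_zscore_clip" if fp.modality == "CT" else "per_image_zscore"
    )

    # Interdependent rule (simplified): patch size follows the median image shape,
    # capped so that patch and batch size fit a fixed GPU memory budget.
    median_shape = tuple(int(median(s[i] for s in fp.shapes)) for i in range(3))
    config["patch_size"] = tuple(min(d, 128) for d in median_shape)
    config["batch_size"] = 2

    return config

if __name__ == "__main__":
    fp = DatasetFingerprint(
        spacings=[(3.0, 0.8, 0.8), (2.5, 0.7, 0.7), (3.0, 0.75, 0.75)],
        shapes=[(80, 400, 400), (96, 512, 512), (90, 448, 448)],
        modality="CT",
    )
    print(configure_pipeline(fp))

In the paper, the analogous decisions cover preprocessing, network architecture, training, and post-processing, with a small number of empirical decisions made on the training data; the sketch above only mirrors the structure of that decision process, not its actual rules.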

