
State-of-the-Art CNN Optimizer for Brain Tumor Segmentation in Magnetic Resonance Images.

Author Information

Yaqub Muhammad, Jinchao Feng, Zia M Sultan, Arshid Kaleem, Jia Kebin, Rehman Zaka Ur, Mehmood Atif

Affiliations

Faculty of Information Technology, Beijing University of Technology, Beijing 100000, China.

Department of Computer Science and IT, The University of Lahore, Gujrat Campus, Main GT Road, Adjacent Chenab Bridge, Gujrat, Gujranwala, Punjab 52250, Pakistan.

Publication Information

Brain Sci. 2020 Jul 3;10(7):427. doi: 10.3390/brainsci10070427.

Abstract

Brain tumors have become a leading cause of death around the globe. The main reason for this epidemic is the difficulty of conducting a timely diagnosis of the tumor. Fortunately, magnetic resonance images (MRI) are utilized to diagnose tumors in most cases. The performance of a Convolutional Neural Network (CNN) depends on many factors (i.e., weight initialization, optimization, batches and epochs, learning rate, activation function, loss function, and network topology), on data quality, and on specific combinations of these model attributes. When dealing with a segmentation or classification problem, relying on a single optimizer amounts to weak validation unless the choice of that optimizer is backed by a strong argument. Therefore, an optimizer selection process is important to justify the use of a single optimizer for such decision problems. In this paper, we provide a comprehensive comparative analysis of popular CNN optimizers to benchmark segmentation performance. In detail, we perform a comparative analysis of 10 different state-of-the-art gradient descent-based optimizers, namely Adaptive Gradient (Adagrad), Adaptive Delta (AdaDelta), Stochastic Gradient Descent (SGD), Adaptive Momentum (Adam), Cyclic Learning Rate (CLR), Adaptive Max Pooling (Adamax), Root Mean Square Propagation (RMS Prop), Nesterov Adaptive Momentum (Nadam), and Nesterov accelerated gradient (NAG), for CNN. The experiments were performed on the BraTS2015 data set. The Adam optimizer achieved the best accuracy, 99.2%, in enhancing the CNN's classification and segmentation ability.
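The comparison described above amounts to training an identical CNN under each candidate optimizer and recording the resulting accuracy. The following is a minimal sketch of such a benchmarking loop, assuming a Keras/TensorFlow setup; the small placeholder network, the learning rates, and the `train_ds`/`val_ds` datasets are illustrative assumptions, not the architecture or BraTS2015 pipeline used in the paper, and CLR is omitted because it is usually realized as a learning-rate schedule rather than a standalone optimizer.

```python
# Sketch only (not the authors' code): benchmark several gradient-descent
# optimizers by training the same CNN with each one and comparing accuracy.
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers

def build_cnn(input_shape=(240, 240, 1), num_classes=2):
    """A small placeholder CNN; the paper's architecture is not reproduced here."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])

# Optimizers named in the abstract that have direct Keras counterparts.
# NAG is expressed as SGD with Nesterov momentum; learning rates are defaults.
candidate_optimizers = {
    "SGD": lambda: optimizers.SGD(learning_rate=0.01),
    "NAG": lambda: optimizers.SGD(learning_rate=0.01, momentum=0.9, nesterov=True),
    "Adagrad": lambda: optimizers.Adagrad(learning_rate=0.01),
    "AdaDelta": lambda: optimizers.Adadelta(learning_rate=1.0),
    "RMSprop": lambda: optimizers.RMSprop(learning_rate=0.001),
    "Adam": lambda: optimizers.Adam(learning_rate=0.001),
    "Adamax": lambda: optimizers.Adamax(learning_rate=0.002),
    "Nadam": lambda: optimizers.Nadam(learning_rate=0.002),
}

def benchmark(train_ds, val_ds, epochs=10):
    """Train an identical CNN with each optimizer; return best validation accuracy."""
    results = {}
    for name, make_opt in candidate_optimizers.items():
        model = build_cnn()
        model.compile(optimizer=make_opt(),
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        history = model.fit(train_ds, validation_data=val_ds,
                            epochs=epochs, verbose=0)
        results[name] = max(history.history["val_accuracy"])
    return results
```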


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/33cf/7407771/ead5987802d5/brainsci-10-00427-g001.jpg
