Department of Computer Science and Software Engineering, College of Information Technology, UAEU, Al Ain 15551, United Arab Emirates.
Department of Civil Engineering, College of Engineering, UAEU, Al Ain 15551, United Arab Emirates.
Sensors (Basel). 2021 Mar 1;21(5):1688. doi: 10.3390/s21051688.
This paper proposes a customized convolutional neural network (CNN) for crack detection in concrete structures. The proposed method is compared with four existing deep learning methods with respect to training data size, data heterogeneity, network complexity, and the number of epochs. The performance of the proposed CNN model is evaluated against pretrained networks, i.e., the VGG-16, VGG-19, ResNet-50, and Inception V3 models, on eight datasets of different sizes created from two public datasets. For each model, the evaluation considered computational time, crack localization results, and classification measures, e.g., accuracy, precision, recall, and F1-score. Experimental results demonstrate that training data size and heterogeneity among data samples significantly affect model performance. All models performed well when trained on a small but diverse set of samples; however, increasing the training data size while reducing its diversity degraded generalization performance and led to overfitting. The proposed customized CNN and VGG-16 models outperformed the other methods in classification, localization, and computational time on a small amount of data, indicating that these two models are the most suitable of those tested for crack detection and localization in concrete structures.
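As a concrete illustration of the kind of comparison described above, the sketch below shows how a small custom CNN and a VGG-16 transfer-learning baseline could be set up in Keras for binary crack / no-crack patch classification, with precision and recall tracked during training (F1-score can be derived from these). The layer widths, input resolution (224×224), optimizer, and frozen-base strategy are illustrative assumptions, not the authors' published configuration.

```python
# Illustrative sketch only: the abstract does not specify the exact architecture
# or hyperparameters; all layer sizes, the 224x224 input resolution, and the
# optimizer choice below are assumptions for demonstration purposes.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_custom_cnn(input_shape=(224, 224, 3)):
    """A small custom CNN for binary crack / no-crack classification."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # crack vs. no crack
    ])

def build_vgg16_transfer(input_shape=(224, 224, 3)):
    """VGG-16 pretrained on ImageNet with a new binary classification head."""
    base = tf.keras.applications.VGG16(
        weights="imagenet", include_top=False, input_shape=input_shape)
    base.trainable = False  # freeze the convolutional base; train only the head
    return models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(128, activation="relu"),
        layers.Dense(1, activation="sigmoid"),
    ])

model = build_custom_cnn()
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(name="precision"),
                       tf.keras.metrics.Recall(name="recall")])
# model.fit(train_ds, validation_data=val_ds, epochs=...)  # training datasets not specified here
```

The same compile-and-fit pattern would apply to the VGG-16 head (and, analogously, to VGG-19, ResNet-50, and Inception V3 via their counterparts in tf.keras.applications), which is what makes the size/heterogeneity comparison reported in the paper straightforward to reproduce in principle.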