Eisenstat Joshua, Wagner Matthias W, Vidarsson Logi, Ertl-Wagner Birgit, Sussman Dafna
Department of Electrical, Computer and Biomedical Engineering, Faculty of Engineering and Architectural Sciences, Toronto Metropolitan University, Toronto, ON M5G 1X8, Canada.
Division of Neuroradiology, The Hospital for Sick Children, Toronto, ON M5G 1X8, Canada.
Bioengineering (Basel). 2023 Jan 20;10(2):140. doi: 10.3390/bioengineering10020140.
Identifying fetal orientation is essential for determining the mode of delivery and for sequence planning in fetal magnetic resonance imaging (MRI). This manuscript describes a deep learning algorithm named Fet-Net, composed of convolutional neural networks (CNNs), which allows for the automatic detection of fetal orientation from a two-dimensional (2D) MRI slice. The architecture consists of four convolutional layers, which feed into a simple artificial neural network. Compared with eleven other prominent CNNs (different versions of ResNet, VGG, Xception, and Inception), Fet-Net has fewer architectural layers and parameters. From 144 3D MRI datasets representing vertex, breech, oblique, and transverse fetal orientations, 6120 2D MRI slices were extracted to train, validate, and test Fet-Net. Despite its simpler architecture, Fet-Net achieved an average accuracy and F1 score of 97.68% and a loss of 0.06828 on the 6120 2D MRI slices in a 5-fold cross-validation experiment, outperforming all eleven prominent architectures (p < 0.05). An ablation study demonstrated the statistical significance of each component's contribution to Fet-Net's performance. Fet-Net also remained robust in classification accuracy when noise was introduced to the images, outperforming eight of the eleven prominent architectures. Fet-Net's ability to automatically detect fetal orientation can profoundly decrease the time required for fetal MRI acquisition.
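To make the described architecture concrete, the sketch below shows a minimal Fet-Net-style classifier: four convolutional layers feeding a simple fully connected head that predicts one of the four fetal orientations. The filter counts, kernel sizes, pooling, dropout, input size, and training configuration are illustrative assumptions only and are not taken from the paper.

```python
# Minimal sketch of a Fet-Net-style classifier (assumptions: 256x256 single-channel
# input slices, illustrative filter counts and head size; not the published design).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4  # vertex, breech, oblique, transverse


def build_fetnet_like(input_shape=(256, 256, 1)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        # Four convolutional layers, as described in the abstract
        # (filter counts and kernel sizes are assumptions)
        layers.Conv2D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        # Simple artificial neural network (fully connected) head
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    model = build_fetnet_like()
    model.summary()
```

Such a model could be evaluated with 5-fold cross-validation over the extracted 2D slices (e.g., with scikit-learn's KFold splitting the slice indices), matching the evaluation protocol reported in the abstract.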