The Epilepsy Center, Department of Neurosurgery, The First Affiliated Hospital of USTC, Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei 230001, China.
School of Information Science and Technology, University of Science and Technology of China, Hefei 230027, China.
J Healthc Eng. 2022 Jan 25;2022:1573076. doi: 10.1155/2022/1573076. eCollection 2022.
Early prediction of epileptic seizures can warn patients to take precautions and significantly improve their quality of life. In recent years, deep learning has become increasingly predominant in seizure prediction. However, existing deep learning-based approaches in this field require a great deal of labeled data to guarantee performance, while labeling EEG signals requires the expertise of an experienced pathologist and is extremely time-consuming. To address this issue, we propose a novel Consistency-based Semisupervised Seizure Prediction Model (CSSPM), in which only a fraction of the training data is labeled. Our method is based on the principle of consistency regularization, which holds that a robust model should produce consistent results for the same input under additional perturbations. Specifically, by using stochastic augmentation and dropout, we treat the entire neural network as a stochastic model and apply a consistency constraint that penalizes the difference between the current prediction and previous predictions. In this way, unlabeled data can be fully utilized to refine the decision boundary and enhance prediction performance. Compared with existing studies that require all training data to be labeled, the proposed method needs only a small portion of the data to be labeled while still achieving satisfactory results. Our method provides a promising solution to alleviate the labeling cost in real-world applications.
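To make the training scheme described above concrete, the following is a minimal PyTorch-style sketch of consistency regularization with stochastic perturbations and a running average of past predictions (a temporal-ensembling-style setup). The toy MLP, Gaussian-noise "augmentation", loss weighting, and variable names are illustrative assumptions and do not reproduce the authors' EEG pipeline or the exact CSSPM architecture.

```python
# Sketch of consistency-regularized semi-supervised training (assumptions:
# toy MLP, Gaussian-noise augmentation, fixed consistency weight).
import torch
import torch.nn as nn
import torch.nn.functional as F

class StochasticClassifier(nn.Module):
    """Toy classifier with dropout, so repeated forward passes are stochastic."""
    def __init__(self, in_dim=128, n_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.net(x)

def augment(x, noise_std=0.1):
    # Stand-in for stochastic augmentation of EEG segments (assumption).
    return x + noise_std * torch.randn_like(x)

def train_step(model, opt, x_all, y_all, labeled_mask,
               ema_preds, ema_decay=0.6, cons_weight=1.0):
    """One training step over labeled + unlabeled samples.

    ema_preds holds the running average of past softmax predictions and
    serves as the "previous predictions" the consistency loss compares against.
    """
    model.train()
    logits = model(augment(x_all))          # stochastic forward pass
    probs = F.softmax(logits, dim=1)

    # Supervised loss only on the labeled fraction of the data.
    sup_loss = F.cross_entropy(logits[labeled_mask], y_all[labeled_mask])

    # Consistency loss on every sample: penalize the gap between the
    # current prediction and the accumulated previous predictions.
    cons_loss = F.mse_loss(probs, ema_preds.detach())

    loss = sup_loss + cons_weight * cons_loss
    opt.zero_grad()
    loss.backward()
    opt.step()

    # Update the running prediction targets.
    ema_preds = ema_decay * ema_preds + (1.0 - ema_decay) * probs.detach()
    return loss.item(), ema_preds

if __name__ == "__main__":
    torch.manual_seed(0)
    N, D, C = 256, 128, 2
    x_all = torch.randn(N, D)
    y_all = torch.randint(0, C, (N,))
    labeled_mask = torch.zeros(N, dtype=torch.bool)
    labeled_mask[:32] = True                 # only a small fraction is labeled

    model = StochasticClassifier(D, C)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    ema_preds = torch.full((N, C), 1.0 / C)  # uniform initial targets

    for epoch in range(5):
        loss, ema_preds = train_step(model, opt, x_all, y_all,
                                     labeled_mask, ema_preds)
        print(f"epoch {epoch}: loss={loss:.4f}")
```

Because dropout and the noise augmentation make each forward pass stochastic, the consistency term pushes the model toward stable predictions on unlabeled samples, which is what allows the small labeled subset to be leveraged effectively.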