
Lightweight Transformer exhibits comparable performance to LLMs for Seizure Prediction: A case for light-weight models for EEG data.

Author Information

Parani Paras, Mohammad Umair, Saeed Fahad

Affiliations

Knight Foundation School of Computing and Information Sciences, Florida International University, Miami, FL 33172, USA.

Publication Information

Proc IEEE Int Conf Big Data. 2024 Dec;2024:4941-4945. doi: 10.1109/bigdata62323.2024.10825319.

Abstract

Predicting seizures ahead of time would have a significant positive clinical impact for people with epilepsy. Advances in machine learning/artificial intelligence (ML/AI) have provided us the tools needed to perform such predictive tasks. To date, advanced deep learning (DL) architectures such as the convolutional neural network (CNN) and long short-term memory (LSTM) have been used with mixed results. However, the highly connected activity exhibited by epileptic seizures necessitates the design of more complex ML techniques that can better capture the complex, interconnected neurological processes. Other challenges include the variability of EEG sensor data quality, differing epilepsy and seizure profiles, a lack of annotated datasets, and the absence of ML-ready benchmarks. In addition, successful models will need to perform inference in almost real time using limited hardware compute capacity. To address these challenges, we propose a lightweight architecture, called ESPFormer, whose novelty lies in its simple and smaller model size and the lower computational load needed to infer in real time compared to other works in the literature. To quantify the performance of this lightweight model, we compared it with a custom-designed residual neural network (ResNet), a pre-trained vision transformer (ViT) and a pre-trained large language model (LLM). We tested ESPFormer on MLSPred-Bench, the largest patient-independent seizure prediction dataset, comprising 12 benchmarks. Our results demonstrate that ESPFormer provides the best prediction accuracy on 4/12 benchmarks, with an average improvement of 2.65% compared to the LLM, 3.35% compared to the ViT and 17.65% compared to the ResNet, and comparable results on the other benchmarks. Our results indicate that lightweight transformer architectures may outperform resource-intensive LLM-based models for real-time EEG-based seizure prediction.
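The abstract's central claim is that a small self-attention model can match far larger ones on EEG windows. As a minimal NumPy sketch of the scaled dot-product self-attention at the core of any lightweight transformer: all dimensions, the random weights, and the "EEG window" interpretation below are illustrative assumptions, not the paper's actual ESPFormer configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    # x: (seq_len, d_model) -- one embedded EEG segment per row.
    q, k, v = x @ wq, x @ wk, x @ wv
    # Scaled dot-product attention lets every segment attend to
    # every other, capturing the interconnected activity the
    # abstract argues CNNs/LSTMs model less directly.
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

rng = np.random.default_rng(0)
d_model, seq_len = 16, 8          # tiny, illustrative sizes
x = rng.standard_normal((seq_len, d_model))
wq, wk, wv = (rng.standard_normal((d_model, d_model)) * 0.1
              for _ in range(3))
out = self_attention(x, wq, wk, wv)  # shape: (seq_len, d_model)
```

A single layer like this has only 3 · d_model² attention parameters (768 here), which is the kind of footprint that makes near-real-time inference on limited hardware plausible; a full model would stack a few such layers with feed-forward blocks and a classification head.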

Similar Articles

2
Utilizing Pretrained Vision Transformers and Large Language Models for Epileptic Seizure Prediction.
2025 8th Int Conf Data Sci Mach Learn Appl (2025). 2025 Feb;2025:132-137. doi: 10.1109/cdma61895.2025.00028. Epub 2025 Mar 7.

References Cited in This Article

3
An Epileptic Seizure Prediction Method Based on CBAM-3D CNN-LSTM Model.
IEEE J Transl Eng Health Med. 2023 Jun 27;11:417-423. doi: 10.1109/JTEHM.2023.3290036. eCollection 2023.
7
Efficient Epileptic Seizure Prediction Based on Deep Learning.
IEEE Trans Biomed Circuits Syst. 2019 Oct;13(5):804-813. doi: 10.1109/TBCAS.2019.2929053. Epub 2019 Jul 17.
