Lee Junyeop, Ham Insung, Kim Yongmin, Ko Hanseok
School of Electrical Engineering, Korea University, Seoul 02841, Republic of Korea.
Sensors (Basel). 2024 Dec 11;24(24):7932. doi: 10.3390/s24247932.
In this study, we propose a novel framework for time-series representation learning that integrates a learnable masking-augmentation strategy into a contrastive learning framework. Time-series data pose challenges due to their temporal dependencies and feature-extraction complexities. To address these challenges, we introduce a masking-based reconstruction approach within a contrastive learning context, aiming to enhance the model's ability to learn discriminative temporal features. Our method leverages self-supervised learning to effectively capture both global and local patterns by strategically masking segments of the time-series data and reconstructing them, which helps reveal nuanced temporal dependencies. We utilize learnable masking as a dynamic augmentation technique, enabling the model to optimize contextual relationships in the data and extract meaningful representations that are both context-aware and robust. Extensive experiments were conducted on multiple time-series datasets, including SleepEDF-78, SleepEDF-20, and UCI-HAR, achieving accuracy improvements of 2%, 2.55%, and 3.89%, respectively, over baseline methods, along with comparable performance on the Epilepsy dataset. Our results show significant performance gains compared to existing methods, highlighting the potential of our framework to advance the field of time-series analysis by improving the quality of learned representations and enhancing downstream task performance.
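To make the masking-augmentation idea concrete, the following is a minimal NumPy sketch of the general pattern the abstract describes: two masked "views" of each series are produced and pulled together by a contrastive (InfoNCE) loss. All function names are hypothetical, and random segment masking is used here as a stand-in for the paper's learnable mask, which would instead be a trained, differentiable gating function; a real implementation would also pass the views through an encoder and add the reconstruction objective.

```python
import numpy as np

rng = np.random.default_rng(0)

def segment_mask(x, mask_ratio=0.3, seg_len=8):
    """Zero out random contiguous segments of a 1-D series.

    Illustrative stand-in for a learnable mask: here segment positions
    are sampled at random rather than optimized during training.
    """
    x = x.copy()
    n_masked, target = 0, int(mask_ratio * len(x))
    while n_masked < target:
        start = rng.integers(0, len(x) - seg_len)
        x[start:start + seg_len] = 0.0
        n_masked += seg_len
    return x

def info_nce(z1, z2, tau=0.1):
    """InfoNCE contrastive loss between two batches of embeddings.

    Rows are L2-normalized; matching rows across the two batches are
    positive pairs, all other rows act as negatives.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                                   # (B, B) similarities
    logp = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))  # row-wise log-softmax
    return -np.mean(np.diag(logp))                          # -log p(positive)

# Two independently masked views of the same batch of series.
x = rng.standard_normal((4, 128))
v1 = np.stack([segment_mask(s) for s in x])
v2 = np.stack([segment_mask(s) for s in x])

# In the full framework an encoder would map each view to an embedding;
# the raw views are used directly here only to exercise the loss.
loss = info_nce(v1, v2)
print(float(loss))
```

Because the log-softmax terms are non-positive, the loss is strictly positive and shrinks as matching views become more similar than non-matching ones, which is the signal a learnable mask could be trained against.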