

DGMSCL: A dynamic graph mixed supervised contrastive learning approach for class imbalanced multivariate time series classification.

Authors

Qian Lipeng, Zuo Qiong, Li Dahu, Zhu Hong

Affiliations

School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan, 430070, Hubei, China.

Publication

Neural Netw. 2025 May;185:107131. doi: 10.1016/j.neunet.2025.107131. Epub 2025 Jan 17.

DOI: 10.1016/j.neunet.2025.107131
PMID: 39862528
Abstract

In the Imbalanced Multivariate Time Series Classification (ImMTSC) task, minority-class instances typically correspond to critical events, such as system faults in power grids or abnormal health occurrences in medical monitoring. Despite being rare and random, these events are highly significant. The dynamic spatial-temporal relationships between minority-class instances and other instances make them more prone to interference from neighboring instances during classification. Increasing the number of minority-class samples during training often results in overfitting to a single pattern of the minority class. Contrastive learning ensures that majority-class instances learn similar features in the representation space. However, it does not effectively aggregate features from neighboring minority-class instances, hindering its ability to properly represent these instances in the ImMTS dataset. Therefore, we propose a dynamic graph-based mixed supervised contrastive learning method (DGMSCL) that effectively fits minority-class features without increasing their number, while also separating them from other instances in the representation space. First, it reconstructs the input sequence into dynamic graphs and employs a hierarchical attention graph neural network (HAGNN) to generate a discriminative embedding representation between instances. Based on this, we introduce a novel mixed contrast loss, which includes weight-augmented inter-graph supervised contrast (WAIGC) and context-based minority-class-aware contrast (MCAC). It adjusts the sample weights based on their quantity and intrinsic characteristics, placing greater emphasis on minority-class loss to produce more effective gradient gains during training. Additionally, it separates minority-class instances from adjacent transitional instances in the representation space, enhancing their representational capacity. Extensive experiments across various scenarios and datasets with differing degrees of imbalance demonstrate that DGMSCL consistently outperforms existing baseline models. Specifically, DGMSCL achieves higher overall classification accuracy, as evidenced by significantly improved average F1-score, G-mean, and kappa coefficient across multiple datasets. Moreover, classification results on real-world power data show that DGMSCL generalizes well to real-world applications.
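The weight-augmented contrast idea described in the abstract can be illustrated with a minimal sketch of a class-weighted supervised contrastive loss. This is a simplified stand-in, not the paper's actual WAIGC/MCAC objectives, which operate on dynamic-graph embeddings produced by the HAGNN; the function name, the dictionary of per-class weights, and the plain-NumPy formulation are all assumptions for illustration.

```python
import numpy as np

def weighted_supcon_loss(embeddings, labels, class_weights, temperature=0.1):
    """Class-weighted supervised contrastive loss (illustrative sketch).

    Each anchor is pulled toward same-class instances and pushed away from
    the rest; per-anchor losses are scaled by a class weight so that
    minority-class anchors contribute larger gradients, mirroring the
    'greater emphasis on minority-class loss' described in the abstract.
    """
    # L2-normalize so the dot product is cosine similarity.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = (z @ z.T) / temperature

    n = len(labels)
    total = 0.0
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue  # anchors with no same-class partner are skipped
        others = [j for j in range(n) if j != i]
        # log-sum-exp over all non-anchor similarities (the denominator).
        denom = np.log(np.exp(sim[i, others]).sum())
        # mean negative log-likelihood over the anchor's positives.
        li = -np.mean([sim[i, j] - denom for j in positives])
        total += class_weights[labels[i]] * li  # minority class gets a larger weight
    return total / n
```

Upweighting the minority class simply scales its anchors' (non-negative) per-instance losses, so the minority term dominates the total as its weight grows.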


Similar Articles

1. DGMSCL: A dynamic graph mixed supervised contrastive learning approach for class imbalanced multivariate time series classification.
Neural Netw. 2025 May;185:107131. doi: 10.1016/j.neunet.2025.107131. Epub 2025 Jan 17.
2. MediDRNet: Tackling category imbalance in diabetic retinopathy classification with dual-branch learning and prototypical contrastive learning.
Comput Methods Programs Biomed. 2024 Aug;253:108230. doi: 10.1016/j.cmpb.2024.108230. Epub 2024 May 17.
3. Universal representation learning for multivariate time series using the instance-level and cluster-level supervised contrastive learning.
Data Min Knowl Discov. 2024 May;38(3):1493-1519. doi: 10.1007/s10618-024-01006-1. Epub 2024 Feb 9.
4. Contrastive learning of graphs under label noise.
Neural Netw. 2024 Apr;172:106113. doi: 10.1016/j.neunet.2024.106113. Epub 2024 Jan 6.
5. Bidirectional consistency with temporal-aware for semi-supervised time series classification.
Neural Netw. 2024 Dec;180:106709. doi: 10.1016/j.neunet.2024.106709. Epub 2024 Sep 7.
6. Local structure-aware graph contrastive representation learning.
Neural Netw. 2024 Apr;172:106083. doi: 10.1016/j.neunet.2023.12.037. Epub 2023 Dec 27.
7. A Topology-Enhanced Multi-Viewed Contrastive Approach for Molecular Graph Representation Learning and Classification.
Mol Inform. 2025 Jan;44(1):e202400252. doi: 10.1002/minf.202400252.
8. Deep semi-supervised learning via dynamic anchor graph embedding in latent space.
Neural Netw. 2022 Feb;146:350-360. doi: 10.1016/j.neunet.2021.11.026. Epub 2021 Dec 1.
9. Cost-Sensitive Weighted Contrastive Learning Based on Graph Convolutional Networks for Imbalanced Alzheimer's Disease Staging.
IEEE Trans Med Imaging. 2024 Sep;43(9):3126-3136. doi: 10.1109/TMI.2024.3389747. Epub 2024 Sep 3.
10. Adaptive self-supervised learning for sequential recommendation.
Neural Netw. 2024 Nov;179:106570. doi: 10.1016/j.neunet.2024.106570. Epub 2024 Jul 24.