
BiFormer: A Bipartite-stream Information Fusion framework for large-scale graph representation learning.

Author Information

Zhang Qi, Sun Yanfeng, Wang Shaofan, Gao Junbin, Yin Baocai

Affiliations

Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Beijing Institute of Artificial Intelligence, School of Information Science and Technology, Beijing University of Technology, Beijing 100124, China; School of Information Science and Engineering, Shandong Agricultural University, Taian 271018, China.

Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Beijing Institute of Artificial Intelligence, School of Information Science and Technology, Beijing University of Technology, Beijing 100124, China.

Publication Information

Neural Netw. 2025 Nov;191:107792. doi: 10.1016/j.neunet.2025.107792. Epub 2025 Jul 5.

DOI: 10.1016/j.neunet.2025.107792
PMID: 40639152
Abstract

Graph Neural Networks (GNNs) and Graph Transformers (GTs) have shown considerable success in graph-based tasks, each offering distinct strengths: GNNs excel at capturing local details, while GTs are adept at capturing global information. However, both GNNs and GTs face scalability issues when applied to large-scale graphs. To address these challenges, this paper proposes the Graph Transformer Based on Bipartite-stream Information Fusion (BiFormer), a framework designed to integrate the benefits of GTs and GNNs for processing large-scale graphs. BiFormer consists of three modules: a global feature extraction module, which utilizes a Transformer encoder to efficiently capture global information from a small-scale pooled graph; a local feature extraction module, which constructs three parameter-free graph convolution kernels to extract local features without training; and a feature fusion module, which employs a Transformer encoder to fuse the extracted local and global features of each node without node-to-node message passing. The complete training of BiFormer requires only the small-scale pooled graph and the mini-batched local features to be stored temporarily in memory, allowing mini-batch training with a flexible batch size. Experimental results demonstrate that BiFormer outperforms mainstream GTs and GNNs.
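The two-stream architecture described in the abstract lends itself to a compact sketch. The PyTorch snippet below is a minimal illustration of that idea, with hypothetical module and parameter names throughout, and a single repeated normalized-adjacency propagation standing in for the paper's three parameter-free convolution kernels; it is not the authors' implementation.

import torch
import torch.nn as nn

class BiFormerSketch(nn.Module):
    # Illustrative only: hypothetical names, simplified local kernels.
    def __init__(self, in_dim, hid_dim, num_classes, num_hops=3, n_heads=4):
        super().__init__()
        self.num_hops = num_hops
        self.proj = nn.Linear(in_dim, hid_dim)
        # Global stream: Transformer encoder over a small pooled graph
        # (the graph pooling step itself is not shown here).
        self.global_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(hid_dim, n_heads, batch_first=True), num_layers=2)
        # Fusion stream: Transformer encoder over each node's own token sequence
        # [global token, local tokens], so no node-to-node message passing.
        self.fusion_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(hid_dim, n_heads, batch_first=True), num_layers=2)
        self.classifier = nn.Linear(hid_dim, num_classes)

    @torch.no_grad()
    def precompute_local(self, adj_norm, x):
        # Local stream: parameter-free propagation, computed once offline.
        # One normalized-adjacency kernel stands in for the paper's three kernels.
        feats, h = [x], x
        for _ in range(self.num_hops):
            h = adj_norm @ h
            feats.append(h)
        return torch.stack(feats, dim=1)          # [N, num_hops + 1, in_dim]

    def forward(self, local_feats, pooled_x, cluster_id):
        # pooled_x:   features of the small pooled graph       [P, in_dim]
        # cluster_id: pooled-graph cluster index of each node  [N]
        g = self.global_encoder(self.proj(pooled_x).unsqueeze(0)).squeeze(0)  # [P, hid_dim]
        g_tok = g[cluster_id].unsqueeze(1)                                    # [N, 1, hid_dim]
        l_tok = self.proj(local_feats)                                        # [N, hops+1, hid_dim]
        fused = self.fusion_encoder(torch.cat([g_tok, l_tok], dim=1))
        return self.classifier(fused[:, 0])       # read the prediction off the global token

Because the local features are produced without trainable parameters, they can be precomputed and then loaded in mini-batches; under this reading, only the small pooled graph and the current mini-batch of local features need to reside in memory, which is what permits the flexible batch size mentioned in the abstract.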

