

TGEL-Transformer: Fusing educational theories with deep learning for interpretable student performance prediction.

Authors

Gong Yuhao, Wang Fei, Zhang Yuchen, Geng Jiaqi

Affiliations

Nanchang Hangkong University, Nanchang, Jiangxi, China.

Nanchang Institute of Science and Technology, Nanchang, Jiangxi, China.

Publication

PLoS One. 2025 Jun 30;20(6):e0327481. doi: 10.1371/journal.pone.0327481. eCollection 2025.

Abstract

With the integration of educational technology and artificial intelligence, personalized learning has become increasingly important. However, traditional educational data mining methods struggle to effectively integrate heterogeneous feature data and represent complex learning interaction processes, while existing deep learning models lack educational theory guidance, resulting in insufficient interpretability. To address these challenges, this study proposes the TGEL-Transformer (Theory-Guided Educational Learning Transformer) framework, which integrates multiple intelligence theory and social cognitive theory, featuring three innovations: a dual-channel feature processing module that integrates cognitive, affective, and environmental dimension features; a theory-guided four-head attention mechanism that models educational interaction dynamics; and an interpretable prediction layer that provides theoretical support for educational interventions. Using a dataset of 6,608 students, TGEL-Transformer achieved RMSE = 1.87 and R² = 0.75, outperforming existing methods with statistically significant improvements (p < 0.001) ranging from 1.1% against recent state-of-the-art models to 5.6% against transformer baselines. External validation on cross-cultural data (n = 480) demonstrated strong generalizability with R² = 0.683. Attention weight analysis revealed that teacher support (0.15), prior knowledge (0.15), and peer interaction (0.13) are key factors influencing learning outcomes. This study provides a theory-guided framework for educational data mining, offering data-driven support for personalized education and advancing intelligent education development.
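The four-head attention mechanism named in the abstract can be sketched in plain NumPy. This is an illustrative stand-in, not the authors' TGEL-Transformer implementation: the random projection weights, the token count of 8 (standing in for cognitive, affective, and environmental feature groups), and the model dimension of 16 are all assumptions made for the example. It shows the general shape of the computation, including the per-head attention matrices whose weights the paper analyzes for interpretability.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, num_heads=4, seed=0):
    """Scaled dot-product attention with `num_heads` heads.

    X: (num_tokens, d_model) matrix of feature-group embeddings.
    The projection weights here are random placeholders; in a trained
    model they would be learned parameters.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    assert d % num_heads == 0
    dh = d // num_heads  # per-head dimension
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    out = np.zeros_like(X)
    attn = []
    for h in range(num_heads):
        s = slice(h * dh, (h + 1) * dh)
        scores = Q[:, s] @ K[:, s].T / np.sqrt(dh)  # (n, n) similarity
        A = softmax(scores, axis=-1)                # rows sum to 1
        attn.append(A)
        out[:, s] = A @ V[:, s]
    # attn holds one (n, n) weight matrix per head; inspecting these
    # per-feature weights is the kind of analysis the abstract describes.
    return out, np.stack(attn)

# Example: 8 feature tokens, d_model = 16, 4 heads.
X = np.random.default_rng(1).standard_normal((8, 16))
out, attn = multi_head_attention(X)
print(out.shape, attn.shape)  # (8, 16) (4, 8, 8)
```

Each row of each head's attention matrix is a probability distribution over the input feature tokens, which is what makes attention weights readable as relative factor importances.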

