Ruan Franklin Y, Zhang Aiwei, Oh Jenny Y, Jin SouYoung, Jacobson Nicholas C
Center for Technology and Behavioral Health, Geisel School of Medicine, Dartmouth College, Lebanon, NH, United States.
Department of Computer Science, Dartmouth College, Hanover, NH, United States.
ArXiv. 2025 Jan 14:arXiv:2411.15240v3.
Pretrained foundation models and transformer architectures have driven the success of large language models (LLMs) and other modern AI breakthroughs. However, similar advancements in health data modeling remain limited due to the need for innovative adaptations. Wearable movement data offers a valuable avenue for exploration: it is a core feature of nearly all commercial smartwatches, it is well established in clinical and mental health research, and its sequential nature resembles that of language. We introduce the Pretrained Actigraphy Transformer (PAT), the first open-source foundation model designed for time-series wearable movement data. Leveraging transformer-based architectures and techniques such as patch embeddings, and pretraining on data from 29,307 participants in a national U.S. sample, PAT achieves state-of-the-art performance on several mental health prediction tasks. PAT is also lightweight and easily interpretable, making it a robust tool for mental health research. GitHub: https://github.com/njacobsonlab/Pretrained-Actigraphy-Transformer/.
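To illustrate the patch-embedding idea the abstract mentions, the sketch below splits a long minute-level actigraphy sequence into fixed-size patches and linearly projects each patch to an embedding vector, as vision and time-series transformers commonly do. The patch size, embedding dimension, and random weights are illustrative assumptions, not the values or parameters used by PAT.

```python
import numpy as np

rng = np.random.default_rng(0)

minutes = 10080      # one week of minute-level activity counts
patch_size = 18      # minutes per patch (illustrative choice)
embed_dim = 96       # embedding dimension (illustrative choice)

# Simulated actigraphy signal; a real pipeline would load device data.
signal = rng.random(minutes)

# Split the sequence into non-overlapping patches: (560, 18).
patches = signal.reshape(-1, patch_size)

# Linear projection of each patch to an embedding; in a trained
# model W is learned, here it is random for demonstration.
W = rng.standard_normal((patch_size, embed_dim))
embeddings = patches @ W                      # shape (560, 96)

print(patches.shape, embeddings.shape)
```

Patching shortens the sequence the transformer must attend over (here from 10,080 steps to 560 tokens), which is the main practical benefit for long wearable time series.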