Yi Siyuan, Chen Xing, Tang Chuanming
Chengdu University of Technology, Chengdu, 610059, Sichuan, China.
Key Laboratory of Optical Engineering, Institute of Optics and Electronics, Chinese Academy of Sciences, Chengdu, 610209, Sichuan, China.
Sci Rep. 2025 Aug 12;15(1):29565. doi: 10.1038/s41598-025-15286-0.
AI-based methods have been widely adopted in tourism demand forecasting. However, current AI-based methods are weak in capturing long-term dependencies, and most of them lack interpretability. This study proposes a time series Transformer (Tsformer) with an Encoder-Decoder architecture for tourism demand forecasting. The Tsformer encodes long-term dependencies with the encoder, merges the calendar information of data points in the forecast horizon, and captures short-term dependencies with the decoder. Experiments on two datasets demonstrate that the Tsformer outperforms nine baseline methods in both short-term and long-term forecasting, before and after the COVID-19 outbreak. Further ablation studies confirm that incorporating the calendar information of data points in the forecast horizon improves forecasting performance. Our study provides an alternative method for more accurate and interpretable tourism demand forecasting.
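The decoder conditions on calendar information for the points in the forecast horizon. The abstract does not specify which calendar attributes are used, so the feature set below (day of week, month, weekend flag) is a minimal illustrative sketch of how such features could be derived for future timestamps:

```python
from datetime import date, timedelta

def calendar_features(start: date, horizon: int) -> list:
    """Build simple calendar features for each day in the forecast horizon.

    The feature choice here is an assumption for illustration; the paper
    does not disclose its exact calendar encoding.
    """
    feats = []
    for h in range(horizon):
        d = start + timedelta(days=h)
        feats.append({
            "dow": d.weekday(),          # 0 = Monday, ..., 6 = Sunday
            "month": d.month,            # 1-12
            "is_weekend": d.weekday() >= 5,
        })
    return feats

# Example: a 7-day forecast horizon starting Monday, 2024-01-01.
# These dictionaries would be embedded and fed to the decoder alongside
# the encoder's representation of the historical series.
feats = calendar_features(date(2024, 1, 1), 7)
```

Because the forecast-horizon dates are known in advance, such features are always available at prediction time, which is what lets the decoder exploit them without leaking future demand values.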