Chen Liwen, Liu Zhibin, Yu Qingquan, Jiang Xing, Zhang Huanghui, Lin Xin
Institute of Ubiquitous Perception and Multi-sensor Integration Research, Fujian University of Technology, Fuzhou, Fujian Province, 350118, China.
Fuzhou Chengtou New Infrastructure Group Co., Ltd, Fuzhou, Fujian Province, 350011, China.
Heliyon. 2024 Aug 16;10(16):e35840. doi: 10.1016/j.heliyon.2024.e35840. eCollection 2024 Aug 30.
To address metering inaccuracies in charging stations, an issue that directly affects the development of electric vehicles, a method for predicting the relative metering error of charging stations based on the ConvFormer model is proposed. The model combines a Convolutional Neural Network (CNN) with a Transformer in parallel, significantly improving prediction accuracy. First, the charging-station data are preprocessed using forward interpolation and normalization, and the dataset is transformed into one of relative errors suitable as model input. Then, a time-series forecasting network combining improved unidirectional convolution with attention is constructed, and the common regression performance metrics MAE (Mean Absolute Error) and MSE (Mean Squared Error) are selected for evaluation. Finally, using seven days of charging-station data, the relative error of the charging stations over the next 24 h is predicted and compared against traditional Transformer and LSTM (Long Short-Term Memory) time-series models. The results show that the improved model yields the lowest values for both MAE and MSE: MAE is reduced by 47.30 % relative to the Transformer and by 38.06 % relative to the LSTM, while MSE is reduced by 66.94 % relative to the Transformer and by approximately 62.32 % relative to the LSTM.
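The preprocessing and evaluation steps summarized in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the forward-interpolation step is taken to mean forward filling of missing readings, min-max scaling is assumed for the normalization step (the paper does not specify the variant), and MAE/MSE follow their standard definitions.

```python
def forward_fill(series):
    """Forward interpolation: replace missing readings (None) with the
    most recent valid value."""
    filled, last = [], None
    for x in series:
        if x is None:
            x = last  # carry the previous observation forward
        filled.append(x)
        last = x
    return filled

def min_max_normalize(series):
    """Scale values into [0, 1] (assumed normalization scheme)."""
    lo, hi = min(series), max(series)
    return [(x - lo) / (hi - lo) for x in series]

def mae(y_true, y_pred):
    """Mean Absolute Error: mean of |y - y_hat|."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    """Mean Squared Error: mean of (y - y_hat)^2."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical relative-error readings with one missing sample
raw = [0.12, None, 0.15, 0.11]
clean = forward_fill(raw)          # -> [0.12, 0.12, 0.15, 0.11]
scaled = min_max_normalize(clean)  # values scaled into [0, 1]
```

The reported comparison then amounts to computing `mae` and `mse` between the predicted and observed relative-error series for each model (ConvFormer, Transformer, LSTM) over the 24-h horizon and taking the relative reductions.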