Abstract
Electric vehicle (EV) charging behavior exhibits strong spatio-temporal randomness, often producing transient peak loads and an elevated risk of distribution network overload. Moreover, existing prediction models struggle to jointly deliver high accuracy, computational efficiency, and effective modeling of multi-level periodic patterns. To address these issues, this study proposes a novel architecture termed the Convolutional Sparse Periodic Transformer Network (CSPT-Net). At the front end, the model employs a one-dimensional convolutional neural network (1D-CNN) to efficiently capture local temporal features. To improve computational efficiency, the traditional global attention mechanism is replaced with a sparse attention module. Furthermore, a customized periodic time-encoding module is designed to explicitly represent multi-scale temporal regularities such as daily, weekly, and holiday cycles. Extensive experiments on a large-scale dataset of more than 70,000 real-world charging records show that CSPT-Net achieves state-of-the-art performance across all evaluation metrics. Specifically, CSPT-Net reduces the Mean Absolute Error (MAE) to 12.21 min and improves training efficiency by over 58% compared with the standard Transformer baseline. These results confirm that CSPT-Net effectively balances predictive accuracy and computational efficiency while demonstrating superior robustness and generalization in complex real-world environments. Consequently, the proposed framework offers a reliable, high-performance data-driven foundation for power grid load management and charging infrastructure planning.
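The abstract specifies the architecture only at a high level: a 1D-CNN front end for local temporal features, a sparse attention module in place of full global attention, and a periodic time-encoding module for daily, weekly, and holiday cycles. As a rough, non-authoritative sketch of how these three components might be composed, the following PyTorch code assumes a windowed sparsity pattern, sinusoidal day/week features with a holiday flag, and arbitrary dimensions; none of these details, including the class names, are taken from the paper itself.

```python
# Illustrative sketch of the CSPT-Net pipeline described in the abstract:
# 1D-CNN front end -> sparse attention -> periodic time encoding.
# Module names, the local-window sparsity pattern, and all sizes are assumptions.
import torch
import torch.nn as nn


class PeriodicTimeEncoding(nn.Module):
    """Sinusoidal daily/weekly features plus a holiday flag (assumed form)."""
    def __init__(self, d_model: int):
        super().__init__()
        self.proj = nn.Linear(5, d_model)  # sin/cos day, sin/cos week, holiday

    def forward(self, hour_of_day, day_of_week, is_holiday):
        # All inputs: (batch, seq_len); hour_of_day and day_of_week are floats.
        two_pi = 2 * torch.pi
        feats = torch.stack([
            torch.sin(two_pi * hour_of_day / 24),
            torch.cos(two_pi * hour_of_day / 24),
            torch.sin(two_pi * day_of_week / 7),
            torch.cos(two_pi * day_of_week / 7),
            is_holiday.float(),
        ], dim=-1)
        return self.proj(feats)  # (batch, seq_len, d_model)


class SparseSelfAttention(nn.Module):
    """Windowed (local) attention, one common sparse-attention variant."""
    def __init__(self, d_model: int, n_heads: int, window: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.window = window

    def forward(self, x):
        seq_len = x.size(1)
        idx = torch.arange(seq_len, device=x.device)
        # Disallow attention between positions farther apart than the window,
        # reducing the effective cost from O(L^2) toward O(L * window).
        mask = (idx[None, :] - idx[:, None]).abs() > self.window
        out, _ = self.attn(x, x, x, attn_mask=mask)
        return out


class CSPTNetSketch(nn.Module):
    def __init__(self, in_ch=1, d_model=64, n_heads=4, window=12):
        super().__init__()
        self.local_conv = nn.Conv1d(in_ch, d_model, kernel_size=3, padding=1)
        self.time_enc = PeriodicTimeEncoding(d_model)
        self.sparse_attn = SparseSelfAttention(d_model, n_heads, window)
        self.head = nn.Linear(d_model, 1)  # e.g., predicted charging duration

    def forward(self, x, hour_of_day, day_of_week, is_holiday):
        # x: (batch, in_ch, seq_len) raw charging-load series.
        h = self.local_conv(x).transpose(1, 2)   # local temporal features
        h = h + self.time_enc(hour_of_day, day_of_week, is_holiday)
        h = self.sparse_attn(h)                  # efficient long-range context
        return self.head(h[:, -1])               # forecast from the last step
```

For example, an input batch of shape (8, 1, 96) covering 96 time steps, together with matching float hour-of-day and day-of-week tensors and a boolean holiday tensor of shape (8, 96), yields one scalar prediction per sample.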
Publication Info
- Year: 2025
- Type: article
- Volume: 15
- Issue: 24
- Pages: 12982-12982
Identifiers
- DOI: 10.3390/app152412982