Table of Contents

[1] Tang Ningkai, Lu Jixiang, Chen Tianyu, et al. Transformer-based correction scheme for short-term bus load prediction in holidays [J]. Journal of Southeast University (English Edition), 2024, 40 (3): 304-312. [doi:10.3969/j.issn.1003-7985.2024.03.010]

Transformer-based correction scheme for short-term bus load prediction in holidays

Journal of Southeast University (English Edition)[ISSN:1003-7985/CN:32-1325/N]

Volume:
40
Issue:
3
Page:
304-312
Research Field:
Automation
Publishing date:
2024-09-20

Info

Title:
Transformer-based correction scheme for short-term bus load prediction in holidays
Author(s):
Tang Ningkai1 2 Lu Jixiang1 2 Chen Tianyu2 Shu Jiao1 2 Chang Li2 Chen Tao3
1 State Key Laboratory of Technology and Equipment for Defense Against Power System Operational Risks, Nanjing 211106, China
2 NARI Group Corporation(State Grid Electric Power Research Institute), Nanjing 211106, China
3 School of Electrical Engineering, Southeast University, Nanjing 211189, China
Keywords:
short-term bus load prediction; Transformer network; holiday load; pre-training model; load clustering
PACS:
TP274.2
DOI:
10.3969/j.issn.1003-7985.2024.03.010
Abstract:
To tackle the problem of inaccurate short-term bus load prediction, especially during holidays, a Transformer-based scheme with tailored architectural enhancements is proposed. First, the input data are clustered to reduce complexity and better capture their inherent characteristics. Gated residual connections are then employed to selectively propagate salient features across layers, while an attention mechanism identifies prominent patterns in the multivariate time-series data. Finally, a pre-trained structure is incorporated to reduce computational complexity. Experimental results on extensive data show that the proposed scheme improves prediction accuracy over comparative algorithms by at least 32.00% consistently across all buses evaluated, and it fits holiday load curves particularly well. Meanwhile, the pre-trained structure reduces the training time of the proposed algorithm by more than 65.75%. The proposed scheme efficiently predicts bus loads while enhancing robustness for holiday predictions, making it better adapted to real-world prediction scenarios.
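
Illustration (not from the paper): the gated residual connections and attention mechanism described in the abstract follow the general pattern popularized by the Temporal Fusion Transformer [15]. The PyTorch sketch below shows one plausible form of such a gated residual block applied after self-attention over a multivariate load sequence; the layer sizes, ELU activation, and GLU-style gate are assumptions for illustration, not the authors' exact architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedResidualBlock(nn.Module):
    # Illustrative gated residual connection (assumption, not the paper's exact design).
    def __init__(self, d_model, dropout=0.1):
        super().__init__()
        self.fc1 = nn.Linear(d_model, d_model)
        self.fc2 = nn.Linear(d_model, d_model)
        self.gate = nn.Linear(d_model, 2 * d_model)   # GLU-style gate: value half and gate half
        self.dropout = nn.Dropout(dropout)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x):
        h = self.fc2(F.elu(self.fc1(x)))              # non-linear transform of the features
        h = self.dropout(h)
        value, gate = self.gate(h).chunk(2, dim=-1)   # split into signal and gate
        gated = value * torch.sigmoid(gate)           # gate decides how much of the signal to propagate
        return self.norm(x + gated)                   # residual connection plus layer normalization

# Toy usage: a batch of 32 multivariate load sequences, 96 time steps, 64 features,
# passed through self-attention and then the gated residual block.
x = torch.randn(32, 96, 64)
attn = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
attn_out, _ = attn(x, x, x)                           # attention over the time dimension
y = GatedResidualBlock(d_model=64)(attn_out)
print(y.shape)                                        # torch.Size([32, 96, 64])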

References:

[1] Facchinetti T, Della Vedova M L. Real-time modeling for direct load control in cyber-physical power systems[J]. IEEE Transactions on Industrial Informatics, 2011, 7(4): 689-698. DOI: 10.1109/TII.2011.2166787.
[2] Tang N K, Mao S W, Wang Y, et al. Solar power generation forecasting with a LASSO-based approach[J]. IEEE Internet of Things Journal, 2018, 5(2): 1090-1099. DOI: 10.1109/JIOT.2018.2812155.
[3] Zhang R, Liu P F, Wang Q. Estimation model of EPC based on long time series of nighttime light data[J]. Journal of Southeast University (Natural Science Edition), 2021, 51(6): 1094-1102. DOI: 10.3969/j.issn.1001-0505.2021.06.023. (in Chinese)
[4] Cao Y, Zheng L, Chen Y F, et al. Identification method and control strategy for superheated steam temperature of thermal power unit based on PFNN[J]. Journal of Southeast University (Natural Science Edition), 2022, 53(3): 417-424. DOI: 10.3969/j.issn.1001-0505.2022.03.001. (in Chinese)
[5] Luo J Z, Su C. Optimization of charging pricing strategy based on user behavior and time-of-use tariffs[J]. Journal of Southeast University (Natural Science Edition), 2021, 51(6): 1109-1116. DOI: 10.3969/j.issn.1001-0505.2021.06.025. (in Chinese)
[6] Lu R Y, Guo X C, Li J C, et al. Tourist travel behavior in rural areas considering bus route preferences[J]. Journal of Southeast University (English Edition), 2023, 39(1): 49-61. DOI: 10.3969/j.issn.1003-7985.2023.01.006.
[7] Bao Q, Tan X, Qu Q K, et al. Prediction of electric vehicle charging demand based on user space-time activities and fuzzy decision-making[J]. Journal of Southeast University (Natural Science Edition), 2022, 52(6): 1209-1218. DOI: 10.3969/j.issn.1001-0505.2022.06.022. (in Chinese)
[8] Rubasinghe O, Zhang X N, Chau T K, et al. A novel sequence to sequence data modelling based CNN-LSTM algorithm for three years ahead monthly peak load forecasting[J]. IEEE Transactions on Power Systems, 2024, 39(1): 1932-1947. DOI: 10.1109/TPWRS.2023.3271325.
[9] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[C]//31st Annual Conference on Neural Information Processing Systems (NIPS). Long Beach, CA, USA, 2017, 30: 6000-6010.
[10] He Z R, Shen Q F, Wu J X, et al. Transformer encoder-based multilevel representations with fusion feature input for speech emotion recognition[J]. Journal of Southeast University (English Edition), 2023, 39(1): 68-73. DOI: 10.3969/j.issn.1003-7985.2023.01.008.
[11] Wen Q S, Zhou T, Zhang C L, et al. Transformers in time series: A survey[EB/OL]. (2022-02-15)[2024-05-08]. http://arxiv.org/abs/2202.07125.
[12] Zhou H Y, Zhang S H, Peng J Q, et al. Informer: Beyond efficient transformer for long sequence time-series forecasting[EB/OL]. (2020-12-14)[2024-05-08]. http://arxiv.org/abs/2012.07436.
[13] Wu H X, Xu J H, Wang J M, et al. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting[EB/OL]. (2021-06-24)[2024-05-08]. http://arxiv.org/abs/2106.13008.
[14] Zhou T, Ma Z Q, Wen Q S, et al. FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting[EB/OL]. (2022-01-30)[2024-05-08]. https://arxiv.org/abs/2201.12740.
[15] Lim B, Arık S Ö, Loeff N, et al. Temporal Fusion Transformers for interpretable multi-horizon time series forecasting[J]. International Journal of Forecasting, 2021, 37(4): 1748-1764. DOI: 10.1016/j.ijforecast.2021.03.012.
[16] López Santos M, García-Santiago X, Echevarría Camarero F, et al. Application of Temporal Fusion Transformer for day-ahead PV power forecasting[J]. Energies, 2022, 15(14): 5232. DOI: 10.3390/en15145232.
[17] Sun S L, Liu Y K, Li Q, et al. Short-term multi-step wind power forecasting based on spatio-temporal correlations and transformer neural networks[J]. Energy Conversion and Management, 2023, 283: 116916. DOI: 10.1016/j.enconman.2023.116916.
[18] L’Heureux A, Grolinger K, Capretz M A M. Transformer-based model for electrical load forecasting[J]. Energies, 2022, 15(14): 4993. DOI: 10.3390/en15144993.
[19] Zhao Z Z, Xia C Q, Chi L, et al. Short-term load forecasting based on the transformer model[J]. Information, 2021, 12(12): 516. DOI: 10.3390/info12120516.
[20] Fu M Z, Qin M, Guo X J, et al. Magnetic field and coupling effect analysis of a novel dual-rotor dual-stator permanent magnet synchronous generator[J]. Journal of Southeast University (English Edition), 2024, 40(1): 89-96. DOI: 10.3969/j.issn.1003-7985.2024.01.010.
[21] Chen K J, Chen K L, Wang Q, et al. Short-term load forecasting with deep residual networks[J]. IEEE Transactions on Smart Grid, 2019, 10(4): 3943-3952. DOI: 10.1109/TSG.2018.2844307.
[22] Li Z H, Liu J M, Lin Y Z, et al. Grid-constrained data cleansing method for enhanced bus load forecasting[J]. IEEE Transactions on Instrumentation and Measurement, 2021, 70: 9002810. DOI: 10.1109/TIM.2021.3075538.
[23] Rafiei M, Niknam T, Aghaei J, et al. Probabilistic load forecasting using an improved wavelet neural network trained by generalized extreme learning machine[J]. IEEE Transactions on Smart Grid, 2018, 9(6): 6961-6971. DOI: 10.1109/TSG.2018.2807845.
[24] Akiba T, Sano S, Yanase T, et al. Optuna: A next-generation hyperparameter optimization framework[C]//Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. Anchorage, AK, USA, 2019: 2623-2631. DOI: 10.1145/3292500.3330701.

Memo

Memo:
Biographies: Tang Ningkai (1987—), male, Ph.D., tangningkai@sgepri.sgcc.com.cn; Chen Tao (corresponding author), male, Ph.D., associate professor, taoc@seu.edu.cn.
Foundation item: State Grid Science & Technology Program (No. 1400-202140341A-0-0-00).
Citation: Tang Ningkai, Lu Jixiang, Chen Tianyu, et al. Transformer-based correction scheme for short-term bus load prediction in holidays[J]. Journal of Southeast University (English Edition), 2024, 40(3): 304-312. DOI: 10.3969/j.issn.1003-7985.2024.03.010.
Last Update: 2024-09-20