A comparative study of long short-term memory based long-term electrical load forecasting techniques with hyperparameter optimization
Abstract
Long-term load forecasting (LTLF) is crucial for reliable electricity supply, infrastructure planning, and informed energy policies, ensuring grid stability and efficient resource allocation. Traditional methods, such as statistical models and expert judgment, rely on historical data but may struggle with dynamic changes in technology, regulations, and consumer behavior. Addressing challenges such as economic uncertainty, seasonal variation, data quality, and the integration of renewable energy requires advanced forecasting models and adaptive strategies. This research aims to develop an efficient LTLF model for the Coimbatore region in Tamil Nadu, India, using long short-term memory (LSTM) networks. Because LSTM networks can struggle to capture long-term dependencies and demand high data quality and careful configuration, hyperparameter optimization, in particular through the opposition-based hunter-prey optimization (OHPO) technique, is explored to enhance their predictive performance. The results show that the proposed OHPO-configured LSTM model for LTLF outperforms the other techniques considered, achieving a mean square error (MSE) of 0.25, a root mean square error (RMSE) of 0.5, and a mean absolute percentage error (MAPE) of 0.27. This research underscores the significance of improving LTLF precision for informed decision-making in infrastructure planning and energy policy formulation.
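As a concrete illustration of the workflow the abstract describes, the sketch below pairs a single-layer Keras LSTM forecaster with the opposition-based step of a population-based hyperparameter search: each random candidate x drawn in [lb, ub] is mirrored to lb + ub − x, and the better-scoring member of each pair is kept. This is only a minimal sketch, not the authors' implementation; the full hunter-prey optimization update rules are omitted, and the synthetic load series, the tuned hyperparameters (LSTM units and learning rate), their bounds, and the training budget are illustrative assumptions.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.optimizers import Adam

# Synthetic stand-in for the regional load series (assumption for illustration).
rng = np.random.default_rng(0)
t = np.arange(2000)
load = 0.5 + 0.4 * np.sin(2 * np.pi * t / 365) + 0.05 * rng.normal(size=t.size)

LOOKBACK = 30  # number of past samples fed to the LSTM (assumed value)

def make_windows(series, lookback):
    """Build sliding-window inputs X and one-step-ahead targets y."""
    X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
    return X[..., None], series[lookback:]

X, y = make_windows(load, LOOKBACK)
split = int(0.8 * len(X))
X_tr, y_tr, X_va, y_va = X[:split], y[:split], X[split:], y[split:]

def build_lstm(units, lr):
    """Single-layer LSTM regressor whose size and learning rate are tunable."""
    model = Sequential([Input(shape=(LOOKBACK, 1)), LSTM(int(units)), Dense(1)])
    model.compile(optimizer=Adam(learning_rate=lr), loss="mse")
    return model

def fitness(candidate):
    """Validation MSE for one hyperparameter candidate (units, learning rate)."""
    model = build_lstm(candidate[0], candidate[1])
    model.fit(X_tr, y_tr, epochs=3, batch_size=64, verbose=0)
    pred = model.predict(X_va, verbose=0).ravel()
    return float(np.mean((y_va - pred) ** 2))

# Opposition-based learning step: score each random candidate and its opposite
# lb + ub - x, then keep whichever of the pair has the lower validation error.
lb, ub = np.array([16.0, 1e-4]), np.array([128.0, 1e-2])  # assumed bounds
pop = lb + rng.random((4, 2)) * (ub - lb)
opp = lb + ub - pop
pop_scores = np.array([fitness(c) for c in pop])
opp_scores = np.array([fitness(c) for c in opp])
survivors = np.where((opp_scores < pop_scores)[:, None], opp, pop)
best_scores = np.minimum(pop_scores, opp_scores)

i = int(np.argmin(best_scores))
print(f"best (units, lr) = {survivors[i]}, "
      f"MSE = {best_scores[i]:.4f}, RMSE = {np.sqrt(best_scores[i]):.4f}")
```

In the full OHPO procedure described in the paper, the surviving candidates would then be refined by the hunter-prey position-update rules over several iterations; the snippet above only demonstrates the opposition-based initialization and the LSTM fitness evaluation.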
Keywords
Long short-term memory; Hyperparameter tuning; Opposition-based learning; Hunter-prey optimization; Long-term electrical load forecasting; Recurrent neural network
DOI: http://doi.org/10.11591/ijece.v14i6.pp7080-7089
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).