Authors | Algorithm | Data set | Performance evaluation | Pros | Cons |
| --- | --- | --- | --- | --- | --- |
Somu et al. [14] | (i) ARIMA (ii) Genetic algorithm-LSTM (iii) Improved sine cosine optimization algorithm-LSTM (ISCOA-LSTM) | (i) The KReSIT power consumption data set, sourced from the Indian Institute of Technology (IIT) Bombay in Mumbai, India. | (i) ARIMA: MAE: 0.3479; MAPE: 21.3333; MSE: 0.1661; RMSE: 0.4076. (ii) Genetic algorithm-LSTM: MAE: 0.1804; MAPE: 5.9745; MSE: 0.0432; RMSE: 0.2073. (iii) ISCOA-LSTM: MAE: 0.0819; MAPE: 4.9688; MSE: 0.0135; RMSE: 0.1164. | (i) Improved forecasting accuracy (ii) Real-world applicability | (i) Sensitivity to initialization (ii) Slow convergence |
Suranata et al. [15] | (i) Long short-term memory (LSTM) | (i) NL | (i) ; ; | (i) Effectively predicts energy consumption patterns in time-series data. | (i) Time-consuming training |
Ferdoush et al. [21] | (i) LSTM (ii) Bidirectional LSTM (Bi-LSTM) (iii) RF-Bi-LSTM hybrid model | (i) 36 months of data from the Bangladesh Power Development Board. | (i) LSTM: ; ; ; . (ii) Bi-LSTM: ; ; ; . (iii) RF-Bi-LSTM: ; ; . | (i) Stable learning characteristics. (ii) Moderate generalization gap in learning-loss analysis. | (i) The hybrid model may require specific data to exploit the strengths of random forest and bidirectional LSTM effectively. |
Yaqing et al. [22] | (i) EMD-BO-LSTM (ii) iCEEMDAN-BO-LSTM | (i) Real power consumption data from a university campus over 12 months | (i) EMD-BO-LSTM: ; ; ; . (ii) iCEEMDAN-BO-LSTM: ; ; ; . | (i) Adaptability and efficiency (ii) Enhanced prediction accuracy | (i) NL |
Ndife et al. [26] | (i) ConvLSTM encoder-decoder | (i) Two million measurements gathered over 47 months from a residential location in Sceaux, France. | (i) RMSE of the proposed model: 358 kWh; persistence model: 465 kWh; model A: 530 kWh; model B: 450.5 kWh | (i) Improved forecast accuracy (ii) Suitable for low-powered devices (iii) Efficient training and prediction time | (i) Model complexity |
Duong et al. [27] | (i) Multilayer perceptron | (i) 215 data points on the power consumption and on/off status of electrical devices in Vietnam. | (i) RMSE: 10.468 (ii) MAPE: 21.563 | (i) Handles large amounts of input data well; makes quick predictions after training. | (i) Slow training |
Faiq et al. [17] | (i) LSTM (ii) LSTM-RNN (iii) CNN-LSTM | (i) Daily data from 2018 to 2021 from the Malaysian Meteorological Department. | (i) LSTM: ; . (ii) LSTM-RNN: ; . (iii) CNN-LSTM: ; . | (i) Accurate prediction of building energy consumption (ii) Improved energy efficiency | (i) Requires a significant amount of historical data to build an accurate model |
Bhol et al. [29] | (i) ARIMA (ii) Holt-Winters flower pollination algorithm (HW-GFPA) | (i) Laboratory-operated critical loads over three months. | (i) HW-GFPA: for validation, 0.43 for test (ii) ARIMA: for validation, 0.016 for test | (i) Scalability (ii) Optimal hyperparameter identification | (i) Sensitivity to kernel selection |
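The studies above are compared on four standard forecasting error metrics: MAE, MAPE, MSE, and RMSE. As a reference for how these scores are obtained, the following is a minimal sketch of their textbook definitions; the function name and the sample series are illustrative, not taken from any of the cited works.

```python
import math

def forecast_metrics(actual, predicted):
    """Compute MAE, MAPE (%), MSE, and RMSE for a pair of series.

    Assumes the two series have equal length and that all actual
    values are non-zero (required for MAPE).
    """
    n = len(actual)
    errors = [a - p for a, p in zip(actual, predicted)]
    mae = sum(abs(e) for e in errors) / n                          # mean absolute error
    mape = 100.0 * sum(abs(e / a) for e, a in zip(errors, actual)) / n  # mean absolute percentage error
    mse = sum(e * e for e in errors) / n                           # mean squared error
    rmse = math.sqrt(mse)                                          # root mean squared error
    return {"MAE": mae, "MAPE": mape, "MSE": mse, "RMSE": rmse}

# Illustrative usage with made-up consumption values (kWh):
metrics = forecast_metrics([100.0, 110.0], [90.0, 115.0])
# MAE = 7.5, MSE = 62.5
```

Lower values indicate better fit on all four metrics, which is how, for example, ISCOA-LSTM in [14] is judged to outperform ARIMA and the genetic algorithm-LSTM.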