Research Article
A Novel Data-Driven Method for Medium-Term Power Consumption Forecasting Based on Transformer-LightGBM
Table 1
The Transformer-LightGBM parameter settings.
| Model | Parameter | Value |
| --- | --- | --- |
| Transformer | num_layers | 2 |
| | d_model | 256 |
| | num_heads | 2 |
| | dff | 256 |
| | input_vocab_size | 8500 |
| | maximum_position_encoding | 6 |
| | optimizer | Adam |
| | learning_rate | 0.0001 |
| | loss | mae |
| | metrics | mse |
| LightGBM (GridSearchCV initial estimator) | objective | Regression |
| | boosting_type | gbdt |
| | n_estimators | 81 |
| | metric | rmse |
| | learning_rate | 0.3 |
| | num_leaves | 50 |
| | max_depth | 17 |
| | subsample | 0.8 |
| | colsample_bytree | 0.8 |
| LightGBM (GridSearchCV search space) | max_depth | range(10, 30, 5) |
| | num_leaves | range(50, 170, 30) |
| | learning_rate | [0.3, 0.25, 0.2, 0.15, 0.1, 0.05, 0.01] |
| | feature_fraction | [0.5, 0.6, 0.7, 0.8, 0.9] |
| | bagging_fraction | [0.6, 0.7, 0.8, 0.9, 1.0] |
| | subsample | 1 |
| | min_samples_split | 2 |
| | min_samples_leaf | 1 |
| LightGBM (tuned values) | num_leaves | 110 |
| | max_depth | 10 |
| | learning_rate | 0.1 |
| | feature_fraction | 0.8 |
| | bagging_fraction | 0.8 |
| | bagging_freq | 10 |
| | num_boost_round | 531 |
| | early_stopping_rounds | 200 |
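The GridSearchCV search space in the table can be sketched as follows. This is a minimal illustration, not the authors' code: it only builds the parameter grid from the ranges listed in Table 1 and counts the candidate configurations the exhaustive search would enumerate; the commented-out fitting step assumes scikit-learn's `GridSearchCV` wrapped around an `LGBMRegressor`, which the table implies but does not show.

```python
# Grid-search space taken directly from Table 1 (parameter names as listed).
param_grid = {
    "max_depth": list(range(10, 30, 5)),        # [10, 15, 20, 25]
    "num_leaves": list(range(50, 170, 30)),     # [50, 80, 110, 140]
    "learning_rate": [0.3, 0.25, 0.2, 0.15, 0.1, 0.05, 0.01],
    "feature_fraction": [0.5, 0.6, 0.7, 0.8, 0.9],
    "bagging_fraction": [0.6, 0.7, 0.8, 0.9, 1.0],
}

# Exhaustive search enumerates the Cartesian product of all value lists.
n_candidates = 1
for values in param_grid.values():
    n_candidates *= len(values)
print(n_candidates)  # 4 * 4 * 7 * 5 * 5 = 2800 candidate configurations

# Hypothetical usage (requires lightgbm and scikit-learn; X_train/y_train
# are assumed names for the training data, not defined in the paper chunk):
# from lightgbm import LGBMRegressor
# from sklearn.model_selection import GridSearchCV
# search = GridSearchCV(
#     LGBMRegressor(objective="regression", boosting_type="gbdt",
#                   n_estimators=81, learning_rate=0.3,
#                   num_leaves=50, max_depth=17,
#                   subsample=0.8, colsample_bytree=0.8),
#     param_grid,
#     scoring="neg_root_mean_squared_error",
# )
# search.fit(X_train, y_train)
```

Note that the tuned values in the table (e.g. num_leaves = 110, learning_rate = 0.1) all lie inside this search space, which is consistent with their having been selected by the grid search.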