Research Article

Assessment of Jiangsu Regional Logistics Space Nonequilibrium Situation by Boosting and Bagging Algorithms

Table 1

Differences between the Boosting algorithm and the Bagging algorithm (a minimal code sketch contrasting the two ensembles follows the table).

Sample selection
Boosting: The training set is unchanged in each round, but the weight of each sample in the training set changes; the weights are adjusted according to the classification results of the previous round.
Bagging: The training set is drawn from the original data set by sampling with replacement, and the training sets of different rounds are independent of each other.

Sample weights
Boosting: Sample weights are continuously adjusted according to the error rate: the higher the error rate on a sample, the larger its weight becomes.
Bagging: Uniform sampling is used, so all samples have equal weight.

Prediction function
Boosting: Each weak classifier has a corresponding weight; a classifier with a smaller classification error receives a larger weight.
Bagging: All predictors have equal weight.

Parallel computing
Boosting: The prediction functions can only be generated sequentially, because the parameters of each subsequent model depend on the results of the previous round.
Bagging: Individual predictors can be generated in parallel.
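A minimal sketch of the contrast summarized in Table 1, assuming scikit-learn's BaggingClassifier and AdaBoostClassifier as stand-in implementations of Bagging and Boosting; the synthetic dataset and hyperparameters are illustrative choices and are not taken from the article's Jiangsu logistics data or model.

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Illustrative synthetic data (not the article's logistics indicators).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: each base learner is fitted on an independent bootstrap sample
# drawn with replacement and uniform weights, so the learners can be
# trained in parallel (n_jobs=-1).
bagging = BaggingClassifier(n_estimators=50, n_jobs=-1, random_state=0)

# Boosting (AdaBoost): base learners are fitted sequentially; after each
# round the weights of misclassified samples are increased, and each
# learner's vote is weighted by its classification error.
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

for name, model in [("Bagging", bagging), ("Boosting", boosting)]:
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: test accuracy = {acc:.3f}")

The sequential dependence of Boosting is visible in the API itself: BaggingClassifier exposes an n_jobs parameter because its predictors are independent, whereas AdaBoostClassifier does not, since each round must wait for the previous round's results.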