| Classification model | Parameter tuning |
| --- | --- |
| AdaBoost | n_estimators = 30, learning_rate = 1, algorithm = SAMME.R, base_estimator = DecisionTreeClassifier |
| Bagging | base_estimator = DecisionTreeClassifier, n_estimators = 100, bootstrap = True |
| Random forest | n_estimators = 100, max_depth = 16, bootstrap = True, min_samples_leaf = 0.1, min_samples_split = 2, max_leaf_nodes = 10 |
| KNN | n_neighbors = 10, weights = "uniform", algorithm = "auto", leaf_size = 30 |
| Logistic regression | stopping criterion (tol) = 1e-9, bias (intercept) = True, maximum iterations for convergence = 100 |
| Naïve Bayes | priors = default, smoothing = 1e-9, class count = 2 |
| Support vector machine | regularization (C) = 1, kernel = rbf, kernel coefficient = gamma |
| Vote | voting = "hard", estimators = logistic regression, decision tree |
| Artificial neural network | activation = "relu" for input/hidden layers, activation = "sigmoid" for output layer, dropout = 0.1, batch_size = 10, num_hidden_layers = 1, loss = "binary_crossentropy", optimizer = "Adam", epochs = 150 |
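The sketch below shows one way the tuned models in the table could be instantiated with scikit-learn and Keras. It is illustrative only: the exact library versions, the hidden-layer width of the ANN, and the mapping of table names such as "min_leaf_nodes" to scikit-learn's `max_leaf_nodes` are assumptions, not details taken from the source.

```python
# Minimal sketch: instantiating the classifiers with the tuned parameters
# from the table. Parameter names follow the current scikit-learn API.
from sklearn.ensemble import (
    AdaBoostClassifier,
    BaggingClassifier,
    RandomForestClassifier,
    VotingClassifier,
)
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

models = {
    "AdaBoost": AdaBoostClassifier(
        estimator=DecisionTreeClassifier(),  # named "base_estimator" in older scikit-learn
        n_estimators=30,
        learning_rate=1.0,
        algorithm="SAMME.R",
    ),
    "Bagging": BaggingClassifier(
        estimator=DecisionTreeClassifier(),
        n_estimators=100,
        bootstrap=True,
    ),
    "Random forest": RandomForestClassifier(
        n_estimators=100,
        max_depth=16,
        bootstrap=True,
        min_samples_leaf=0.1,
        min_samples_split=2,
        max_leaf_nodes=10,  # assumed equivalent of the table's "min_leaf_nodes = 10"
    ),
    "KNN": KNeighborsClassifier(
        n_neighbors=10, weights="uniform", algorithm="auto", leaf_size=30
    ),
    "Logistic regression": LogisticRegression(
        tol=1e-9, fit_intercept=True, max_iter=100
    ),
    "Naive Bayes": GaussianNB(var_smoothing=1e-9),  # priors left at default
    "SVM": SVC(C=1.0, kernel="rbf", gamma="scale"),
}
# Hard-voting ensemble over logistic regression and a decision tree.
models["Vote"] = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(tol=1e-9, max_iter=100)),
        ("dt", DecisionTreeClassifier()),
    ],
    voting="hard",
)

# The ANN row maps to a small Keras model; the hidden-layer width (32)
# and input dimension are assumptions.
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.models import Sequential

def build_ann(n_features: int) -> Sequential:
    model = Sequential([
        Dense(32, activation="relu", input_shape=(n_features,)),  # one hidden layer
        Dropout(0.1),
        Dense(1, activation="sigmoid"),  # binary output
    ])
    model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
    return model

# Training would then follow the table's settings, e.g.:
# build_ann(X_train.shape[1]).fit(X_train, y_train, epochs=150, batch_size=10)
```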