| Variable | Hyperparameter (HP) | Range |
| --- | --- | --- |
| y1 | Number of kernels in the first convolution layer | [1, 300] |
| y2 | First convolution layer kernel size | {3×3, 5×5, 7×7, 9×9} |
| y3 | First convolution layer activation function | {Sigmoid, ReLU, Tanh} |
| y4 | First pooling layer type | {Max-pooling, min-pooling, average-pooling} |
| y5 | Number of kernels in the second convolution layer | [1, 300] |
| y6 | Second convolution layer kernel size | {3×3, 5×5, 7×7, 9×9} |
| y7 | Second convolution layer activation function | {ReLU, Sigmoid, Tanh} |
| y8 | Second pooling layer type | {Max-pooling, min-pooling, average-pooling} |
| y9 | Number of neurons in the first FC layer | [10, 800] |
| y10 | First FC layer activation function | {ReLU, Sigmoid, Tanh} |
| y11 | First FC layer dropout rate | [0.1, 0.9] |
| y12 | Number of neurons in the second FC layer | [10, 800] |
| y13 | Second FC layer activation function | {Sigmoid, ReLU, Tanh} |
| y14 | Second FC layer dropout rate | [0.1, 0.8] |
| y15 | Learning rate | [0.01, 0.001, 1] |
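For reference, the search space in the table above can also be written down programmatically. The sketch below is a minimal, hypothetical Python encoding of these ranges together with a simple random-sampling helper; the dictionary keys, the `sample_configuration` function, and the uniform sampling scheme are illustrative assumptions, not part of the original method.

```python
import random

# Hypothetical encoding of the hyperparameter search space (y1-y15) from the
# table above. Names and structure are illustrative, not the authors' code.
SEARCH_SPACE = {
    "y1_conv1_kernels":     ("int",    1, 300),
    "y2_conv1_kernel_size": ("choice", [3, 5, 7, 9]),                 # 3x3 .. 9x9
    "y3_conv1_activation":  ("choice", ["sigmoid", "relu", "tanh"]),
    "y4_pool1_type":        ("choice", ["max", "min", "avg"]),
    "y5_conv2_kernels":     ("int",    1, 300),
    "y6_conv2_kernel_size": ("choice", [3, 5, 7, 9]),
    "y7_conv2_activation":  ("choice", ["relu", "sigmoid", "tanh"]),
    "y8_pool2_type":        ("choice", ["max", "min", "avg"]),
    "y9_fc1_neurons":       ("int",    10, 800),
    "y10_fc1_activation":   ("choice", ["relu", "sigmoid", "tanh"]),
    "y11_fc1_dropout":      ("float",  0.1, 0.9),
    "y12_fc2_neurons":      ("int",    10, 800),
    "y13_fc2_activation":   ("choice", ["sigmoid", "relu", "tanh"]),
    "y14_fc2_dropout":      ("float",  0.1, 0.8),
    "y15_learning_rate":    ("choice", [0.01, 0.001, 1]),             # as listed in the table
}

def sample_configuration(space=SEARCH_SPACE, seed=None):
    """Draw one random hyperparameter configuration from the search space."""
    rng = random.Random(seed)
    config = {}
    for name, spec in space.items():
        kind = spec[0]
        if kind == "int":
            config[name] = rng.randint(spec[1], spec[2])   # inclusive integer range
        elif kind == "float":
            config[name] = rng.uniform(spec[1], spec[2])   # continuous range
        else:  # "choice": categorical set
            config[name] = rng.choice(spec[1])
    return config

if __name__ == "__main__":
    print(sample_configuration(seed=0))
```

Such a dictionary-based definition keeps the integer, continuous, and categorical variables in one place, so any search strategy (random search, Bayesian optimization, or a metaheuristic) can draw candidate CNN configurations from the same specification.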
|
|