Research Article
Fractional Rectified Linear Unit Activation Function and Its Variants
Table 2. Performance comparison of the neural network model in Case 1 with various activation functions.
| Category | Function | Training | Training MSE | Testing | Testing MSE |
|---|---|---|---|---|---|
| Conventional | ReLU | 0.9648 | 0.0263 | 0.9682 | 0.0238 |
| | LReLU | 0.9611 | 0.0298 | 0.9621 | 0.0289 |
| | PReLU | 0.9723 | 0.0208 | 0.9744 | 0.0190 |
| | ELU | 0.9760 | 0.0183 | 0.9772 | 0.0179 |
| | SiLU | 0.9636 | 0.0312 | 0.9625 | 0.0291 |
| | GELU | 0.9603 | 0.0301 | 0.9620 | 0.0285 |
| Fractional | FReLU | 0.9901 | 0.0077 | 0.9929 | 0.0072 |
| | FLReLU | 0.9942 | 0.0044 | 0.9933 | 0.0046 |
| | FPReLU | 0.9933 | 0.0051 | 0.9931 | 0.0051 |
| | FELU | 0.9283 | 0.0581 | 0.9629 | 0.0301 |
| | FSiLU | 0.9931 | 0.0052 | 0.9942 | 0.0043 |
| | FGELU | 0.8902 | 0.1360 | 0.8564 | 0.1861 |
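For reference, the conventional baselines compared in Table 2 follow their standard definitions; a minimal NumPy sketch of these is given below. The leaky slope, PReLU initial slope, and ELU alpha shown here are illustrative defaults, not values reported in this study, and the fractional variants (FReLU, FLReLU, FPReLU, FELU, FSiLU, FGELU) follow the fractional-order definitions introduced in this article and are not reproduced in the sketch.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x)
    return np.maximum(0.0, x)

def lrelu(x, slope=0.01):
    # Leaky ReLU with a small fixed negative slope (0.01 is a common default)
    return np.where(x > 0, x, slope * x)

def prelu(x, a=0.25):
    # Parametric ReLU: the negative slope `a` is learned during training;
    # 0.25 is only an illustrative initial value
    return np.where(x > 0, x, a * x)

def elu(x, alpha=1.0):
    # ELU: exponential saturation for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def silu(x):
    # SiLU (swish): x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def gelu(x):
    # GELU, tanh approximation
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))
```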