Research Article
Fractional Rectified Linear Unit Activation Function and Its Variants
Table 3: Performance comparison of the neural network model in Case 2 with various activation functions.
| Category | Function | Training R² | Training MSE | Testing R² | Testing MSE |
| --- | --- | --- | --- | --- | --- |
| Conventional | ReLU | 0.7666 | 0.3566 | 0.7938 | 0.3238 |
| | LReLU | 0.9044 | 0.1389 | 0.9077 | 0.1359 |
| | PReLU | 0.8327 | 1.1748 | 0.8145 | 0.3338 |
| | ELU | 0.7452 | 0.7686 | 0.6404 | 0.7915 |
| | SiLU | 0.8534 | 3.1758 | 0.8519 | 3.2667 |
| | GELU | 0.8110 | 3.4592 | 0.8089 | 3.6084 |
| Fractional | FReLU | 0.7798 | 0.3589 | 0.8145 | 0.3380 |
| | FLReLU | 0.9246 | 5.0276 | 0.9117 | 4.6879 |
| | FPReLU | 0.8746 | 1.1998 | 0.8821 | 1.0987 |
| | FELU | 0.8509 | 2.5395 | 0.9311 | 2.4631 |
| | FSiLU | 0.7824 | 0.6998 | 0.7714 | 0.7121 |
| | FGELU | 0.7243 | 7.5412 | 0.7417 | 7.2491 |
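For reference, the conventional activations in the table have standard definitions, and the two reported metrics are the usual regression measures. The sketch below is a minimal NumPy rendering of those definitions and metrics, not the paper's implementation: the network architecture, dataset, and hyperparameters used in Case 2 are not shown in this excerpt, and the fractional variants (FReLU through FGELU), which the paper derives via fractional calculus, are omitted here for the same reason.

```python
import numpy as np

# Standard definitions of the conventional activations compared in Table 3.
# The fractional variants are defined in the paper and not reproduced here.

def relu(x):
    return np.maximum(0.0, x)

def lrelu(x, alpha=0.01):
    # Leaky ReLU; alpha = 0.01 is a common default, assumed here.
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    # Parametric ReLU; alpha is a learned parameter.
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: alpha * (exp(x) - 1) on the negative side.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def silu(x):
    # SiLU / Swish: x * sigmoid(x).
    return x / (1.0 + np.exp(-x))

def gelu(x):
    # GELU, tanh approximation of x * Phi(x).
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

# Metrics reported in the table.

def mse(y_true, y_pred):
    # Mean squared error.
    return float(np.mean((y_true - y_pred) ** 2))

def r2(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot.
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)
```

Given targets `y_true` and predictions `y_pred` from a trained model, `r2(y_true, y_pred)` and `mse(y_true, y_pred)` correspond to the two column types reported for both the training and testing splits.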