Research Article

Fractional Rectified Linear Unit Activation Function and Its Variants

Table 2

Performance comparison of the neural network model in Case 1 with various activation functions.

Category       Function    Training              Testing
                           R²        MSE         R²        MSE

Conventional   ReLU        0.9648    0.0263      0.9682    0.0238
               LReLU       0.9611    0.0298      0.9621    0.0289
               PReLU       0.9723    0.0208      0.9744    0.0190
               ELU         0.9760    0.0183      0.9772    0.0179
               SiLU        0.9636    0.0312      0.9625    0.0291
               GELU        0.9603    0.0301      0.9620    0.0285

Fractional     FReLU       0.9901    0.0077      0.9929    0.0072
               FLReLU      0.9942    0.0044      0.9933    0.0046
               FPReLU      0.9933    0.0051      0.9931    0.0051
               FELU        0.9283    0.0581      0.9629    0.0301
               FSiLU       0.9931    0.0052      0.9942    0.0043
               FGELU       0.8902    0.1360      0.8564    0.1861