Research Article

Fractional Rectified Linear Unit Activation Function and Its Variants

Table 3

Performance comparison of the neural network model in Case 2 with various activation functions.

Category      Function   Training              Testing
                                   MSE                   MSE
Conventional  ReLU       0.7666    0.3566      0.7938    0.3238
              LReLU      0.9044    0.1389      0.9077    0.1359
              PReLU      0.8327    1.1748      0.8145    0.3338
              ELU        0.7452    0.7686      0.6404    0.7915
              SiLU       0.8534    3.1758      0.8519    3.2667
              GELU       0.8110    3.4592      0.8089    3.6084
Fractional    FReLU      0.7798    0.3589      0.8145    0.3380
              FLReLU     0.9246    5.0276      0.9117    4.6879
              FPReLU     0.8746    1.1998      0.8821    1.0987
              FELU       0.8509    2.5395      0.9311    2.4631
              FSiLU      0.7824    0.6998      0.7714    0.7121
              FGELU      0.7243    7.5412      0.7417    7.2491
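
As a rough illustration of how such a comparison can be scripted, the sketch below trains the same small regression network with a conventional ReLU and with one common fractional ReLU construction, FReLU_a(x) = x^(1-a) / Γ(2-a) for x > 0 and 0 otherwise (the Caputo fractional derivative of ReLU of order a in (0, 1)), and reports training and testing MSE. The activation definition, the toy data set, the network size, and a = 0.5 are illustrative assumptions and are not taken from the experimental setup behind Table 3.

```python
# Minimal sketch, assuming FReLU_a(x) = x**(1 - a) / Gamma(2 - a) on the positive
# part (Caputo fractional derivative of ReLU, order a in (0, 1)); this definition,
# the toy data, the network size, and a = 0.5 are assumptions for illustration.
import math

import torch
import torch.nn as nn


class FReLU(nn.Module):
    """Hypothetical fractional ReLU: x^(1 - a) / Gamma(2 - a) for x > 0, else 0."""

    def __init__(self, a: float = 0.5):
        super().__init__()
        self.a = a
        self.scale = 1.0 / math.gamma(2.0 - a)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Use a safe base on the non-positive branch so pow() does not produce
        # NaN gradients at x <= 0, then zero that branch out.
        safe = torch.where(x > 0, x, torch.ones_like(x))
        return torch.where(x > 0, self.scale * safe ** (1.0 - self.a),
                           torch.zeros_like(x))


def fit_and_score(act: nn.Module, x_tr, y_tr, x_te, y_te, epochs: int = 2000):
    """Train a small MLP with the given activation; return (training MSE, testing MSE)."""
    torch.manual_seed(0)
    model = nn.Sequential(nn.Linear(1, 16), act, nn.Linear(16, 1))
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    mse = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = mse(model(x_tr), y_tr)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return mse(model(x_tr), y_tr).item(), mse(model(x_te), y_te).item()


# Toy 1-D regression problem (illustrative only, not the paper's Case 2 data).
torch.manual_seed(0)
x = torch.linspace(-2, 2, 200).unsqueeze(1)
y = torch.sin(2 * x) + 0.1 * torch.randn_like(x)
x_tr, y_tr, x_te, y_te = x[::2], y[::2], x[1::2], y[1::2]

for name, act in [("ReLU", nn.ReLU()), ("FReLU (a=0.5)", FReLU(0.5))]:
    tr_mse, te_mse = fit_and_score(act, x_tr, y_tr, x_te, y_te)
    print(f"{name:>13}  training MSE = {tr_mse:.4f}  testing MSE = {te_mse:.4f}")
```

The printed values play the same role as the MSE columns in Table 3; the absolute numbers will differ because the data and model here are synthetic placeholders.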