| Model | Precision (%) | Recall (%) | F1-score (%) | Time (min) | Total params | Trainable params | Epochs | Size (MB) |
|---|---|---|---|---|---|---|---|---|
| BERT + Bi-GRU | 60 | 54 | 54 | 241 | 109,920,145 | 1,609,873 | 6 | 419 |
| BERT + Bi-LSTM | 61 | 53 | 54 | 380 | 109,247,003 | 936,731 | 11 | 417 |
| BERT + CNN | 58 | 51 | 52 | 153 | 108,404,591 | 94,319 | 4 | 413 |
| BERT + CNN-LSTM | 57 | 51 | 52 | 301 | 109,200,743 | 890,471 | 8 | 416 |
| BERT + LSTM-CNN | 52 | 44 | 44 | 189 | 111,918,864 | 3,608,592 | 5 | 427 |
| DistilBERT + Bi-GRU | 59 | 54 | 54 | 85 | 66,800,785 | 1,609,873 | 5 | 261 |
| DistilBERT + Bi-LSTM | 60 | 54 | 54 | 84 | 66,127,643 | 936,731 | 5 | 252 |
| DistilBERT + CNN | 56 | 53 | 53 | 69 | 65,285,231 | 94,319 | 4 | 249 |
| DistilBERT + CNN-LSTM | 59 | 50 | 51 | 132 | 66,081,383 | 890,471 | 7 | 252 |
| DistilBERT + LSTM-CNN | 56 | 46 | 48 | 75 | 68,799,504 | 3,608,592 | 4 | 268 |
|
|