Research Article
A Cooperative Lightweight Translation Algorithm Combined with Sparse-ReLU
Table 6
Comparison between the proposed model and other models on the German–English translation task (IWSLT14 dataset).
| Model | BLEU | Size (M) | Description |
|---|---|---|---|
| Research [17] | 23.1 | — | Encoder model based on 6-layer CNN |
| Research [28] | 28.83 | — | Tag-less backtranslation |
| Research [29] | 29.9 | — | Linear transformer |
| Research [16] | 34.4 | — | Random feature attention |
| Research [30] | 34.8 | 285 | Pay less attention with lightweight CNN |
| Research [30] | 35.20 | 296 | Pay less attention with dynamic CNN |
| Traditional transformer model [4] | 34.44 | 49.27 | Traditional transformer model |
| This paper: small model + Sparse-ReLU | 34.29 | 36.42 | Small transformer |
| This paper: small model + Sparse-ReLU | 35.16 | 36.42 | Sparse-ReLU + small transformer |
| This paper: small model + Sparse-ReLU + CNN | 35.24 | 37.99 | Sparse-ReLU + small transformer + CNN |
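For readers unfamiliar with the activation named in the table, the following is a minimal sketch of a generic sparsity-inducing ReLU variant. It assumes Sparse-ReLU works by zeroing activations below a threshold `tau`; the function name, the threshold parameter, and this exact formulation are illustrative assumptions, not the paper's confirmed definition.

```python
import numpy as np

def sparse_relu(x, tau=0.5):
    """Thresholded ReLU: keep values above tau, zero out the rest.

    NOTE: illustrative assumption only -- a generic sparsity-inducing
    ReLU variant; the paper's exact Sparse-ReLU may be defined differently.
    """
    return np.where(x > tau, x, 0.0)

x = np.array([-1.0, 0.2, 0.8, 2.5])
y = sparse_relu(x)
print(y)  # negative and small positive inputs are zeroed
print(np.count_nonzero(y) / y.size)  # fraction of active units
```

Compared with a plain ReLU, such a threshold suppresses small positive activations as well as negative ones, which increases activation sparsity without adding any parameters; that would be consistent with the table, where adding Sparse-ReLU leaves the model size unchanged at 36.42 M.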