Research Article
KoRASA: Pipeline Optimization for Open-Source Korean Natural Language Understanding Framework Based on Deep Learning
Table 2
Parameter comparison between DIET-Base and DIET-Opt (KoRASA).
| Parameter | DIET-Base | DIET-Opt (KoRASA) |
| --- | --- | --- |
| Epochs | 300 | 500 |
| Number of transformer layers | 2 | 4 |
| Transformer size | 256 | 256 |
| Masked language model | True | True |
| Drop rate | 0.25 | 0.25 |
| Weight sparsity | 0.8 | 0.7 |
| Embedding dimension | 20 | 30 |
| Hidden layer sizes | (256, 128) | (512, 128) |
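As a rough illustration, the DIET-Opt column above could be expressed as a Rasa pipeline configuration. This is a sketch under the assumption that the table's parameters map onto the standard `DIETClassifier` options in Rasa's `config.yml` (key names such as `number_of_transformer_layers` and `hidden_layers_sizes` follow Rasa's documented option names; the exact pipeline used by KoRASA is not shown in the table).

```yaml
# Hypothetical Rasa config.yml fragment reflecting the DIET-Opt settings in Table 2.
pipeline:
  - name: DIETClassifier
    epochs: 500                        # DIET-Base used 300
    number_of_transformer_layers: 4    # doubled from the base model's 2
    transformer_size: 256              # unchanged from DIET-Base
    use_masked_language_model: true
    drop_rate: 0.25
    weight_sparsity: 0.7               # reduced from 0.8
    embedding_dimension: 30            # increased from 20
    hidden_layers_sizes:
      text: [512, 128]                 # widened first hidden layer (base: [256, 128])
```

The widened hidden layer, extra transformer layers, and longer training schedule increase model capacity, while the lower weight sparsity leaves a larger fraction of connections active.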