Research Article
PERLEX: A Bilingual Persian-English Gold Dataset for Relation Extraction
Table 7
Hyperparameters for training R-BERT and BERTEM-MTB models.
| Hyperparameter | Value |
| --- | --- |
| BERT pretrained weights | Multilingual BERT-Base |
| Sentence length | 128 |
| Batch size | 16 |
| Optimizer | Adam |
| Learning rate | 3e−5 |
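The settings in Table 7 could be gathered into a single training configuration, for example as a plain Python dictionary. This is a minimal sketch: the key names and the exact pretrained-checkpoint identifier are illustrative assumptions, not taken from the paper's code.

```python
# Hyperparameters from Table 7 for fine-tuning R-BERT and BERTEM-MTB.
# Key names are illustrative; the checkpoint identifier is an assumed
# Hugging Face name for "Multilingual BERT-Base", not confirmed by the paper.
TRAIN_CONFIG = {
    "pretrained_weights": "bert-base-multilingual-cased",
    "max_sentence_length": 128,  # maximum input length in tokens
    "batch_size": 16,
    "optimizer": "Adam",
    "learning_rate": 3e-5,
}

if __name__ == "__main__":
    for name, value in TRAIN_CONFIG.items():
        print(f"{name}: {value}")
```

Keeping the hyperparameters in one structure like this makes it straightforward to log them alongside results or to sweep individual values during replication.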