Research Article
Named Entity Recognition for Public Interest Litigation Based on a Deep Contextualized Pretraining Approach
| Parameter | Value |
| --- | --- |
| Vocabulary size | 21128 |
| BERT hidden size | 768 |
| BERT attention heads | 12 |
| BERT layers | 12 |
| BiLSTM hidden size | 384 |
| LSTM layers | 2 |
| Learning rate | 0.0001 |
| Pretraining word embedding size | 768 |
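The hyperparameters above can be gathered into a single configuration object for the BERT-BiLSTM model. The sketch below is plain Python for illustration; the dictionary keys and derived quantity are assumptions, not the authors' code. Note that the vocabulary size of 21128 matches the standard Chinese BERT (`bert-base-chinese`) WordPiece vocabulary, and the BiLSTM hidden size of 384 per direction yields a concatenated output width of 768, aligning with the BERT hidden size.

```python
# Illustrative configuration mirroring the hyperparameter table.
# Key names are hypothetical, chosen for readability.
CONFIG = {
    "vocab_size": 21128,            # matches bert-base-chinese vocabulary
    "bert_hidden_size": 768,
    "bert_attention_heads": 12,
    "bert_layers": 12,
    "bilstm_hidden_size": 384,      # hidden units per direction
    "lstm_layers": 2,
    "learning_rate": 1e-4,
    "embedding_size": 768,          # pretraining word embedding size
}

# A bidirectional LSTM concatenates its forward and backward hidden states,
# so the effective output width is twice the per-direction hidden size.
bilstm_output_size = 2 * CONFIG["bilstm_hidden_size"]
print(bilstm_output_size)  # 768, same width as the BERT hidden size
```

Choosing the BiLSTM hidden size as half the BERT hidden size is a common design so that the concatenated BiLSTM output has the same dimensionality as the contextual embeddings it consumes.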