Research Article

PERLEX: A Bilingual Persian-English Gold Dataset for Relation Extraction

Table 1

A summary of the models used for testing. (Minimal code sketches of the main mechanisms follow the table.)

| Method | Summary |
| --- | --- |
| CNN | Encodes each sentence into a relation representation using a convolutional neural network with one-dimensional convolutional layers. |
| Att-BLSTM | Computes relation representations for sentences with a single-layer bidirectional RNN coupled with an attention mechanism. |
| BLSTM-LET | Applies the self-attention mechanism of Transformer networks on top of a single-layer bidirectional RNN, and additionally models latent entity types to compute better relation representations. |
| R-BERT | Uses BERT for relation extraction by adding special tokens before and after each entity; the hidden states corresponding to these tokens are then used to construct the final relation representation. |
| BERTEM-MTB | Also uses BERT with special tokens added before and after each entity, but differs from R-BERT in how the hidden states corresponding to the special tokens are combined into the relation representation. |
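To make the CNN row concrete, the following is a minimal PyTorch sketch of a convolutional relation encoder: one-dimensional convolutions over token embeddings followed by max-over-time pooling. The vocabulary size, embedding width, filter count, and number of relation classes are placeholder assumptions, not the evaluated configuration.

```python
# A minimal sketch of a CNN relation encoder; all dimensions are
# illustrative assumptions, not the tested model's configuration.
import torch
import torch.nn as nn

class CNNRelationEncoder(nn.Module):
    def __init__(self, vocab_size=10_000, emb_dim=100,
                 num_filters=128, kernel_size=3, num_relations=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # 1-dimensional convolution over the token axis
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size, padding=1)
        self.classifier = nn.Linear(num_filters, num_relations)

    def forward(self, token_ids):                  # (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)  # (batch, emb_dim, seq_len)
        h = torch.relu(self.conv(x))               # (batch, num_filters, seq_len)
        # Max-over-time pooling yields a fixed-size relation representation.
        rep = h.max(dim=2).values                  # (batch, num_filters)
        return self.classifier(rep)                # (batch, num_relations)

# Toy usage: a batch of two padded sentences of 32 token ids.
logits = CNNRelationEncoder()(torch.randint(0, 10_000, (2, 32)))
```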
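The Att-BLSTM and BLSTM-LET rows both pool the hidden states of a single-layer bidirectional RNN with attention. Below is a minimal sketch of that attention-pooling idea using a learned scoring vector; dimensions are placeholders, and this variant concatenates the forward and backward states, whereas the published Att-BLSTM sums them. BLSTM-LET replaces this scorer with Transformer-style self-attention and adds latent entity typing, which is not shown here.

```python
# A minimal sketch of attention pooling over a single-layer bidirectional
# LSTM, in the spirit of Att-BLSTM; dimensions are assumptions.
import torch
import torch.nn as nn

class AttBLSTM(nn.Module):
    def __init__(self, vocab_size=10_000, emb_dim=100,
                 hidden=128, num_relations=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.att = nn.Linear(2 * hidden, 1, bias=False)  # word-level attention scorer
        self.classifier = nn.Linear(2 * hidden, num_relations)

    def forward(self, token_ids):                # (batch, seq_len)
        H, _ = self.lstm(self.embed(token_ids))  # (batch, seq_len, 2*hidden)
        # Softmax over the sequence axis gives one weight per token.
        weights = torch.softmax(self.att(torch.tanh(H)), dim=1)
        rep = (weights * H).sum(dim=1)           # attention-weighted sum of states
        return self.classifier(torch.tanh(rep))
```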
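R-BERT and BERTEM-MTB share the same input convention: special marker tokens are inserted before and after each entity mention before the sentence is fed to BERT. A minimal, dependency-free sketch of that marking step follows; the marker strings and the helper name are illustrative assumptions rather than the papers' exact vocabulary.

```python
# A minimal sketch of the entity-marker input format shared by R-BERT and
# BERTEM-MTB. Marker strings [E1]/[/E1]/[E2]/[/E2] are illustrative.
def mark_entities(tokens, e1_span, e2_span):
    """Insert marker tokens around two non-overlapping entity spans,
    each given as inclusive (start, end) token indices."""
    (s1, e1), (s2, e2) = e1_span, e2_span
    out = []
    for i, tok in enumerate(tokens):
        if i == s1:
            out.append("[E1]")
        if i == s2:
            out.append("[E2]")
        out.append(tok)
        if i == e1:
            out.append("[/E1]")
        if i == e2:
            out.append("[/E2]")
    return out

print(mark_entities(
    "the kitchen is the last renovated part of the house".split(),
    e1_span=(1, 1), e2_span=(9, 9)))
# ['the', '[E1]', 'kitchen', '[/E1]', 'is', ..., '[E2]', 'house', '[/E2]']
```

The two models then diverge in how they pool BERT's output over the marked sequence: roughly, R-BERT averages the hidden states over each entity span and combines them with the [CLS] state, while BERTEM-MTB concatenates the hidden states at the two entity-start markers.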