PERLEX: A Bilingual Persian-English Gold Dataset for Relation Extraction
Table 1
A summary of the models used for testing.
Method
Summary
CNN
Uses a convolutional neural network with one-dimensional convolutional layers to encode each sentence into a relation representation.
Att-BLSTM
Uses a single-layer bidirectional RNN coupled with an attention mechanism to compute a relation representation for each sentence.
BLSTM-LET
Applies the self-attention mechanism used in Transformer networks on top of a single-layer bidirectional RNN, and additionally models latent entity types to compute better relation representations.
R-BERT
Uses BERT for relation extraction by adding special marker tokens before and after each entity; the hidden states corresponding to these tokens are then used to construct the final relation representation.
BERTEM-MTB
Like R-BERT, uses BERT for relation extraction by inserting special marker tokens before and after each entity, but differs from R-BERT in how the hidden states corresponding to those tokens are combined into the final relation representation (a sketch of this shared entity-marker scheme is given after the table).
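To make the entity-marker idea shared by R-BERT and BERTEM-MTB concrete, the following is a minimal illustrative sketch in PyTorch with the HuggingFace transformers library, not the implementation used in this work: the marker strings, the multilingual BERT checkpoint, the pooling of the two opening-marker hidden states, and the linear classifier head are all assumptions made only for this example.

import torch
from transformers import BertModel, BertTokenizerFast

MARKERS = ["<e1>", "</e1>", "<e2>", "</e2>"]  # assumed marker strings

tokenizer = BertTokenizerFast.from_pretrained("bert-base-multilingual-cased")
tokenizer.add_special_tokens({"additional_special_tokens": MARKERS})
encoder = BertModel.from_pretrained("bert-base-multilingual-cased")
encoder.resize_token_embeddings(len(tokenizer))  # account for the new marker tokens

num_relations = 19  # e.g. the SemEval-2010 Task 8 label set; adjust to the dataset
classifier = torch.nn.Linear(2 * encoder.config.hidden_size, num_relations)

# A sentence with the two entity mentions already wrapped in marker tokens.
sentence = "The <e1> singer </e1> performed a <e2> song </e2> ."
enc = tokenizer(sentence, return_tensors="pt")
hidden = encoder(**enc).last_hidden_state  # shape: (1, seq_len, hidden_size)

# Concatenate the hidden states at the two opening markers into a relation
# representation (one common pooling choice; R-BERT instead averages over the
# entity spans and also uses the [CLS] state).
ids = enc["input_ids"][0].tolist()
e1_pos = ids.index(tokenizer.convert_tokens_to_ids("<e1>"))
e2_pos = ids.index(tokenizer.convert_tokens_to_ids("<e2>"))
relation_repr = torch.cat([hidden[0, e1_pos], hidden[0, e2_pos]], dim=-1)

logits = classifier(relation_repr)  # unnormalized scores over the relation labels

In this sketch the classifier is trained with a standard cross-entropy loss over the relation labels; only the pooling step distinguishes the two BERT-based models summarized above.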