Research Article

A Stacked BiLSTM Neural Network Based on Coattention Mechanism for Question Answering

Table 4

Experimental results (MAP and MRR) of different baselines and our model on the WikiQA dataset.

Idx  Model                              MAP     MRR
1    LSTM with attention [40]           0.6639  0.6828
2    CNN-Cnt [41]                       0.6520  0.6086
3    wGRU-sGRU-Gl2 [42]                 0.7537  0.7658
4    wGRU-sGRU-Gl2-Cnt [42]             0.7638  0.7825
5    Stacked BiLSTM                     0.7248  0.7333
6    SBiLSTM-coA (cosine + Euclidean)   0.7643  0.7751
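The MAP and MRR values above are standard ranking metrics: for each question, the candidate answers are ranked by the model's score, and the metrics are averaged over all questions. A minimal sketch of how such scores are computed from binary relevance labels follows; the function names are illustrative and not taken from the paper:

```python
def average_precision(relevances):
    # Average precision for one ranked answer list (1 = relevant, 0 = not).
    hits, precisions = 0, []
    for rank, rel in enumerate(relevances, start=1):
        if rel:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / hits if hits else 0.0

def reciprocal_rank(relevances):
    # 1 / rank of the first relevant answer, or 0 if none is relevant.
    for rank, rel in enumerate(relevances, start=1):
        if rel:
            return 1.0 / rank
    return 0.0

def map_mrr(ranked_lists):
    # Mean Average Precision and Mean Reciprocal Rank over all questions.
    n = len(ranked_lists)
    map_score = sum(average_precision(r) for r in ranked_lists) / n
    mrr_score = sum(reciprocal_rank(r) for r in ranked_lists) / n
    return map_score, mrr_score
```

For example, for two questions with relevance lists [1, 0, 1] and [0, 1, 0], MAP is the mean of 5/6 and 1/2, and MRR is the mean of 1 and 1/2.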