Research Article
A Stacked BiLSTM Neural Network Based on Coattention Mechanism for Question Answering
Table 4
Experimental results of different baselines and our model on the Wiki-QA dataset.
| Idx | Model | MAP | MRR |
| --- | --- | --- | --- |
| 1 | LSTM with attention [40] | 0.6639 | 0.6828 |
| 2 | CNN-Cnt [41] | 0.6520 | 0.6086 |
| 3 | wGRU-sGRU-Gl2 [42] | 0.7537 | 0.7658 |
| 4 | wGRU-sGRU-Gl2-Cnt [42] | 0.7638 | 0.7825 |
| 5 | Stacked BiLSTM | 0.7248 | 0.7333 |
| 6 | SBiLSTM-coA (cosine + Euclidean) | 0.7643 | 0.7751 |
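The MAP and MRR columns are the standard ranking metrics for answer selection: mean average precision and mean reciprocal rank over all questions. A minimal sketch of how they are computed, assuming each question provides a set of correct answer ids and a candidate list ranked by model score (function names here are illustrative, not from the paper):

```python
def average_precision(relevant, ranked):
    """AP for one question: precision at each correct answer's rank,
    averaged over the number of correct answers."""
    hits, precisions = 0, []
    for rank, cand in enumerate(ranked, start=1):
        if cand in relevant:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / len(relevant) if relevant else 0.0

def reciprocal_rank(relevant, ranked):
    """RR for one question: 1 / rank of the first correct answer."""
    for rank, cand in enumerate(ranked, start=1):
        if cand in relevant:
            return 1.0 / rank
    return 0.0

def map_mrr(questions):
    """questions: list of (relevant_set, ranked_list) pairs.
    Returns (MAP, MRR) averaged over all questions."""
    n = len(questions)
    mean_ap = sum(average_precision(r, k) for r, k in questions) / n
    mean_rr = sum(reciprocal_rank(r, k) for r, k in questions) / n
    return mean_ap, mean_rr
```

For example, with one question whose correct answer is ranked second and another ranked first, both MAP and MRR come out to (0.5 + 1.0) / 2 = 0.75.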