Research Article
Word Sequential Using Deep LSTM and Matrix Factorization to Handle Rating Sparse Data for E-Commerce Recommender System
Table 7
Performance comparison of LSTM-PMF over the state-of-the-art methods on ML-10M.
| Sparseness level (high-low) | PMF (RMSE) | CNN-PMF (RMSE) | LSTM-PMF (RMSE) | PMF vs. LSTM-PMF (%) | CNN-PMF vs. LSTM-PMF (%) |
|---|---|---|---|---|---|
| 10% (90% sparseness level) | 1.27539 | 0.93629 | 0.95506 | 25.10 | −2.00 |
| 20% (80% sparseness level) | 1.05233 | 0.89332 | 0.89117 | 15.30 | 0.24 |
| 30% (70% sparseness level) | 0.96513 | 0.86621 | 0.85185 | 11.70 | 1.65 |
| 40% (60% sparseness level) | 0.91827 | 0.84673 | 0.82737 | 9.89 | 2.28 |
| 50% (50% sparseness level) | 0.88834 | 0.83604 | 0.81567 | 8.18 | 2.43 |
| 60% (40% sparseness level) | 0.86673 | 0.82794 | 0.80968 | 6.58 | 2.20 |
| 70% (30% sparseness level) | 0.85071 | 0.82054 | 0.80276 | 5.63 | 2.16 |
| 80% (20% sparseness level) | 0.84049 | 0.81276 | 0.79735 | 5.13 | 1.89 |
| 90% (10% sparseness level) | 0.82796 | 0.80505 | 0.79020 | 4.56 | 1.84 |
| Total | | | | 92 | 13 |
| Average | | | | 10.23 | 1.44 |
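For readers who want to reproduce the comparison columns, the sketch below is not code from the paper; it assumes the reported percentages are the relative RMSE reduction of LSTM-PMF over each baseline, i.e. (RMSE_baseline − RMSE_LSTM-PMF) / RMSE_baseline × 100, which matches the tabulated values up to rounding.

```python
# Minimal sketch (not from the paper): relative RMSE improvement of LSTM-PMF
# over a baseline model, assumed as (baseline - lstm_pmf) / baseline * 100.
def relative_improvement(rmse_baseline: float, rmse_lstm_pmf: float) -> float:
    """Percentage RMSE reduction of LSTM-PMF relative to a baseline model."""
    return (rmse_baseline - rmse_lstm_pmf) / rmse_baseline * 100.0

# First row of Table 7 (10% training data, 90% sparseness level):
pmf, cnn_pmf, lstm_pmf = 1.27539, 0.93629, 0.95506
print(f"PMF vs. LSTM-PMF:     {relative_improvement(pmf, lstm_pmf):.2f}%")      # ~25.12 (table: 25.10)
print(f"CNN-PMF vs. LSTM-PMF: {relative_improvement(cnn_pmf, lstm_pmf):.2f}%")  # ~-2.00 (LSTM-PMF slightly worse here)
```

A negative value in the last column means CNN-PMF still outperforms LSTM-PMF at that sparseness level, as in the 10% training-data row.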