Research Article
Word Sequential Using Deep LSTM and Matrix Factorization to Handle Rating Sparse Data for E-Commerce Recommender System
Table 6
Performance comparison of LSTM-PMF with the state-of-the-art methods on ML-1M.
| Training ratio (sparseness level) | RMSE: PMF | RMSE: CNN-PMF | RMSE: LSTM-PMF | Comparison: PMF versus LSTM-PMF (%) | Comparison: CNN-PMF versus LSTM-PMF (%) |
| --- | --- | --- | --- | --- | --- |
| 10% (90% sparseness) | 1.64697 | 0.99541 | 0.99280 | 39.00 | 0.26 |
| 20% (80% sparseness) | 1.26577 | 0.92760 | 0.93214 | 26.70 | 0.48 |
| 30% (70% sparseness) | 1.11180 | 0.90507 | 0.89993 | 18.59 | 0.56 |
| 40% (60% sparseness) | 1.03992 | 0.88525 | 0.88458 | 14.87 | 0.08 |
| 50% (50% sparseness) | 0.99064 | 0.87787 | 0.87114 | 11.38 | 0.76 |
| 60% (40% sparseness) | 0.95897 | 0.86774 | 0.86157 | 9.50 | 0.71 |
| 70% (30% sparseness) | 0.93369 | 0.86874 | 0.85471 | 6.95 | 1.61 |
| 80% (20% sparseness) | 0.91134 | 0.85574 | 0.84745 | 6.10 | 0.96 |
| 90% (10% sparseness) | 0.90452 | 0.84971 | 0.84079 | 6.06 | 1.04 |
| Total | — | — | — | 139 | 6.46 |
| Average | — | — | — | 15.4 | 0.71 |
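The comparison columns report the percentage gap in RMSE between LSTM-PMF and each baseline at every sparseness level. The table does not state the exact comparison formula; a common convention is the relative RMSE reduction with respect to the baseline, sketched below (the helper name and formula are assumptions, and the computed values may differ slightly from the table's reported percentages).

```python
def relative_rmse_reduction(baseline_rmse, model_rmse):
    """Percent RMSE reduction of a model relative to a baseline.

    Assumed formula: 100 * (baseline - model) / baseline.
    A positive value means the model has lower error than the baseline.
    """
    return (baseline_rmse - model_rmse) / baseline_rmse * 100.0

# RMSE values taken from the 10% training-ratio row of the table:
# PMF = 1.64697, CNN-PMF = 0.99541, LSTM-PMF = 0.99280
pmf_vs_lstm = relative_rmse_reduction(1.64697, 0.99280)
cnn_vs_lstm = relative_rmse_reduction(0.99541, 0.99280)
print(f"PMF vs LSTM-PMF: {pmf_vs_lstm:.2f}%")
print(f"CNN-PMF vs LSTM-PMF: {cnn_vs_lstm:.2f}%")
```

Under this convention, larger gains appear at the sparsest settings, which matches the table's trend: the PMF-versus-LSTM-PMF gap shrinks from roughly 39% at 90% sparseness to about 6% at 10% sparseness.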