Author | Approaches | Advantages | Disadvantages |
--- | --- | --- | --- |
S. Zhang et al. | Sentiment multiclassification | Accuracy was higher than the comparison model | The model incurs a high computational cost |
L. Dey et al. | Naive Bayes | Easy computation and better accuracy than KNN | Similar precision was observed for KNN and Naive Bayes on hotel review datasets |
M. R. Huq et al. | Support classification algorithm (SCA) | The accuracy of the model improves when the dataset is normalized | The model performs poorly on larger datasets |
B. S. Lakshmi et al. | CNN | The model showed good results on both smaller and larger datasets | Better results require combining it with an attention mechanism |
Y. Fang et al. | Enhanced NB, enhanced SVM | Feature values and sentiment values are combined | Only slightly higher accuracy than standard SVM and Naive Bayes |
B. Shin et al. | CNN, attention | Attention mechanism helps reduce noise | The model does not consider multiple words |
G. Preethi et al. | Naive Bayes and recursive neural network | Boosted the accuracy of the sentiment analysis system | This model only considers small datasets |
A. S. Manek et al. | Feature selection using Gini index, support vector machine | The model works with both smaller and larger datasets | This method incurs a high computational cost |
C. Chen et al. | BiGRU | This method effectively captures sentiment relations | - |
L. Zhou and X. Bian | BiGRU, attention | The accuracy is improved by using the attention mechanism | - |
L. Yang et al. | SLCABG | This method combines the benefits of both CNN and BiGRU in one model | The method incurs a high computational cost and lacks sentiment multiclassification |
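As a minimal illustration of the simplest approach listed above (Naive Bayes text classification, as used by L. Dey et al.), the sketch below assumes scikit-learn is available; the toy reviews, labels, and pipeline choices are illustrative placeholders and not taken from any of the cited works.

```python
# Minimal sketch of a bag-of-words Naive Bayes sentiment classifier.
# The reviews and labels below are toy placeholders, not data from the cited studies.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = [
    "the room was clean and the staff were friendly",
    "terrible service and a dirty bathroom",
    "great location, would stay again",
    "the food was cold and the wifi never worked",
]
labels = ["positive", "negative", "positive", "negative"]

# Bag-of-words counts feed a multinomial Naive Bayes model, which keeps
# both training and prediction computationally cheap.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(reviews, labels)

print(model.predict(["clean room and friendly staff"]))  # expected: ['positive']
```

The low training cost of this kind of model is what the table contrasts with the heavier CNN- and BiGRU-based approaches, which trade computational cost for better modeling of word order and context.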