Construction of Music Teaching Evaluation Model Based on Weighted Naïve Bayes
Table 1. Comparison and analysis of classification algorithms.

| Algorithm | Advantage | Shortcoming |
| --- | --- | --- |
| NB | Stable classification efficiency; well suited to multiclass tasks and incremental training; performs best when feature attributes are weakly correlated. | Class probability estimates are not very accurate; time efficiency drops when there are many attributes or strong correlation between attributes. |
| SVM | Works with relatively small training sets; handles high-dimensional sparse data such as text; fairly robust. | Overly dependent on the positive and negative examples near the decision surface; kernel function selection lacks clear guidance; training is slow when there are many samples. |
| KNN | No parameter estimation required; easily handles a large number of classes; simple and stable. | Requires storing the full sample set, so space complexity and memory overhead are high; classification performance depends directly on the choice of K. |
| DT | Readable tree-structured model with high classification speed. | Prone to overfitting and ignores correlations between attributes. |
| BP network | Self-adaptive, with strong nonlinear mapping ability and fault tolerance. | Slow convergence, easily trapped in local minima, and strongly dependent on the training samples. |
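Because this excerpt does not specify how the attribute weights enter the model, the sketch below shows one common weighted Naïve Bayes formulation, in which each attribute's log-likelihood is scaled by its weight before being added to the class prior. The class name `WeightedNaiveBayes`, the `weights` and `alpha` parameters, and the example data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from collections import defaultdict

class WeightedNaiveBayes:
    """Minimal weighted Naive Bayes for categorical features (illustrative sketch).

    Each attribute j contributes weights[j] * log P(x_j | c) to the class score,
    so attributes with larger weights influence the decision more strongly.
    """

    def __init__(self, weights, alpha=1.0):
        self.weights = np.asarray(weights, dtype=float)  # one weight per attribute (assumed given)
        self.alpha = alpha                               # Laplace smoothing constant
        self.class_log_prior_ = {}
        self.cond_counts_ = {}                           # (class, attr) -> {value: count}
        self.class_counts_ = defaultdict(int)
        self.value_sets_ = None                          # distinct values seen per attribute

    def fit(self, X, y):
        X = np.asarray(X, dtype=object)
        n, d = X.shape
        self.value_sets_ = [set(X[:, j]) for j in range(d)]
        for xi, c in zip(X, y):
            self.class_counts_[c] += 1
            for j, v in enumerate(xi):
                self.cond_counts_.setdefault((c, j), defaultdict(int))[v] += 1
        total = sum(self.class_counts_.values())
        self.class_log_prior_ = {c: np.log(k / total) for c, k in self.class_counts_.items()}
        return self

    def _log_likelihood(self, c, j, v):
        # Smoothed conditional probability P(x_j = v | c)
        counts = self.cond_counts_[(c, j)]
        num = counts.get(v, 0) + self.alpha
        den = self.class_counts_[c] + self.alpha * len(self.value_sets_[j])
        return np.log(num / den)

    def predict(self, X):
        preds = []
        for xi in np.asarray(X, dtype=object):
            scores = {}
            for c in self.class_counts_:
                s = self.class_log_prior_[c]
                for j, v in enumerate(xi):
                    s += self.weights[j] * self._log_likelihood(c, j, v)
                scores[c] = s
            preds.append(max(scores, key=scores.get))
        return preds

# Hypothetical usage: students graded A/B/C on three teaching-evaluation criteria.
# X = [["A", "B", "A"], ["C", "B", "C"]]; y = ["good", "poor"]
# clf = WeightedNaiveBayes(weights=[0.5, 0.3, 0.2]).fit(X, y)
# clf.predict([["B", "A", "B"]])
```

Setting all weights to 1 recovers standard Naïve Bayes; the weighted variant is intended to soften the table's noted NB shortcoming of treating all (possibly correlated) attributes as equally informative.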