| Reference nos. | Methods | Datasets | Merits | Demerits |
|---|---|---|---|---|
| [7] | CART decision tree for classifying BrCa using BI-RADS scores | Dataset from the cancer center of Sun Yat-sen University | The model could recognize benign tumors in the BI-RADS 3 category, helping avoid unnecessary biopsies | The model relies heavily on the BI-RADS scores assigned by radiologists and clinicians |
| [8] | A CAD model using machine learning classifiers for BrCa detection | Breast ultrasound image dataset | The model identifies and characterizes tumors at an early stage | Performance could have been improved by adopting advanced deep learning models |
| [9] | CAD system based on ensembled deep learning models | SNUH private dataset and BUSI dataset | The image fusion approach helped the deep learning models produce better results | No preprocessing was applied to the images |
| [10] | Ensemble deep learning-based clinical decision-support system | Breast ultrasound dataset | The ensembled ResNet, DenseNet, and VGG models achieved good results | The detection rate could have been improved by using deep learning-based segmentation |
| [11] | A CAD model based on YOLOv3 and a Viola–Jones-based algorithm | Two datasets from private hospitals | The YOLOv3-based approach was more effective and reproducible than the Viola–Jones approach | Only a limited volume of data was used; unsupervised models could also have been explored for classification |
| [12] | CAD model based on WNN and GWO | Breast ultrasound dataset | The GWO-WNN model was more robust, required less training data, and converged faster | The classification accuracy could have been improved by using more advanced algorithms |
| [13] | Deep transfer learning model for automatic BrCa detection and classification | MIAS dataset | Combining pretrained transfer-learning models with a CNN improved performance (see the illustrative sketch following this table) | The results were obtained from a limited number of data samples; more data would be needed for proper validation |
| [14] | Grad-CAM-based CNN model for BrCa classification | mini-MIAS and CBIS-DDSM | Classification was performed both with and without pectoral muscle removal, which helps identify dense regions | Performance could have been improved by performing segmentation and feature extraction separately with dedicated algorithms |
| [15] | BrCa image segmentation and classification model using CNN | MIAS, DDSM, and CBIS-DDSM | Modified U-Net segmentation with data augmentation and CNN classification performed well | Feature extraction could have been performed separately for improved performance |
| [16] | Multiscale CNN for BrCa classification | Private dataset | The multiscale backbone networks produced more robust feature representations, leading to better performance | The reported results still leave room for improvement |
| [17] | Multifractal dimension and feature fusion-based BrCa detection model | BCDR, DDSM, MIAS, and INbreast | The model can identify subtypes of BrCa in mammography images | The model does not yet consider some critical information that could help assess tumor risk |
| [18] | CAD system based on a transferable texture CNN model | MIAS, DDSM, and INbreast | The deep learning model could be easily trained to achieve high accuracy on various BrCa images | Although three datasets were used, performance was evaluated on each dataset separately with limited samples |
| [19] | BrCa classification model using probability-based optimal DL feature fusion | Breast ultrasound images (BUSI) dataset | Optimal feature selection eliminated unnecessary features, and the fusion model reduced computational time while improving performance | Only a limited number of data samples were used |
| [20] | U-Net study with ICA and deep feature fusion for breast cancer identification | BUSI | The feature regularization used in this model overcame overfitting and enhanced performance | The model used limited data; a conditional generative adversarial network could have been used to generate more samples |
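
To make the transfer-learning approach summarized for [13] more concrete, the following is a minimal sketch of fine-tuning a pretrained CNN backbone for binary benign/malignant classification. It is not the cited authors' code: the dataset path, image size, ResNet-18 backbone, learning rate, and epoch count are illustrative assumptions only.

```python
# Minimal illustrative sketch of CNN transfer learning for benign/malignant
# classification (assumed configuration, not the configuration used in [13]).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Standard ImageNet preprocessing so the pretrained backbone receives
# inputs in the distribution it was originally trained on.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: data/train/benign, data/train/malignant.
train_set = datasets.ImageFolder("data/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)

# Load an ImageNet-pretrained ResNet-18 and replace its final layer
# with a two-class head (benign vs. malignant).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False                 # freeze the backbone
model.fc = nn.Linear(model.fc.in_features, 2)   # trainable classification head

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

# Short fine-tuning loop over the new classification head only.
model.train()
for epoch in range(5):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```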