Research Article

[Retracted] Taxonomy of Adaptive Neuro-Fuzzy Inference System in Modern Engineering Sciences

Table 4

Proposed hybrids of the ANFIS-implemented models.

Model | Model hybrids

ABC [57] | aABC [43, 58], adaptive ABC (AABC) [59], vortex search [60], cooperative ABC (CABC) [61, 62], cooperative micro-ABC (CMABC) [63], interval cooperative multiobjective ABC (ICMOABC) [62], ABC-PSO [64], multiobjective directed bee colony optimization (MODBCO) [65], scoutless ABC [35], directed ABC [66, 67]
ACO [68] | ACOR [36], heuristic-PS-ACO (HPSACO) [69], hybrid ACO [70], ACO-PSO [71], PS-ACO [72], ACO-SA [73], MWIS-ACO-LS [74], hybrid ACO (HAntCO) [75], min-max ant system (MMAS) [72, 76], GA-ACO-SA [77], self-adaptive ant colony-genetic hybrid [78], GA-ACO [79], ACS [80], greedy ACS [81]
BA [82] | Binary BA [83], hybrid BA with ABC [84], BA-HS [85], adaptive BA [86], adaptive multiswarm BA (AMBA) [87], differential operator and Levy flights BA [87], directed artificial BA (DABA) [88], double-subpopulation Levy flight BA (DLBA) [89], dynamic virtual BA (DVBA) [90], improved DVBA with probabilistic selection [91], island multipopulational parallel BA (IBA) [92], modified BA (stability analysis) [93], multiobjective BA (MOBA) [94], novel BA with multiple strategies coupling (mixBA) [95], OBMLBA [96], shrink factor BA (SBA) [92], simplified adaptive BA based on frequency [97]
DE [98] | DE with modified PSO (DEMPSO) [99], DEPSO [100], DE-GA [101], DE with K-means clustering [102], DE-GWO [103], DE with adaptive mutation (DEAM) [104], simplified real-coded differential GA (SADE) [105], DEACS [40]
FFA [56] | Hybrid firefly with PSO (HFPSO) [106], modified FFO (MFO) [107], FA-HS [108]
GA [109] | HGA with local search [110], adaptive HGA (a-HGA) [111], GSA-GA [112], GA/SA [113], GA/SA/TS [114], GA-PSO [115]
HS [116] | GHS [117], HS-teaching-learning-based optimization (HSTLBO) [118], HS-SA [119], mutation-based HS (MBHS) [120], GWO-HS [121], hybrid Taguchi-HS [122], HS-BA [85]
PSO [55] | APAPSO [51], PSO-LMS [123], QPSO [124], IQPSO [125], PSO-SA [126], PSO-BFO [123], GA-PSO [127], PSO-FLC [128], enhanced PSO [127], DEMPSO [99], DEPSO [100], PSO-local search [129]
SA [130] | Integer augmented SA (IASA) [131], real-coded augmented SA (RASA) [131], real-coded SA (RCSA) [52]
SC [132] | SC-FCM (subtractive clustering-fuzzy C-means) [133], FCM-ELPSO [134], firefly-based FCM (FFCM) [135]
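As an illustration of how the metaheuristics listed in Table 4 are typically coupled with a fuzzy inference system, the following is a minimal sketch, not taken from any of the cited works, in which a standard global-best PSO tunes the premise (membership-function) and consequent parameters of a tiny two-rule first-order Sugeno model; the toy data, rule count, and all constants are illustrative assumptions.

```python
# Minimal sketch (illustrative, not any cited method): global-best PSO
# tuning the parameters of a tiny two-rule first-order Sugeno fuzzy model.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: approximate y = sin(x) on [0, pi].
X = np.linspace(0, np.pi, 50)
y = np.sin(X)

def gaussian(x, c, s):
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def sugeno_predict(params, x):
    # Two rules, each with a Gaussian membership function (center c, width s)
    # and a linear consequent p*x + q; output is the weighted average.
    c1, s1, p1, q1, c2, s2, p2, q2 = params
    w1 = gaussian(x, c1, abs(s1) + 1e-6)
    w2 = gaussian(x, c2, abs(s2) + 1e-6)
    num = w1 * (p1 * x + q1) + w2 * (p2 * x + q2)
    return num / (w1 + w2 + 1e-12)

def mse(params):
    return np.mean((sugeno_predict(params, X) - y) ** 2)

# Standard global-best PSO over the 8 model parameters (assumed settings).
n_particles, dim, iters = 30, 8, 200
pos = rng.uniform(-2, 4, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([mse(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    # Inertia 0.7, cognitive and social coefficients 1.5 (typical defaults).
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    cost = np.array([mse(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print(f"best MSE after PSO tuning: {pbest_cost.min():.4f}")
```

In typical hybrids of this kind, such a population-based search replaces or complements ANFIS's gradient and least-squares updates of the same membership-function and consequent parameters.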

Optimization algorithms form the backbone of artificial neural network (ANN) training: they update the weights of the network so that prediction error decreases. Backpropagation, paired with gradient-descent optimizers, is the most widely used and simplest approach, and it is implemented in popular deep learning frameworks such as PyTorch and TensorFlow. The limitations of gradient-based training, such as sensitivity to local minima and the requirement of a differentiable objective, prompted the research community to develop metaheuristic alternatives such as ACO, PSO, GA, and BA.
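To make this baseline concrete, below is a minimal PyTorch sketch of a backpropagation-driven training loop; the toy network, data, and hyperparameters are illustrative assumptions and are not taken from the article.

```python
# Minimal sketch of backpropagation-based weight updates in PyTorch;
# the architecture and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy regression data: y = 2x + noise.
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * x + 0.1 * torch.randn_like(x)

model = nn.Sequential(nn.Linear(1, 8), nn.Tanh(), nn.Linear(8, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()          # clear accumulated gradients
    loss = loss_fn(model(x), y)    # forward pass and loss
    loss.backward()                # backpropagation: compute gradients
    optimizer.step()               # gradient-based weight update

print(f"final training loss: {loss.item():.4f}")
```

The metaheuristic alternatives mentioned above replace the backward/step pair with a population-based search over the same network weights, typically at the cost of more objective evaluations.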