| Year | Technique | Anomalies addressed | Dataset | Data source | Source tool | Flow properties for anomaly detection | Comparison | Validation metrics (%) | Conclusion |
| 2016 [21] | Generalized entropy | DDoS, probe | Real: KDD Cup 99, NSL-KDD, UCI Machine Learning Repository datasets; Simulated: testbed dataset (TUIDS) for DDoS and probe attacks | IP packet/IP flow | NetFlow data | Dynamic selection of features through mutual information and GE | LOF (τ = 0.58) on the Zoo dataset | DR = 82.35, FPR = 19.04 | Proposed approach achieved better DR and FPR metrics compared to other outlier approaches |
| | | | | | | | ORCA | DR = 88.23, FPR = 13.09 | |
| | | | | | | | Proposed approach | DR = 94.11, FPR = 2.38 | |
| | | | | | | | Shannon entropy | DR = 55, FPR = 15 | |
| | | | | | | | Kullback–Leibler divergence | DR = 70, FPR = 15 | |
| 2015 [22] | Extended entropy | DDoS, port scan, network scan, DoS, worm, and spam | Legitimate traffic from Tsinghua University campus network | IP flow | NetFlow | Source IP address, source port, destination IP address, destination port, flow bytes, flow direction, protocol number, and TCP control bit | — | DR = 93.46, FPR = 5 | |
| 2017 [23] | Tsallis entropy | Real and simulated versions: DDoS, alpha flow, port scan, network scan | Real campus network data, i.e., UTFPR/Toledo campus and FISTSC/GW campus | IP flow | NetFlow v9 | Source address, destination address, source port, destination port, number of packets, number of flows, number of bytes, in-degree | Tsallis entropy | DR = 100, FPR = 1 | Achieved better DR and FPR compared to Shannon entropy; validation metrics dropped slightly with sampling effects |
| | | | | | | | Shannon entropy | DR = 25, FPR = 2.2806 | |
| | | | | | | | After incorporating sampling effects in the technique | DR = 99.45, FPR = 0.12 | |
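For reference, the entropy measures compared across these studies (Shannon entropy, generalized/Rényi entropy, and Tsallis entropy, plus Kullback–Leibler divergence) are all computed from an empirical distribution of a flow feature such as destination ports. The sketch below is a minimal, illustrative implementation only: the function names, the choice of feature, and the order parameters α = q = 2 are assumptions for demonstration and do not reproduce the feature selection, sampling handling, or thresholding used in [21]–[23].

```python
import math
from collections import Counter

def distribution(values):
    """Empirical probability distribution over a flow feature (e.g., destination ports)."""
    counts = Counter(values)
    total = sum(counts.values())
    return [c / total for c in counts.values()]

def shannon_entropy(p):
    """Shannon entropy: H = -sum(p_i * log2(p_i))."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Generalized (Renyi) entropy of order alpha != 1; alpha -> 1 recovers Shannon entropy."""
    return (1.0 / (1.0 - alpha)) * math.log2(sum(pi ** alpha for pi in p))

def tsallis_entropy(p, q):
    """Tsallis entropy of order q != 1; q -> 1 recovers Shannon entropy (natural-log form)."""
    return (1.0 / (q - 1.0)) * (1.0 - sum(pi ** q for pi in p))

def kl_divergence(p, q_dist):
    """Kullback-Leibler divergence D(p || q) for distributions aligned on the same feature values."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q_dist) if pi > 0 and qi > 0)

# Hypothetical example: destination ports observed in one measurement window.
ports = [80, 80, 443, 443, 443, 22, 53, 80, 8080, 80]
p = distribution(ports)
print(f"Shannon:     {shannon_entropy(p):.4f}")
print(f"Renyi  a=2:  {renyi_entropy(p, alpha=2):.4f}")
print(f"Tsallis q=2: {tsallis_entropy(p, q=2):.4f}")
```

In the surveyed detectors, an anomaly is typically flagged when the entropy of a measurement window deviates from a baseline profile by more than a threshold; that decision logic is specific to each paper and is omitted from the sketch above.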
|
|