Research Article
SROBR: Semantic Representation of Obfuscation-Resilient Binary Code
Table 7
Experimental results of model variants.
Linear+GAT removes the BERT module; instruction vectors are obtained only from randomly initialized embeddings. This variant probes the contribution of the BERT module. LSTM+GAT replaces BERT with an LSTM to generate the instruction embeddings within each basic block, allowing a direct comparison between BERT and recurrent networks. BiLSTM+GAT substitutes a BiLSTM to further weigh the pros and cons of recurrent neural networks against BERT. BERT+GCN replaces the GAT module with a GCN, which aggregates neighboring nodes without attention weights. A sketch of how these variants could be assembled is given below.
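To make the ablation design concrete, the following is a minimal sketch of how the four variants could be wired together, assuming PyTorch and torch_geometric; the class name `VariantEncoder`, the pooling choice, and the small Transformer stand-in for BERT are all illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of the SROBR ablation variants (not the authors' code).
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv, GCNConv

class VariantEncoder(nn.Module):
    """Encodes per-block instruction sequences, then propagates over the CFG."""
    def __init__(self, vocab_size, dim=128, seq_model="bert", graph_model="gat"):
        super().__init__()
        # Randomly initialized instruction embeddings; the Linear+GAT variant
        # stops here, with no sequence model on top.
        self.embed = nn.Embedding(vocab_size, dim)
        if seq_model == "lstm":
            self.seq = nn.LSTM(dim, dim, batch_first=True)
        elif seq_model == "bilstm":
            # Half-size hidden state per direction keeps the output width at dim.
            self.seq = nn.LSTM(dim, dim // 2, batch_first=True, bidirectional=True)
        elif seq_model == "bert":
            # Small Transformer encoder as a stand-in for the BERT module.
            layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
            self.seq = nn.TransformerEncoder(layer, num_layers=2)
        else:
            self.seq = None  # Linear+GAT: embeddings only
        # GAT weighs neighbors with learned attention; GCN aggregates uniformly.
        conv = GATConv if graph_model == "gat" else GCNConv
        self.graph = conv(dim, dim)

    def forward(self, instr_ids, edge_index):
        # instr_ids: (num_blocks, seq_len) instruction token ids per basic block
        # edge_index: (2, num_edges) CFG connectivity in torch_geometric format
        x = self.embed(instr_ids)
        if isinstance(self.seq, nn.LSTM):
            x, _ = self.seq(x)
        elif self.seq is not None:
            x = self.seq(x)
        block_vecs = x.mean(dim=1)  # pool instruction states into block embeddings
        return self.graph(block_vecs, edge_index)  # propagate over the CFG
```

Under this framing, the table's four rows correspond to `seq_model`/`graph_model` settings of (`None`, `gat`), (`lstm`, `gat`), (`bilstm`, `gat`), and (`bert`, `gcn`), so each variant isolates exactly one component against the full BERT+GAT model.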