KDTM: Multi-Stage Knowledge Distillation Transfer Model for Long-Tailed DGA Detection

Baoyu Fan, Han Ma, Yue Liu, Xiaochen Yuan, Wei Ke

Research output: Article, peer-reviewed

1 citation (Scopus)

Abstract

As the attack strategy most commonly used by botnets, the Domain Generation Algorithm (DGA) has strong invisibility and variability. Using deep learning models to detect different families of DGA domain names can improve network defense against hackers. However, this task faces an extremely imbalanced sample size among DGA categories, which leads to low classification accuracy for small-sample categories and even classification failure for some of them. To address this issue, we introduce the long-tailed concept and augment the data of small-sample categories by transferring pre-trained knowledge. Firstly, we propose the Data Balanced Review Method (DBRM) to reduce the sample size difference between categories, thus generating a relatively balanced dataset for transfer learning. Secondly, we propose the Knowledge Transfer Model (KTM) to enhance the knowledge of the small-sample categories; KTM uses multi-stage transfer to carry weights from the large-sample categories to the small-sample categories. Furthermore, we propose the Knowledge Distillation Transfer Model (KDTM), which adds a knowledge distillation loss on top of the KTM to relieve the catastrophic forgetting problem caused by transfer learning. The experimental results show that KDTM can significantly improve the classification performance of all categories, especially the small-sample categories, achieving a state-of-the-art macro average F1 score of 84.5%. The robustness of the KDTM model is verified on three DGA datasets that follow Pareto distributions.
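The abstract does not spell out the exact loss used in KDTM, but the knowledge distillation term it refers to is commonly formulated as a temperature-scaled KL divergence between teacher and student outputs, combined with cross-entropy on the hard labels. The sketch below assumes a PyTorch setting; the names `student_logits`, `teacher_logits`, the temperature `T`, and the weight `alpha` are illustrative assumptions, not the paper's notation.

```python
# Minimal sketch of a knowledge-distillation loss of the kind the abstract
# describes: cross-entropy on hard DGA-family labels plus a temperature-scaled
# KL term against a frozen teacher. Hyperparameters are illustrative only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Combine hard-label cross-entropy with soft-label KL distillation."""
    # Standard classification loss on the ground-truth DGA family labels.
    ce = F.cross_entropy(student_logits, labels)
    # KL divergence between temperature-softened teacher and student
    # distributions; scaling by T*T keeps gradient magnitudes comparable.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * ce + (1.0 - alpha) * kd
```

In a transfer setting like the one described, the model trained on the large-sample categories would typically serve as the teacher, so the student retains that knowledge while being fine-tuned on the small-sample categories.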

Original language: English
Article number: 626
Journal: Mathematics
Volume: 12
Issue number: 5
DOIs
Publication status: Published - March 2024
