Advancing Malware Detection in Network Traffic With Self-Paced Class Incremental Learning

Cited by: 2
Authors
Xu, Xiaohu [1 ]
Zhang, Xixi [1 ]
Zhang, Qianyun [2 ]
Wang, Yu [1 ]
Adebisi, Bamidele [3 ]
Ohtsuki, Tomoaki [4 ]
Sari, Hikmet [1 ]
Gui, Guan [1 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Coll Telecommun & Informat Engn, Nanjing 210003, Peoples R China
[2] Beihang Univ, Sch Cyber Sci & Technol, Beijing 100191, Peoples R China
[3] Manchester Metropolitan Univ, Fac Sci & Engn, Dept Engn, Manchester M1 5GD, England
[4] Keio Univ, Dept Informat & Comp Sci, Yokohama 1080073, Japan
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, No. 12
Keywords
Malware; Task analysis; Telecommunication traffic; Adaptation models; Internet of Things; Stability analysis; Data models; Class-incremental learning (CIL); deep learning; malware detection (MD); sparse loss; sparse pairwise (SP) loss; NEURAL-NETWORKS; CLASSIFICATION;
DOI
10.1109/JIOT.2024.3376635
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Discipline Code
0812;
Abstract
To ensure network security, effective malware detection (MD) is of paramount importance. Traditional methods often struggle to accurately learn and process the characteristics of network traffic data, and must balance rapid processing with retaining memory for previously encountered malware categories as new ones emerge. To tackle these challenges, we propose a cutting-edge approach using self-paced class incremental learning (SPCIL). This method harnesses network traffic data for enhanced class incremental learning (CIL). A pivotal technique in deep learning, CIL facilitates the integration of new malware classes while preserving recognition of prior categories. The loss function in our SPCIL-driven MD combines sparse pairwise (SP) loss with sparse loss, striking a balance between model simplicity and accuracy. Experimental results reveal that SPCIL proficiently identifies both existing and emerging malware classes, adeptly addressing catastrophic forgetting. In comparison to other incremental learning approaches, SPCIL stands out in performance and efficiency: it operates with a minimal model parameter count (8.35 million) and, with increments of 2, 4, and 5 classes, achieves accuracy rates of 89.61%, 94.74%, and 97.21%, respectively, underscoring its effectiveness and operational efficiency.
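To make the combined objective concrete, below is a minimal PyTorch-style sketch of a classification loss augmented with a sparse pairwise (SP) term over batch embeddings and a sparsity penalty on the classifier weights. Everything here is an illustrative assumption, not the paper's formulation: the function name sparse_pairwise_loss, the margin/top_k hard-pair selection, the 0.1 and 1e-4 weights, and the use of an L1 penalty as the "sparse loss" are all stand-ins, since this record does not detail the method.

import torch
import torch.nn as nn
import torch.nn.functional as F


def sparse_pairwise_loss(emb, labels, margin=1.0, top_k=16):
    # Hypothetical SP term: keep only the hardest pairs in the batch --
    # the most distant same-class pairs and the different-class pairs
    # that violate the margin the most ("sparse" pair selection).
    d = torch.cdist(emb, emb)                               # (B, B) L2 distances
    same = labels.unsqueeze(0).eq(labels.unsqueeze(1))      # same-class mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=emb.device)
    pos = d[same & ~eye]                                    # same-class distances
    neg = F.relu(margin - d[~same])                         # margin violations
    loss = emb.new_zeros(())
    if pos.numel():
        loss = loss + pos.topk(min(top_k, pos.numel())).values.mean()
    if neg.numel():
        loss = loss + neg.topk(min(top_k, neg.numel())).values.mean()
    return loss


# Toy setup: a shared backbone plus a classifier head that would grow as new
# malware classes arrive; 64 input features and 10 classes are placeholders.
backbone = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 32))
head = nn.Linear(32, 10)
opt = torch.optim.Adam(list(backbone.parameters()) + list(head.parameters()), lr=1e-3)

x = torch.randn(32, 64)                  # stand-in for traffic features
y = torch.randint(0, 10, (32,))          # stand-in malware class labels

emb = backbone(x)
loss = (F.cross_entropy(head(emb), y)
        + 0.1 * sparse_pairwise_loss(emb, y)                     # SP term (assumed weight)
        + 1e-4 * sum(p.abs().sum() for p in head.parameters()))  # L1 "sparse loss" (assumed)
opt.zero_grad()
loss.backward()
opt.step()

In a full CIL run, the head would grow by the increment size at each step (2, 4, or 5 classes, matching the increments reported above) while previously learned weights are retained; rehearsal over stored exemplars or distillation against the previous model is what typically counters catastrophic forgetting in such setups.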
Pages: 21816-21826
Number of pages: 11