Advancing Malware Detection in Network Traffic With Self-Paced Class Incremental Learning

Cited by: 2
Authors
Xu, Xiaohu [1 ]
Zhang, Xixi [1 ]
Zhang, Qianyun [2 ]
Wang, Yu [1 ]
Adebisi, Bamidele [3 ]
Ohtsuki, Tomoaki [4 ]
Sari, Hikmet [1 ]
Gui, Guan [1 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Coll Telecommun & Informat Engn, Nanjing 210003, Peoples R China
[2] Beihang Univ, Sch Cyber Sci & Technol, Beijing 100191, Peoples R China
[3] Manchester Metropolitan Univ, Fac Sci & Engn, Dept Engn, Manchester M1 5GD, England
[4] Keio Univ, Dept Informat & Comp Sci, Yokohama 1080073, Japan
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, No. 12
Keywords
Malware; Task analysis; Telecommunication traffic; Adaptation models; Internet of Things; Stability analysis; Data models; Class-incremental learning (CIL); deep learning; malware detection (MD); sparse loss; sparse pairwise (SP) loss; NEURAL-NETWORKS; CLASSIFICATION;
DOI
10.1109/JIOT.2024.3376635
CLC classification
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
To ensure network security, effective malware detection (MD) is of paramount importance. Traditional methods often struggle to accurately learn and process the characteristics of network traffic data, and they must balance rapid processing against retaining memory of previously encountered malware categories as new ones emerge. To tackle these challenges, we propose a self-paced class incremental learning (SPCIL) approach that harnesses network traffic data for enhanced class incremental learning (CIL). CIL, a pivotal technique in deep learning, facilitates the integration of new malware classes while preserving recognition of prior categories. The loss function in our SPCIL-driven MD combines sparse pairwise (SP) loss with sparse loss, striking a balance between model simplicity and accuracy. Experimental results reveal that SPCIL proficiently identifies both existing and emerging malware classes while mitigating catastrophic forgetting. Compared with other incremental learning approaches, SPCIL stands out in performance and efficiency: it operates with a minimal model parameter count (8.35 million) and, with increment sizes of 2, 4, and 5, achieves accuracy rates of 89.61%, 94.74%, and 97.21%, respectively, underscoring its effectiveness and operational efficiency.
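The abstract describes a loss that combines a sparse pairwise (SP) term with a sparse loss. The paper's exact formulation is not reproduced in this record; purely as a hedged sketch of how a pairwise term and a sparsity penalty can be combined into one objective, one might write the following (the function names, margin, and weighting factor `lam` are illustrative assumptions, not the authors' definitions):

```python
import math

def pairwise_loss(embeddings, labels, margin=1.0):
    """Contrastive-style pairwise term: pull same-class embeddings
    together, push different-class ones at least `margin` apart."""
    n = len(labels)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(embeddings[i], embeddings[j])
            if labels[i] == labels[j]:
                total += d ** 2                      # attract same-class pairs
            else:
                total += max(0.0, margin - d) ** 2   # repel different-class pairs
            pairs += 1
    return total / pairs if pairs else 0.0

def sparse_loss(weights):
    """L1 penalty encouraging a compact (sparse) model."""
    return sum(abs(w) for w in weights)

def combined_loss(embeddings, labels, weights, lam=0.01):
    """Illustrative combined objective: pairwise term + lam * sparsity term."""
    return pairwise_loss(embeddings, labels) + lam * sparse_loss(weights)
```

In this toy form, `lam` trades off classification geometry against model compactness, which is the balance between simplicity and accuracy the abstract alludes to.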
Pages: 21816-21826
Page count: 11
Related Papers (50 total)
  • [31] Ren, Yazhou; Zhao, Peng; Sheng, Yongpan; Yao, Dezhong; Xu, Zenglin. Robust Softmax Regression for Multi-class Classification with Self-Paced Learning. PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 2641-2647
  • [32] Yang, Yanwu; Chen, Hairui; Hu, Jiesi; Guo, Xutao; Ma, Ting. Advancing Brain Imaging Analysis Step-by-Step via Progressive Self-paced Learning. MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION - MICCAI 2024, PT XI, 2024, 15011: 58-68
  • [33] de Jonge, Mario; Tabbers, Huib K.; Pecher, Diane; Jang, Yoonhee; Zeelenberg, Rene. The Efficacy of Self-Paced Study in Multitrial Learning. JOURNAL OF EXPERIMENTAL PSYCHOLOGY-LEARNING MEMORY AND COGNITION, 2015, 41(03): 851-858
  • [34] Zhang, Kai; Song, Chengyun; Qiu, Lianpeng. Self-paced deep clustering with learning loss. PATTERN RECOGNITION LETTERS, 2023, 171: 8-14
  • [35] Cong, Wei; Cong, Yang; Sun, Gan; Liu, Yuyang; Dong, Jiahua. Self-Paced Weight Consolidation for Continual Learning. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34(04): 2209-2222
  • [36] Zinder, Y.; Nicorovici, N.; Langtry, T. MATHEMATICA BASED PLATFORM FOR SELF-PACED LEARNING. EDULEARN10: INTERNATIONAL CONFERENCE ON EDUCATION AND NEW LEARNING TECHNOLOGIES, 2010: 323-330
  • [37] Bodendorf, Freimut; Goetzelt, Kai-Uwe. Contextualization of Learning Objects for Self-Paced Learning Environments. PROCEEDINGS OF THE SIXTH INTERNATIONAL CONFERENCE ON SYSTEMS (ICONS 2011), 2011: 157-160
  • [38] Ren, Yazhou; Zhao, Peng; Xu, Zenglin; Yao, Dezhong. Balanced Self-Paced Learning with Feature Corruption. 2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017: 2064-2071
  • [39] Wan, Yu; Yang, Baosong; Wong, Derek F.; Zhou, Yikai; Chao, Lidia S.; Zhang, Haibo; Chen, Boxing. Self-Paced Learning for Neural Machine Translation. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020: 1074-1080
  • [40] Murugesan, Keerthiram; Carbonell, Jaime. Self-Paced Multitask Learning with Shared Knowledge. PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 2522-2528