Improving Cyber Defense Against Ransomware: A Generative Adversarial Networks-Based Adversarial Training Approach for Long Short-Term Memory Network Classifier

Cited by: 0
Authors
Wang, Ping [1 ]
Lin, Hsiao-Chung [2 ]
Chen, Jia-Hong [1 ]
Lin, Wen-Hui [1 ]
Li, Hao-Cyuan [1 ]
Affiliations
[1] Kun Shan Univ, Fac Dept Informat Management, Green Energy Technol Res Ctr, Tainan, Taiwan
[2] Natl Chin Yi Univ Technol, Dept Informat Management, Taichung 411030, Taiwan
Source
ELECTRONICS | 2025, Vol. 14, No. 04
Keywords
deep learning models; LSTM; GAN; CW attack; adversarial examples;
DOI
10.3390/electronics14040810
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
The rapid proliferation of ransomware variants demands more effective detection mechanisms, as traditional signature-based methods are increasingly inadequate: they rely on manual feature extraction and matching, which are time-consuming and limited to known threats. This study addresses the escalating challenge of ransomware threats in cybersecurity by proposing a novel deep learning model, LSTM-EDadver, which leverages Generative Adversarial Networks (GANs) and Carlini and Wagner (CW) attacks to enhance malware detection. LSTM-EDadver generates adversarial examples (AEs) from sequential features derived from ransomware behaviors and uses them to train deep learning models, improving their robustness and accuracy. The methodology combines Cuckoo sandbox analysis with a conceptual lattice ontology to capture a wide range of ransomware families and their variants. This approach not only addresses the shortcomings of existing models but also simulates real-world adversarial conditions during the validation phase by subjecting the models to CW attacks. Experimental results show that LSTM-EDadver achieves a classification accuracy of 96.59% on a dataset of 1328 ransomware samples (across 32 ransomware families) and 519 normal instances, outperforming traditional RNN, LSTM, and GRU models, which recorded accuracies of 90.01%, 93.95%, and 94.53%, respectively. The proposed model also improves the F1-score by 2.49% to 6.64% relative to existing models without adversarial training, underscoring the effectiveness of integrating GAN-generated attack command sequences into model training.
Pages: 25
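
As a rough illustration of the adversarial-training idea summarized in the abstract above, the sketch below trains a small LSTM classifier on synthetic behaviour-sequence features and mixes gradient-based adversarial examples into each training epoch. It is not the authors' LSTM-EDadver: the network sizes, the synthetic data, and the gradient-sign (FGSM-style) perturbation used as a lightweight stand-in for the paper's GAN generator and Carlini-Wagner attack are all illustrative assumptions.

import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    # LSTM over per-step behaviour features, followed by a linear classification head.
    def __init__(self, feat_dim=32, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                        # x: (batch, seq_len, feat_dim)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])          # logits from the last time step

def perturb(model, x, y, eps=0.05):
    # Gradient-sign perturbation: a simple stand-in for a full CW attack.
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x_adv), y)
    loss.backward()
    return (x_adv + eps * x_adv.grad.sign()).detach()

# Synthetic data standing in for Cuckoo-derived behaviour sequences (assumed shapes).
torch.manual_seed(0)
X = torch.randn(256, 20, 32)                     # 256 samples, 20 steps, 32 features
y = torch.randint(0, 2, (256,))                  # 0 = benign, 1 = ransomware

model = LSTMClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(5):
    x_adv = perturb(model, X, y)                 # craft AEs against the current model
    for xb, yb in ((X, y), (x_adv, y)):          # train on clean and adversarial batches
        opt.zero_grad()
        loss = nn.functional.cross_entropy(model(xb), yb)
        loss.backward()
        opt.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")

Regenerating the adversarial batch against the current model each epoch follows the usual adversarial-training loop; in the paper this role is played by GAN-generated attack command sequences and CW attacks rather than a simple gradient-sign perturbation.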